Unnamed: 0
int64 0
832k
| id
float64 2.49B
32.1B
| type
stringclasses 1
value | created_at
stringlengths 19
19
| repo
stringlengths 7
112
| repo_url
stringlengths 36
141
| action
stringclasses 3
values | title
stringlengths 1
744
| labels
stringlengths 4
574
| body
stringlengths 9
211k
| index
stringclasses 10
values | text_combine
stringlengths 96
211k
| label
stringclasses 2
values | text
stringlengths 96
188k
| binary_label
int64 0
1
|
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
14,902
| 18,291,722,999
|
IssuesEvent
|
2021-10-05 15:55:01
|
GSA/EDX
|
https://api.github.com/repos/GSA/EDX
|
reopened
|
Processing Separations Reports
|
process touchpoints
|
A Separation Report is sent from IT every month, containing a list of people who have left.
If a person is listed in the Separations Report and assigned to a Website as Website Manager, what should happen?
### User Story
As a Digital Council member
I want to know which Websites do not have managers (recently, due to Separations, for example)
So that I can establish active stewardship of a website.
### Acceptance Criteria
Given a separations report is received by Touchpoints PMO
When a user is inactivated AND that user is a Website Manager or Website Contact
Then the Touchpoints PMO can easily share that information with the Digital Council and EDX team for support.
|
1.0
|
Processing Separations Reports - A Separation Report is sent from IT every month, containing a list of people who have left.
If a person is listed in the Separations Report and assigned to a Website as Website Manager, what should happen?
### User Story
As a Digital Council member
I want to know which Websites do not have managers (recently, due to Separations, for example)
So that I can establish active stewardship of a website.
### Acceptance Criteria
Given a separations report is received by Touchpoints PMO
When a user is inactivated AND that user is a Website Manager or Website Contact
Then the Touchpoints PMO can easily share that information with the Digital Council and EDX team for support.
|
process
|
processing separations reports a separation report is sent from it every month containing a list of people who have left if a person is listed in the separations report and assigned to a website as website manager what should happen user story as a digital council member i want to know which websites do not have managers recently due to separations for example so that i can establish active stewardship of a website acceptance criteria given a separations report is received by touchpoints pmo when a user is inactivated and that user is a website manager or website contact then the touchpoints pmo can easily share that information with the digital council and edx team for support
| 1
|
22,146
| 15,020,513,685
|
IssuesEvent
|
2021-02-01 14:48:00
|
ansible/galaxy_ng
|
https://api.github.com/repos/ansible/galaxy_ng
|
closed
|
Add django-admin models that support pulpcore RBAC via django-guardian
|
area/infrastructure priority/low status/in-progress type/enhancement
|
Add ModelAdmin objects for galaxy_ng
Use pulpcore.admin.BaseModel for guardian obj perms
Subtask of #302
|
1.0
|
Add django-admin models that support pulpcore RBAC via django-guardian - Add ModelAdmin objects for galaxy_ng
Use pulpcore.admin.BaseModel for guardian obj perms
Subtask of #302
|
non_process
|
add django admin models that support pulpcore rbac via django guardian add modeladmin objects for galaxy ng use pulpcore admin basemodel for guardian obj perms subtask of
| 0
|
14,678
| 17,794,818,263
|
IssuesEvent
|
2021-08-31 20:38:28
|
googleapis/nodejs-asset
|
https://api.github.com/repos/googleapis/nodejs-asset
|
closed
|
cleanup old nodejs-asset resources
|
type: process api: cloudasset samples
|
I'm wondering whether the reason this repository is slowing down is that we're collecting old vms and buckets in our account (_leading to gradual slowdown over time_).
We should add logic that cleans up the old VMs and storage buckets created for the asset client.
|
1.0
|
cleanup old nodejs-asset resources - I'm wondering whether the reason this repository is slowing down is that we're collecting old vms and buckets in our account (_leading to gradual slowdown over time_).
We should add logic that cleans up the old VMs and storage buckets created for the asset client.
|
process
|
cleanup old nodejs asset resources i m wondering whether the reason this repository is slowing down is that we re collecting old vms and buckets in our account leading to gradual slowdown over time we should add logic that cleans up the old vms and storage buckets created for the asset client
| 1
|
15,903
| 20,108,459,811
|
IssuesEvent
|
2022-02-07 12:57:08
|
prisma/prisma
|
https://api.github.com/repos/prisma/prisma
|
opened
|
db execute: confusing error message when doing `DROP DATABASE "test-doesnotexists;"` says that `Database `postgres.public` does not exist`
|
bug/1-unconfirmed kind/bug process/candidate topic: error team/migrations topic: db execute
|
Given this schema and command
```prisma
datasource my_db {
provider = "postgresql"
url = "postgres://prisma:prisma@localhost:5432"
}
```
```sh
echo "DROP DATABASE \"test-doesnotexists\";" | npx prisma@dev db execute --stdin --preview-feature --schema prisma/schema.prisma
```
I expected to get a SQL error message but instead got the following
```
Error: P1003
Database `postgres.public` does not exist on the database server at `localhost:5432`.
```
Note that doing the following works and returns `Script executed successfully.`
echo "CREATE DATABASE \"test12344\";" | npx prisma@dev db execute --stdin --preview-feature --schema prisma/schema.prisma
|
1.0
|
db execute: confusing error message when doing `DROP DATABASE "test-doesnotexists;"` says that `Database `postgres.public` does not exist` - Given this schema and command
```prisma
datasource my_db {
provider = "postgresql"
url = "postgres://prisma:prisma@localhost:5432"
}
```
```sh
echo "DROP DATABASE \"test-doesnotexists\";" | npx prisma@dev db execute --stdin --preview-feature --schema prisma/schema.prisma
```
I expected to get a SQL error message but instead got the following
```
Error: P1003
Database `postgres.public` does not exist on the database server at `localhost:5432`.
```
Note that doing the following works and returns `Script executed successfully.`
echo "CREATE DATABASE \"test12344\";" | npx prisma@dev db execute --stdin --preview-feature --schema prisma/schema.prisma
|
process
|
db execute confusing error message when doing drop database test doesnotexists says that database postgres public does not exist given this schema and command prisma datasource my db provider postgresql url postgres prisma prisma localhost sh echo drop database test doesnotexists npx prisma dev db execute stdin preview feature schema prisma schema prisma i expected to get a sql error message but instead got the following error database postgres public does not exist on the database server at localhost note that doing the following works and returns script executed successfully echo create database npx prisma dev db execute stdin preview feature schema prisma schema prisma
| 1
|
11,358
| 14,174,433,968
|
IssuesEvent
|
2020-11-12 19:53:58
|
qgis/QGIS
|
https://api.github.com/repos/qgis/QGIS
|
closed
|
SAGA Random Terrain generation - Bug
|
Bug Processing
|
When running the Random Terrain Generation from the Toolbox with all settings to default, there is one setting you have to pick.
The only option I have for 'Target Dimensions' is [0] User defined.
So the settings are:
grid_calculus "Random Terrain Generation" -RADIUS 10 -ITERATIONS 10 -TARGET_TYPE 0 -USER_CELL_SIZE 1.0 -USER_COLS 100 -USER_ROWS 100 -TARGET_GRID "%userprofile%/Temp/processing_GIBBERISH/processing_GIBBERISH/GIBBERISH/TARGET_GRID.sdat"
I see a message in the log: Error: select a tool
The following layers were not correctly generated.<ul><li>BLABLA</li></ul>You can check the 'Log Messages Panel' in QGIS main window to find more information about the execution of the algorithm.
I get the same results in **QGIS on Windows 7** - 3.10/3.14/3.16
There is no output from this tool. When opening SAGA and run the Tool there is output.
|
1.0
|
SAGA Random Terrain generation - Bug - When running the Random Terrain Generation from the Toolbox with all settings to default, there is one setting you have to pick.
The only option I have for 'Target Dimensions' is [0] User defined.
So the settings are:
grid_calculus "Random Terrain Generation" -RADIUS 10 -ITERATIONS 10 -TARGET_TYPE 0 -USER_CELL_SIZE 1.0 -USER_COLS 100 -USER_ROWS 100 -TARGET_GRID "%userprofile%/Temp/processing_GIBBERISH/processing_GIBBERISH/GIBBERISH/TARGET_GRID.sdat"
I see a message in the log: Error: select a tool
The following layers were not correctly generated.<ul><li>BLABLA</li></ul>You can check the 'Log Messages Panel' in QGIS main window to find more information about the execution of the algorithm.
I get the same results in **QGIS on Windows 7** - 3.10/3.14/3.16
There is no output from this tool. When opening SAGA and run the Tool there is output.
|
process
|
saga random terrain generation bug when running the random terrain generation from the toolbox with all settings to default there is one setting you have to pick the only option i have for target dimensions is user defined so the settings are grid calculus random terrain generation radius iterations target type user cell size user cols user rows target grid userprofile temp processing gibberish processing gibberish gibberish target grid sdat i see a message in the log error select a tool the following layers were not correctly generated blabla you can check the log messages panel in qgis main window to find more information about the execution of the algorithm i get the same results in qgis on windows there is no output from this tool when opening saga and run the tool there is output
| 1
|
6
| 2,492,959,702
|
IssuesEvent
|
2015-01-05 09:06:52
|
dita-ot/dita-ot
|
https://api.github.com/repos/dita-ot/dita-ot
|
closed
|
Too many HTML files in output in spite of "onlytopic.in.map=true"
|
bug preprocess xhtml
|
In DITA-OT 2.0, the XHTML transform under special circumstances creates too many HTML files in the output directory, in spite of the parameter `onlytopic.in.map` set to `true`.
Test files: https://gist.github.com/maer2712/d0247e9d121562a2f992
Command line:
```[dita.dir]\bin\dita.bat -f xhtml -i src\test.ditamap -o out\xhtml\test -filter src\test.ditaval -temp t:\tmp -v -l log\antlog.log -Donlytopic.in.map=true```
The transform creates a file `index.html` without any links. This is correct because all content is filtered away. The transform also creates a file `test_inner\Inner.html`, which is not correct.
This problem exists also in DITA-OT 1.8 (XHTML transform and HTMLHelp transform).
|
1.0
|
Too many HTML files in output in spite of "onlytopic.in.map=true" - In DITA-OT 2.0, the XHTML transform under special circumstances creates too many HTML files in the output directory, in spite of the parameter `onlytopic.in.map` set to `true`.
Test files: https://gist.github.com/maer2712/d0247e9d121562a2f992
Command line:
```[dita.dir]\bin\dita.bat -f xhtml -i src\test.ditamap -o out\xhtml\test -filter src\test.ditaval -temp t:\tmp -v -l log\antlog.log -Donlytopic.in.map=true```
The transform creates a file `index.html` without any links. This is correct because all content is filtered away. The transform also creates a file `test_inner\Inner.html`, which is not correct.
This problem exists also in DITA-OT 1.8 (XHTML transform and HTMLHelp transform).
|
process
|
too many html files in output in spite of onlytopic in map true in dita ot the xhtml transform under special circumstances creates too many html files in the output directory in spite of the parameter onlytopic in map set to true test files command line bin dita bat f xhtml i src test ditamap o out xhtml test filter src test ditaval temp t tmp v l log antlog log donlytopic in map true the transform creates a file index html without any links this is correct because all content is filtered away the transform also creates a file test inner inner html which is not correct this problem exists also in dita ot xhtml transform and htmlhelp transform
| 1
|
546,553
| 16,014,585,047
|
IssuesEvent
|
2021-04-20 14:37:26
|
GoogleCloudPlatform/golang-samples
|
https://api.github.com/repos/GoogleCloudPlatform/golang-samples
|
closed
|
bigtable/writes: TestWrites failed
|
api: bigtable flakybot: flaky flakybot: issue priority: p2 samples type: bug
|
This test failed!
To configure my behavior, see [the Flaky Bot documentation](https://github.com/googleapis/repo-automation-bots/tree/master/packages/flakybot).
If I'm commenting on this issue too often, add the `flakybot: quiet` label and
I will stop commenting.
---
commit: e9049803347f279e250436eec63d378c7ee9eb59
buildURL: [Build Status](https://source.cloud.google.com/results/invocations/6478486b-134d-4a2d-b2e2-3fdde1e01516), [Sponge](http://sponge2/6478486b-134d-4a2d-b2e2-3fdde1e01516)
status: failed
<details><summary>Test output</summary><br><pre>retry.go:44: FAILED after 10 attempts:
writes_test.go:45: Could not create table mobile-time-series-golang-samples-tests-9: rpc error: code = AlreadyExists desc = Table already exists: projects/golang-samples-tests/instances/testing-instance/tables/mobile-time-series-golang-samples-tests-9</pre></details>
|
1.0
|
bigtable/writes: TestWrites failed - This test failed!
To configure my behavior, see [the Flaky Bot documentation](https://github.com/googleapis/repo-automation-bots/tree/master/packages/flakybot).
If I'm commenting on this issue too often, add the `flakybot: quiet` label and
I will stop commenting.
---
commit: e9049803347f279e250436eec63d378c7ee9eb59
buildURL: [Build Status](https://source.cloud.google.com/results/invocations/6478486b-134d-4a2d-b2e2-3fdde1e01516), [Sponge](http://sponge2/6478486b-134d-4a2d-b2e2-3fdde1e01516)
status: failed
<details><summary>Test output</summary><br><pre>retry.go:44: FAILED after 10 attempts:
writes_test.go:45: Could not create table mobile-time-series-golang-samples-tests-9: rpc error: code = AlreadyExists desc = Table already exists: projects/golang-samples-tests/instances/testing-instance/tables/mobile-time-series-golang-samples-tests-9</pre></details>
|
non_process
|
bigtable writes testwrites failed this test failed to configure my behavior see if i m commenting on this issue too often add the flakybot quiet label and i will stop commenting commit buildurl status failed test output retry go failed after attempts writes test go could not create table mobile time series golang samples tests rpc error code alreadyexists desc table already exists projects golang samples tests instances testing instance tables mobile time series golang samples tests
| 0
|
317,732
| 23,686,240,387
|
IssuesEvent
|
2022-08-29 06:37:12
|
cloud-native-toolkit/terraform-azure-nat-gateway
|
https://api.github.com/repos/cloud-native-toolkit/terraform-azure-nat-gateway
|
closed
|
Readme needs to be updated
|
documentation
|
The readme only contains the module boilerplate and does not include the specifics for this module.
|
1.0
|
Readme needs to be updated - The readme only contains the module boilerplate and does not include the specifics for this module.
|
non_process
|
readme needs to be updated the readme only contains the module boilerplate and does not include the specifics for this module
| 0
|
5,918
| 3,698,327,272
|
IssuesEvent
|
2016-02-28 08:02:54
|
nodejs/node
|
https://api.github.com/repos/nodejs/node
|
closed
|
deps: static cares build fails if shared c-ares installed
|
build
|
Followup from patches landed in #38, I've had this problem on both Mac OS X 10.9.5 and FreeBSD 10.2:
```
../deps/cares/src/ares_parse_txt_reply.c:153:25: error: no member named 'record_start' in 'struct ares_txt_reply'
txt_curr->record_start = strptr == aptr;
~~~~~~~~ ^
1 error generated.
```
_Edit, to explain context better:_ There is a breaking API change in Node's cares to support multiple txt records in dns lookup replies. It has not landed upstream - see discussion in #38. When compiling Node's cares, build is referring to the globally installed headers first, which don't include this change to the struct.
This is immediately caused by `-I/usr/local/include` appearing before `-I../deps/cares/include` in the compile command. This, in turn, results from the gyp generated makefile, which orders the includes this way. I note that this is not the order used when compiling V8, where the `-I../deps/v8/include` parameter appears before `-I/usr/local/include`.
To reproduce, install c-ares as a shared library in `/usr/local/...`, then build node with `--with-intl=system-icu --without-npm --shared-openssl`
In discussion at #38, @jbergstroem suggested this related to other shared libraries being included (e.g. zlib), or it may be related to the use of --with-intl=system-icu
nei;tmm:
The exact steps I used on FreeBSD were:
Download VM-image or liveboot iso [for clean env];
Boot...
```csh
# Install c-ares.
pkg install c-ares
# Now try to build node
pkg install gmake libexecinfo python27 icu openssl
mkdir ~/build && cd ~/build
curl https://iojs.org/dist/latest/iojs-v3.2.0.tar.xz | tar -xf -
setenv CC clang
setenv CXX clang++
setenv LINK clang++
setenv GYP_DEFINES clang=1
./configure --with-intl=system-icu --without-npm --shared-openssl --shared-openssl-includes=usr/local/include/openssl --shared-openssl-libpath=/usr/local/lib
patch deps/v8/src/runtime/runtime-i18n.cc:630
- local_object->SetInternalField(1, reinterpret_cast<Smi*>(NULL));
+ local_object->SetInternalField(1, reinterpret_cast<Smi*>(0));
make
```
The patch refered to above is in PR #2636, kindly being shepherded through CI by @thefourtheye. It relates to v8 compilation on FreeBSD, and doesn't otherwise relate to the issue here; just had to patch it to get to this point.
|
1.0
|
deps: static cares build fails if shared c-ares installed - Followup from patches landed in #38, I've had this problem on both Mac OS X 10.9.5 and FreeBSD 10.2:
```
../deps/cares/src/ares_parse_txt_reply.c:153:25: error: no member named 'record_start' in 'struct ares_txt_reply'
txt_curr->record_start = strptr == aptr;
~~~~~~~~ ^
1 error generated.
```
_Edit, to explain context better:_ There is a breaking API change in Node's cares to support multiple txt records in dns lookup replies. It has not landed upstream - see discussion in #38. When compiling Node's cares, build is referring to the globally installed headers first, which don't include this change to the struct.
This is immediately caused by `-I/usr/local/include` appearing before `-I../deps/cares/include` in the compile command. This, in turn, results from the gyp generated makefile, which orders the includes this way. I note that this is not the order used when compiling V8, where the `-I../deps/v8/include` parameter appears before `-I/usr/local/include`.
To reproduce, install c-ares as a shared library in `/usr/local/...`, then build node with `--with-intl=system-icu --without-npm --shared-openssl`
In discussion at #38, @jbergstroem suggested this related to other shared libraries being included (e.g. zlib), or it may be related to the use of --with-intl=system-icu
nei;tmm:
The exact steps I used on FreeBSD were:
Download VM-image or liveboot iso [for clean env];
Boot...
```csh
# Install c-ares.
pkg install c-ares
# Now try to build node
pkg install gmake libexecinfo python27 icu openssl
mkdir ~/build && cd ~/build
curl https://iojs.org/dist/latest/iojs-v3.2.0.tar.xz | tar -xf -
setenv CC clang
setenv CXX clang++
setenv LINK clang++
setenv GYP_DEFINES clang=1
./configure --with-intl=system-icu --without-npm --shared-openssl --shared-openssl-includes=usr/local/include/openssl --shared-openssl-libpath=/usr/local/lib
patch deps/v8/src/runtime/runtime-i18n.cc:630
- local_object->SetInternalField(1, reinterpret_cast<Smi*>(NULL));
+ local_object->SetInternalField(1, reinterpret_cast<Smi*>(0));
make
```
The patch refered to above is in PR #2636, kindly being shepherded through CI by @thefourtheye. It relates to v8 compilation on FreeBSD, and doesn't otherwise relate to the issue here; just had to patch it to get to this point.
|
non_process
|
deps static cares build fails if shared c ares installed followup from patches landed in i ve had this problem on both mac os x and freebsd deps cares src ares parse txt reply c error no member named record start in struct ares txt reply txt curr record start strptr aptr error generated edit to explain context better there is a breaking api change in node s cares to support multiple txt records in dns lookup replies it has not landed upstream see discussion in when compiling node s cares build is referring to the globally installed headers first which don t include this change to the struct this is immediately caused by i usr local include appearing before i deps cares include in the compile command this in turn results from the gyp generated makefile which orders the includes this way i note that this is not the order used when compiling where the i deps include parameter appears before i usr local include to reproduce install c ares as a shared library in usr local then build node with with intl system icu without npm shared openssl in discussion at jbergstroem suggested this related to other shared libraries being included e g zlib or it may be related to the use of with intl system icu nei tmm the exact steps i used on freebsd were download vm image or liveboot iso boot csh install c ares pkg install c ares now try to build node pkg install gmake libexecinfo icu openssl mkdir build cd build curl tar xf setenv cc clang setenv cxx clang setenv link clang setenv gyp defines clang configure with intl system icu without npm shared openssl shared openssl includes usr local include openssl shared openssl libpath usr local lib patch deps src runtime runtime cc local object setinternalfield reinterpret cast null local object setinternalfield reinterpret cast make the patch refered to above is in pr kindly being shepherded through ci by thefourtheye it relates to compilation on freebsd and doesn t otherwise relate to the issue here just had to patch it to get to this point
| 0
|
5,047
| 7,859,215,154
|
IssuesEvent
|
2018-06-21 15:56:24
|
Open-EO/openeo-api
|
https://api.github.com/repos/Open-EO/openeo-api
|
opened
|
Referencing stored process graphs
|
feedback required job management process graph management process graphs processes service management
|
It is possible to send a full process graph to the jobs and services, but how to reference stored process graphs?
Possible solutions:
* Load it with a process_graph process. This is intended to be in the core profile anyway to load external process graphs, therefore this would be a "cheap" solution, but would also be a bit complex for users.
* Directly add a process_graph_id parameter to the relevant endpoints. Then you could EITHER send process_graph or process_graph_id.
A similar question is asked in #92. The way of doing it should be consistent.
|
3.0
|
Referencing stored process graphs - It is possible to send a full process graph to the jobs and services, but how to reference stored process graphs?
Possible solutions:
* Load it with a process_graph process. This is intended to be in the core profile anyway to load external process graphs, therefore this would be a "cheap" solution, but would also be a bit complex for users.
* Directly add a process_graph_id parameter to the relevant endpoints. Then you could EITHER send process_graph or process_graph_id.
A similar question is asked in #92. The way of doing it should be consistent.
|
process
|
referencing stored process graphs it is possible to send a full process graph to the jobs and services but how to reference stored process graphs possible solutions load it with a process graph process this is intended to be in the core profile anyway to load external process graphs therefore this would be a cheap solution but would also be a bit complex for users directly add a process graph id parameter to the relevant endpoints then you could either send process graph or process graph id a similar question is asked in the way of doing it should be consistent
| 1
|
12,130
| 14,740,900,804
|
IssuesEvent
|
2021-01-07 09:47:55
|
kdjstudios/SABillingGitlab
|
https://api.github.com/repos/kdjstudios/SABillingGitlab
|
closed
|
Keener - billing group codes
|
anc-process anp-urgent ant-enhancement has attachment
|
In GitLab by @kdjstudios on Dec 12, 2018, 13:39
**Submitted by:** Gaylan Garrett <gaylan@keenercom.net>
**Helpdesk:** http://www.servicedesk.answernet.com/profiles/ticket/2018-12-12-28571/conversation
**Server:** External
**Client/Site:** Keener
**Account:** NA
**Issue:**
As I am sure you are aware, Keener Communications has been purchased by Answer 1 and we are beginning to merge / share our accounts.
Since I will be getting usage from their system as well as usage from our system for billing, I was wondering since their billing group descriptions do not have the same wording as ours, will you need to add that wording to the “formulas” in order for the usage to calculate ? We just merged our first batch of clients today and I will be doing billing on Monday so do not want to miss any usages for any of the accounts.
I remember when we changed from 20 second to Default ( which was also 20 seconds ) it did not work when it came to importing that usage into billing.
Here is what they have. Let me know if you cannot read this as it was sent to me via an email and I cannot read it.

|
1.0
|
Keener - billing group codes - In GitLab by @kdjstudios on Dec 12, 2018, 13:39
**Submitted by:** Gaylan Garrett <gaylan@keenercom.net>
**Helpdesk:** http://www.servicedesk.answernet.com/profiles/ticket/2018-12-12-28571/conversation
**Server:** External
**Client/Site:** Keener
**Account:** NA
**Issue:**
As I am sure you are aware, Keener Communications has been purchased by Answer 1 and we are beginning to merge / share our accounts.
Since I will be getting usage from their system as well as usage from our system for billing, I was wondering since their billing group descriptions do not have the same wording as ours, will you need to add that wording to the “formulas” in order for the usage to calculate ? We just merged our first batch of clients today and I will be doing billing on Monday so do not want to miss any usages for any of the accounts.
I remember when we changed from 20 second to Default ( which was also 20 seconds ) it did not work when it came to importing that usage into billing.
Here is what they have. Let me know if you cannot read this as it was sent to me via an email and I cannot read it.

|
process
|
keener billing group codes in gitlab by kdjstudios on dec submitted by gaylan garrett helpdesk server external client site keener account na issue as i am sure you are aware keener communications has been purchased by answer and we are beginning to merge share our accounts since i will be getting usage from their system as well as usage from our system for billing i was wondering since their billing group descriptions do not have the same wording as ours will you need to add that wording to the “formulas” in order for the usage to calculate we just merged our first batch of clients today and i will be doing billing on monday so do not want to miss any usages for any of the accounts i remember when we changed from second to default which was also seconds it did not work when it came to importing that usage into billing here is what they have let me know if you cannot read this as it was sent to me via an email and i cannot read it uploads image png
| 1
|
6,628
| 9,733,169,007
|
IssuesEvent
|
2019-05-31 08:59:26
|
zammad/zammad
|
https://api.github.com/repos/zammad/zammad
|
closed
|
ERROR: Can't process email
|
bug mail processing regression verified
|
<!--
Hi there - thanks for filing an issue. Please ensure the following things before creating an issue - thank you! 🤓
Since november 15th we handle all requests, except real bugs, at our community board.
Full explanation: https://community.zammad.org/t/major-change-regarding-github-issues-community-board/21
Please post:
- Feature requests
- Development questions
- Technical questions
on the board -> https://community.zammad.org !
If you think you hit a bug, please continue:
- Search existing issues and the CHANGELOG.md for your issue - there might be a solution already
- Make sure to use the latest version of Zammad if possible
- Add the `log/production.log` file from your system. Attention: Make sure no confidential data is in it!
- Please write the issue in english
- Don't remove the template - otherwise we will close the issue without further comments
- Ask questions about Zammad configuration and usage at our mailinglist. See: https://zammad.org/participate
Note: We always do our best. Unfortunately, sometimes there are too many requests and we can't handle everything at once. If you want to prioritize/escalate your issue, you can do so by means of a support contract (see https://zammad.com/pricing#selfhosted).
* The upper textblock will be removed automatically when you submit your issue *
-->
### Infos:
* Used Zammad version: 2.9.x
* Installation method (source, package, ..): APT
* Operating system: Ubuntu 18.04
* Database + version: ?
* Elasticsearch version: ?
* Browser + version: Chrome lastest
```
~$ sudo zammad run rails r "Channel::EmailParser.process_unprocessable_mails"
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
"ERROR: Can't process email, you will find it for bug reporting under /opt/zammad/tmp/unprocessable_mail/ff6c9cae8fa9fb0cd7cb1c64b5cc2439.eml, please create an issue at https://github.com/zammad/zammad/issues"
"ERROR: #<ArgumentError: invalid byte sequence in UTF-8>"
/opt/zammad/app/models/channel/email_parser.rb:129:in `rescue in process': #<ArgumentError: invalid byte sequence in UTF-8> (RuntimeError)
/opt/zammad/vendor/bundle/ruby/2.4.0/gems/mail-2.6.6/lib/mail/utilities.rb:286:in `=~'
/opt/zammad/vendor/bundle/ruby/2.4.0/gems/mail-2.6.6/lib/mail/utilities.rb:286:in `!~'
/opt/zammad/vendor/bundle/ruby/2.4.0/gems/mail-2.6.6/lib/mail/utilities.rb:286:in `blank?'
/opt/zammad/vendor/bundle/ruby/2.4.0/gems/mail-2.6.6/lib/mail/fields/common/common_address.rb:118:in `block in do_decode'
/opt/zammad/vendor/bundle/ruby/2.4.0/gems/mail-2.6.6/lib/mail/fields/common/common_address.rb:118:in `reject'
/opt/zammad/vendor/bundle/ruby/2.4.0/gems/mail-2.6.6/lib/mail/fields/common/common_address.rb:118:in `do_decode'
/opt/zammad/vendor/bundle/ruby/2.4.0/gems/mail-2.6.6/lib/mail/fields/to_field.rb:51:in `decoded'
/opt/zammad/vendor/bundle/ruby/2.4.0/gems/mail-2.6.6/lib/mail/fields/common/common_field.rb:27:in `to_s'
/opt/zammad/vendor/bundle/ruby/2.4.0/gems/mail-2.6.6/lib/mail/field.rb:158:in `to_s'
/opt/zammad/app/models/channel/filter/identify_sender.rb:123:in `rescue in block in create_recipients'
/opt/zammad/app/models/channel/filter/identify_sender.rb:100:in `block in create_recipients'
/opt/zammad/app/models/channel/filter/identify_sender.rb:97:in `each'
/opt/zammad/app/models/channel/filter/identify_sender.rb:97:in `create_recipients'
/opt/zammad/app/models/channel/filter/identify_sender.rb:64:in `run'
/opt/zammad/app/models/channel/email_parser.rb:148:in `block in _process'
/opt/zammad/app/models/channel/email_parser.rb:145:in `each'
/opt/zammad/app/models/channel/email_parser.rb:145:in `_process'
/opt/zammad/app/models/channel/email_parser.rb:111:in `block in process'
/opt/zammad/vendor/ruby-2.4.4/lib/ruby/2.4.0/timeout.rb:93:in `block in timeout'
/opt/zammad/vendor/ruby-2.4.4/lib/ruby/2.4.0/timeout.rb:33:in `block in catch'
/opt/zammad/vendor/ruby-2.4.4/lib/ruby/2.4.0/timeout.rb:33:in `catch'
/opt/zammad/vendor/ruby-2.4.4/lib/ruby/2.4.0/timeout.rb:33:in `catch'
/opt/zammad/vendor/ruby-2.4.4/lib/ruby/2.4.0/timeout.rb:108:in `timeout'
/opt/zammad/app/models/channel/email_parser.rb:110:in `process'
/opt/zammad/app/models/channel/email_parser.rb:479:in `block in process_unprocessable_mails'
/opt/zammad/app/models/channel/email_parser.rb:478:in `glob'
/opt/zammad/app/models/channel/email_parser.rb:478:in `process_unprocessable_mails'
/opt/zammad/vendor/bundle/ruby/2.4.0/gems/railties-5.1.6.2/lib/rails/commands/runner/runner_command.rb:37:in `perform'
/opt/zammad/vendor/bundle/ruby/2.4.0/gems/railties-5.1.6.2/lib/rails/commands/runner/runner_command.rb:37:in `eval'
/opt/zammad/vendor/bundle/ruby/2.4.0/gems/railties-5.1.6.2/lib/rails/commands/runner/runner_command.rb:37:in `perform'
/opt/zammad/vendor/bundle/ruby/2.4.0/gems/thor-0.20.3/lib/thor/command.rb:27:in `run'
/opt/zammad/vendor/bundle/ruby/2.4.0/gems/thor-0.20.3/lib/thor/invocation.rb:126:in `invoke_command'
/opt/zammad/vendor/bundle/ruby/2.4.0/gems/thor-0.20.3/lib/thor.rb:387:in `dispatch'
/opt/zammad/vendor/bundle/ruby/2.4.0/gems/railties-5.1.6.2/lib/rails/command/base.rb:63:in `perform'
/opt/zammad/vendor/bundle/ruby/2.4.0/gems/railties-5.1.6.2/lib/rails/command.rb:44:in `invoke'
/opt/zammad/vendor/bundle/ruby/2.4.0/gems/railties-5.1.6.2/lib/rails/commands.rb:16:in `<top (required)>'
/opt/zammad/bin/rails:9:in `require'
/opt/zammad/bin/rails:9:in `<main>'
from /opt/zammad/app/models/channel/email_parser.rb:108:in `process'
from /opt/zammad/app/models/channel/email_parser.rb:479:in `block in process_unprocessable_mails'
from /opt/zammad/app/models/channel/email_parser.rb:478:in `glob'
from /opt/zammad/app/models/channel/email_parser.rb:478:in `process_unprocessable_mails'
from /opt/zammad/vendor/bundle/ruby/2.4.0/gems/railties-5.1.6.2/lib/rails/commands/runner/runner_command.rb:37:in `perform'
from /opt/zammad/vendor/bundle/ruby/2.4.0/gems/railties-5.1.6.2/lib/rails/commands/runner/runner_command.rb:37:in `eval'
from /opt/zammad/vendor/bundle/ruby/2.4.0/gems/railties-5.1.6.2/lib/rails/commands/runner/runner_command.rb:37:in `perform'
from /opt/zammad/vendor/bundle/ruby/2.4.0/gems/thor-0.20.3/lib/thor/command.rb:27:in `run'
from /opt/zammad/vendor/bundle/ruby/2.4.0/gems/thor-0.20.3/lib/thor/invocation.rb:126:in `invoke_command'
from /opt/zammad/vendor/bundle/ruby/2.4.0/gems/thor-0.20.3/lib/thor.rb:387:in `dispatch'
from /opt/zammad/vendor/bundle/ruby/2.4.0/gems/railties-5.1.6.2/lib/rails/command/base.rb:63:in `perform'
from /opt/zammad/vendor/bundle/ruby/2.4.0/gems/railties-5.1.6.2/lib/rails/command.rb:44:in `invoke'
from /opt/zammad/vendor/bundle/ruby/2.4.0/gems/railties-5.1.6.2/lib/rails/commands.rb:16:in `<top (required)>'
from /opt/zammad/bin/rails:9:in `require'
from /opt/zammad/bin/rails:9:in `<main>'
```
I can submit the E-Mails wich are not processed, but not here.
What can we do to stop this from happening in the future?
Yes I'm sure this is a bug and no feature request or a general question.
|
1.0
|
ERROR: Can't process email - <!--
Hi there - thanks for filing an issue. Please ensure the following things before creating an issue - thank you! 🤓
Since november 15th we handle all requests, except real bugs, at our community board.
Full explanation: https://community.zammad.org/t/major-change-regarding-github-issues-community-board/21
Please post:
- Feature requests
- Development questions
- Technical questions
on the board -> https://community.zammad.org !
If you think you hit a bug, please continue:
- Search existing issues and the CHANGELOG.md for your issue - there might be a solution already
- Make sure to use the latest version of Zammad if possible
- Add the `log/production.log` file from your system. Attention: Make sure no confidential data is in it!
- Please write the issue in english
- Don't remove the template - otherwise we will close the issue without further comments
- Ask questions about Zammad configuration and usage at our mailinglist. See: https://zammad.org/participate
Note: We always do our best. Unfortunately, sometimes there are too many requests and we can't handle everything at once. If you want to prioritize/escalate your issue, you can do so by means of a support contract (see https://zammad.com/pricing#selfhosted).
* The upper textblock will be removed automatically when you submit your issue *
-->
### Infos:
* Used Zammad version: 2.9.x
* Installation method (source, package, ..): APT
* Operating system: Ubuntu 18.04
* Database + version: ?
* Elasticsearch version: ?
* Browser + version: Chrome lastest
```
~$ sudo zammad run rails r "Channel::EmailParser.process_unprocessable_mails"
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
Encoding conversion failed code converter not found (Windows-1258 to UTF-8)
"ERROR: Can't process email, you will find it for bug reporting under /opt/zammad/tmp/unprocessable_mail/ff6c9cae8fa9fb0cd7cb1c64b5cc2439.eml, please create an issue at https://github.com/zammad/zammad/issues"
"ERROR: #<ArgumentError: invalid byte sequence in UTF-8>"
/opt/zammad/app/models/channel/email_parser.rb:129:in `rescue in process': #<ArgumentError: invalid byte sequence in UTF-8> (RuntimeError)
/opt/zammad/vendor/bundle/ruby/2.4.0/gems/mail-2.6.6/lib/mail/utilities.rb:286:in `=~'
/opt/zammad/vendor/bundle/ruby/2.4.0/gems/mail-2.6.6/lib/mail/utilities.rb:286:in `!~'
/opt/zammad/vendor/bundle/ruby/2.4.0/gems/mail-2.6.6/lib/mail/utilities.rb:286:in `blank?'
/opt/zammad/vendor/bundle/ruby/2.4.0/gems/mail-2.6.6/lib/mail/fields/common/common_address.rb:118:in `block in do_decode'
/opt/zammad/vendor/bundle/ruby/2.4.0/gems/mail-2.6.6/lib/mail/fields/common/common_address.rb:118:in `reject'
/opt/zammad/vendor/bundle/ruby/2.4.0/gems/mail-2.6.6/lib/mail/fields/common/common_address.rb:118:in `do_decode'
/opt/zammad/vendor/bundle/ruby/2.4.0/gems/mail-2.6.6/lib/mail/fields/to_field.rb:51:in `decoded'
/opt/zammad/vendor/bundle/ruby/2.4.0/gems/mail-2.6.6/lib/mail/fields/common/common_field.rb:27:in `to_s'
/opt/zammad/vendor/bundle/ruby/2.4.0/gems/mail-2.6.6/lib/mail/field.rb:158:in `to_s'
/opt/zammad/app/models/channel/filter/identify_sender.rb:123:in `rescue in block in create_recipients'
/opt/zammad/app/models/channel/filter/identify_sender.rb:100:in `block in create_recipients'
/opt/zammad/app/models/channel/filter/identify_sender.rb:97:in `each'
/opt/zammad/app/models/channel/filter/identify_sender.rb:97:in `create_recipients'
/opt/zammad/app/models/channel/filter/identify_sender.rb:64:in `run'
/opt/zammad/app/models/channel/email_parser.rb:148:in `block in _process'
/opt/zammad/app/models/channel/email_parser.rb:145:in `each'
/opt/zammad/app/models/channel/email_parser.rb:145:in `_process'
/opt/zammad/app/models/channel/email_parser.rb:111:in `block in process'
/opt/zammad/vendor/ruby-2.4.4/lib/ruby/2.4.0/timeout.rb:93:in `block in timeout'
/opt/zammad/vendor/ruby-2.4.4/lib/ruby/2.4.0/timeout.rb:33:in `block in catch'
/opt/zammad/vendor/ruby-2.4.4/lib/ruby/2.4.0/timeout.rb:33:in `catch'
/opt/zammad/vendor/ruby-2.4.4/lib/ruby/2.4.0/timeout.rb:33:in `catch'
/opt/zammad/vendor/ruby-2.4.4/lib/ruby/2.4.0/timeout.rb:108:in `timeout'
/opt/zammad/app/models/channel/email_parser.rb:110:in `process'
/opt/zammad/app/models/channel/email_parser.rb:479:in `block in process_unprocessable_mails'
/opt/zammad/app/models/channel/email_parser.rb:478:in `glob'
/opt/zammad/app/models/channel/email_parser.rb:478:in `process_unprocessable_mails'
/opt/zammad/vendor/bundle/ruby/2.4.0/gems/railties-5.1.6.2/lib/rails/commands/runner/runner_command.rb:37:in `perform'
/opt/zammad/vendor/bundle/ruby/2.4.0/gems/railties-5.1.6.2/lib/rails/commands/runner/runner_command.rb:37:in `eval'
/opt/zammad/vendor/bundle/ruby/2.4.0/gems/railties-5.1.6.2/lib/rails/commands/runner/runner_command.rb:37:in `perform'
/opt/zammad/vendor/bundle/ruby/2.4.0/gems/thor-0.20.3/lib/thor/command.rb:27:in `run'
/opt/zammad/vendor/bundle/ruby/2.4.0/gems/thor-0.20.3/lib/thor/invocation.rb:126:in `invoke_command'
/opt/zammad/vendor/bundle/ruby/2.4.0/gems/thor-0.20.3/lib/thor.rb:387:in `dispatch'
/opt/zammad/vendor/bundle/ruby/2.4.0/gems/railties-5.1.6.2/lib/rails/command/base.rb:63:in `perform'
/opt/zammad/vendor/bundle/ruby/2.4.0/gems/railties-5.1.6.2/lib/rails/command.rb:44:in `invoke'
/opt/zammad/vendor/bundle/ruby/2.4.0/gems/railties-5.1.6.2/lib/rails/commands.rb:16:in `<top (required)>'
/opt/zammad/bin/rails:9:in `require'
/opt/zammad/bin/rails:9:in `<main>'
from /opt/zammad/app/models/channel/email_parser.rb:108:in `process'
from /opt/zammad/app/models/channel/email_parser.rb:479:in `block in process_unprocessable_mails'
from /opt/zammad/app/models/channel/email_parser.rb:478:in `glob'
from /opt/zammad/app/models/channel/email_parser.rb:478:in `process_unprocessable_mails'
from /opt/zammad/vendor/bundle/ruby/2.4.0/gems/railties-5.1.6.2/lib/rails/commands/runner/runner_command.rb:37:in `perform'
from /opt/zammad/vendor/bundle/ruby/2.4.0/gems/railties-5.1.6.2/lib/rails/commands/runner/runner_command.rb:37:in `eval'
from /opt/zammad/vendor/bundle/ruby/2.4.0/gems/railties-5.1.6.2/lib/rails/commands/runner/runner_command.rb:37:in `perform'
from /opt/zammad/vendor/bundle/ruby/2.4.0/gems/thor-0.20.3/lib/thor/command.rb:27:in `run'
from /opt/zammad/vendor/bundle/ruby/2.4.0/gems/thor-0.20.3/lib/thor/invocation.rb:126:in `invoke_command'
from /opt/zammad/vendor/bundle/ruby/2.4.0/gems/thor-0.20.3/lib/thor.rb:387:in `dispatch'
from /opt/zammad/vendor/bundle/ruby/2.4.0/gems/railties-5.1.6.2/lib/rails/command/base.rb:63:in `perform'
from /opt/zammad/vendor/bundle/ruby/2.4.0/gems/railties-5.1.6.2/lib/rails/command.rb:44:in `invoke'
from /opt/zammad/vendor/bundle/ruby/2.4.0/gems/railties-5.1.6.2/lib/rails/commands.rb:16:in `<top (required)>'
from /opt/zammad/bin/rails:9:in `require'
from /opt/zammad/bin/rails:9:in `<main>'
```
I can submit the E-Mails wich are not processed, but not here.
What can we do to stop this from happening in the future?
Yes I'm sure this is a bug and no feature request or a general question.
|
process
|
error can t process email hi there thanks for filing an issue please ensure the following things before creating an issue thank you 🤓 since november we handle all requests except real bugs at our community board full explanation please post feature requests development questions technical questions on the board if you think you hit a bug please continue search existing issues and the changelog md for your issue there might be a solution already make sure to use the latest version of zammad if possible add the log production log file from your system attention make sure no confidential data is in it please write the issue in english don t remove the template otherwise we will close the issue without further comments ask questions about zammad configuration and usage at our mailinglist see note we always do our best unfortunately sometimes there are too many requests and we can t handle everything at once if you want to prioritize escalate your issue you can do so by means of a support contract see the upper textblock will be removed automatically when you submit your issue infos used zammad version x installation method source package apt operating system ubuntu database version elasticsearch version browser version chrome lastest sudo zammad run rails r channel emailparser process unprocessable mails encoding conversion failed code converter not found windows to utf encoding conversion failed code converter not found windows to utf encoding conversion failed code converter not found windows to utf encoding conversion failed code converter not found windows to utf encoding conversion failed code converter not found windows to utf encoding conversion failed code converter not found windows to utf encoding conversion failed code converter not found windows to utf encoding conversion failed code converter not found windows to utf encoding conversion failed code converter not found windows to utf encoding conversion failed code converter not found windows to utf encoding conversion failed code converter not found windows to utf encoding conversion failed code converter not found windows to utf encoding conversion failed code converter not found windows to utf encoding conversion failed code converter not found windows to utf encoding conversion failed code converter not found windows to utf encoding conversion failed code converter not found windows to utf encoding conversion failed code converter not found windows to utf encoding conversion failed code converter not found windows to utf encoding conversion failed code converter not found windows to utf encoding conversion failed code converter not found windows to utf encoding conversion failed code converter not found windows to utf encoding conversion failed code converter not found windows to utf encoding conversion failed code converter not found windows to utf encoding conversion failed code converter not found windows to utf encoding conversion failed code converter not found windows to utf encoding conversion failed code converter not found windows to utf encoding conversion failed code converter not found windows to utf encoding conversion failed code converter not found windows to utf encoding conversion failed code converter not found windows to utf encoding conversion failed code converter not found windows to utf encoding conversion failed code converter not found windows to utf encoding conversion failed code converter not found windows to utf encoding conversion failed code converter not found windows to utf encoding conversion failed code converter not found windows to utf encoding conversion failed code converter not found windows to utf encoding conversion failed code converter not found windows to utf encoding conversion failed code converter not found windows to utf encoding conversion failed code converter not found windows to utf encoding conversion failed code converter not found windows to utf encoding conversion failed code converter not found windows to utf encoding conversion failed code converter not found windows to utf encoding conversion failed code converter not found windows to utf encoding conversion failed code converter not found windows to utf encoding conversion failed code converter not found windows to utf encoding conversion failed code converter not found windows to utf encoding conversion failed code converter not found windows to utf encoding conversion failed code converter not found windows to utf encoding conversion failed code converter not found windows to utf encoding conversion failed code converter not found windows to utf encoding conversion failed code converter not found windows to utf encoding conversion failed code converter not found windows to utf encoding conversion failed code converter not found windows to utf encoding conversion failed code converter not found windows to utf encoding conversion failed code converter not found windows to utf encoding conversion failed code converter not found windows to utf encoding conversion failed code converter not found windows to utf encoding conversion failed code converter not found windows to utf encoding conversion failed code converter not found windows to utf encoding conversion failed code converter not found windows to utf encoding conversion failed code converter not found windows to utf encoding conversion failed code converter not found windows to utf encoding conversion failed code converter not found windows to utf encoding conversion failed code converter not found windows to utf encoding conversion failed code converter not found windows to utf encoding conversion failed code converter not found windows to utf encoding conversion failed code converter not found windows to utf encoding conversion failed code converter not found windows to utf encoding conversion failed code converter not found windows to utf encoding conversion failed code converter not found windows to utf encoding conversion failed code converter not found windows to utf encoding conversion failed code converter not found windows to utf encoding conversion failed code converter not found windows to utf encoding conversion failed code converter not found windows to utf encoding conversion failed code converter not found windows to utf encoding conversion failed code converter not found windows to utf encoding conversion failed code converter not found windows to utf encoding conversion failed code converter not found windows to utf encoding conversion failed code converter not found windows to utf encoding conversion failed code converter not found windows to utf encoding conversion failed code converter not found windows to utf encoding conversion failed code converter not found windows to utf encoding conversion failed code converter not found windows to utf encoding conversion failed code converter not found windows to utf encoding conversion failed code converter not found windows to utf encoding conversion failed code converter not found windows to utf encoding conversion failed code converter not found windows to utf encoding conversion failed code converter not found windows to utf encoding conversion failed code converter not found windows to utf encoding conversion failed code converter not found windows to utf encoding conversion failed code converter not found windows to utf encoding conversion failed code converter not found windows to utf encoding conversion failed code converter not found windows to utf encoding conversion failed code converter not found windows to utf encoding conversion failed code converter not found windows to utf encoding conversion failed code converter not found windows to utf encoding conversion failed code converter not found windows to utf encoding conversion failed code converter not found windows to utf encoding conversion failed code converter not found windows to utf encoding conversion failed code converter not found windows to utf encoding conversion failed code converter not found windows to utf encoding conversion failed code converter not found windows to utf encoding conversion failed code converter not found windows to utf encoding conversion failed code converter not found windows to utf encoding conversion failed code converter not found windows to utf encoding conversion failed code converter not found windows to utf encoding conversion failed code converter not found windows to utf encoding conversion failed code converter not found windows to utf encoding conversion failed code converter not found windows to utf encoding conversion failed code converter not found windows to utf encoding conversion failed code converter not found windows to utf encoding conversion failed code converter not found windows to utf encoding conversion failed code converter not found windows to utf encoding conversion failed code converter not found windows to utf encoding conversion failed code converter not found windows to utf encoding conversion failed code converter not found windows to utf encoding conversion failed code converter not found windows to utf encoding conversion failed code converter not found windows to utf encoding conversion failed code converter not found windows to utf encoding conversion failed code converter not found windows to utf encoding conversion failed code converter not found windows to utf encoding conversion failed code converter not found windows to utf encoding conversion failed code converter not found windows to utf encoding conversion failed code converter not found windows to utf encoding conversion failed code converter not found windows to utf encoding conversion failed code converter not found windows to utf encoding conversion failed code converter not found windows to utf encoding conversion failed code converter not found windows to utf encoding conversion failed code converter not found windows to utf encoding conversion failed code converter not found windows to utf encoding conversion failed code converter not found windows to utf encoding conversion failed code converter not found windows to utf encoding conversion failed code converter not found windows to utf encoding conversion failed code converter not found windows to utf encoding conversion failed code converter not found windows to utf encoding conversion failed code converter not found windows to utf encoding conversion failed code converter not found windows to utf encoding conversion failed code converter not found windows to utf encoding conversion failed code converter not found windows to utf encoding conversion failed code converter not found windows to utf encoding conversion failed code converter not found windows to utf encoding conversion failed code converter not found windows to utf encoding conversion failed code converter not found windows to utf encoding conversion failed code converter not found windows to utf encoding conversion failed code converter not found windows to utf encoding conversion failed code converter not found windows to utf encoding conversion failed code converter not found windows to utf encoding conversion failed code converter not found windows to utf encoding conversion failed code converter not found windows to utf encoding conversion failed code converter not found windows to utf encoding conversion failed code converter not found windows to utf encoding conversion failed code converter not found windows to utf encoding conversion failed code converter not found windows to utf encoding conversion failed code converter not found windows to utf encoding conversion failed code converter not found windows to utf encoding conversion failed code converter not found windows to utf encoding conversion failed code converter not found windows to utf encoding conversion failed code converter not found windows to utf encoding conversion failed code converter not found windows to utf encoding conversion failed code converter not found windows to utf encoding conversion failed code converter not found windows to utf encoding conversion failed code converter not found windows to utf encoding conversion failed code converter not found windows to utf encoding conversion failed code converter not found windows to utf encoding conversion failed code converter not found windows to utf encoding conversion failed code converter not found windows to utf encoding conversion failed code converter not found windows to utf encoding conversion failed code converter not found windows to utf encoding conversion failed code converter not found windows to utf encoding conversion failed code converter not found windows to utf encoding conversion failed code converter not found windows to utf encoding conversion failed code converter not found windows to utf encoding conversion failed code converter not found windows to utf encoding conversion failed code converter not found windows to utf encoding conversion failed code converter not found windows to utf encoding conversion failed code converter not found windows to utf encoding conversion failed code converter not found windows to utf encoding conversion failed code converter not found windows to utf encoding conversion failed code converter not found windows to utf encoding conversion failed code converter not found windows to utf encoding conversion failed code converter not found windows to utf encoding conversion failed code converter not found windows to utf encoding conversion failed code converter not found windows to utf encoding conversion failed code converter not found windows to utf encoding conversion failed code converter not found windows to utf encoding conversion failed code converter not found windows to utf encoding conversion failed code converter not found windows to utf encoding conversion failed code converter not found windows to utf encoding conversion failed code converter not found windows to utf encoding conversion failed code converter not found windows to utf encoding conversion failed code converter not found windows to utf encoding conversion failed code converter not found windows to utf encoding conversion failed code converter not found windows to utf encoding conversion failed code converter not found windows to utf encoding conversion failed code converter not found windows to utf encoding conversion failed code converter not found windows to utf encoding conversion failed code converter not found windows to utf encoding conversion failed code converter not found windows to utf encoding conversion failed code converter not found windows to utf encoding conversion failed code converter not found windows to utf encoding conversion failed code converter not found windows to utf encoding conversion failed code converter not found windows to utf encoding conversion failed code converter not found windows to utf encoding conversion failed code converter not found windows to utf encoding conversion failed code converter not found windows to utf encoding conversion failed code converter not found windows to utf encoding conversion failed code converter not found windows to utf error can t process email you will find it for bug reporting under opt zammad tmp unprocessable mail eml please create an issue at error opt zammad app models channel email parser rb in rescue in process runtimeerror opt zammad vendor bundle ruby gems mail lib mail utilities rb in opt zammad vendor bundle ruby gems mail lib mail utilities rb in opt zammad vendor bundle ruby gems mail lib mail utilities rb in blank opt zammad vendor bundle ruby gems mail lib mail fields common common address rb in block in do decode opt zammad vendor bundle ruby gems mail lib mail fields common common address rb in reject opt zammad vendor bundle ruby gems mail lib mail fields common common address rb in do decode opt zammad vendor bundle ruby gems mail lib mail fields to field rb in decoded opt zammad vendor bundle ruby gems mail lib mail fields common common field rb in to s opt zammad vendor bundle ruby gems mail lib mail field rb in to s opt zammad app models channel filter identify sender rb in rescue in block in create recipients opt zammad app models channel filter identify sender rb in block in create recipients opt zammad app models channel filter identify sender rb in each opt zammad app models channel filter identify sender rb in create recipients opt zammad app models channel filter identify sender rb in run opt zammad app models channel email parser rb in block in process opt zammad app models channel email parser rb in each opt zammad app models channel email parser rb in process opt zammad app models channel email parser rb in block in process opt zammad vendor ruby lib ruby timeout rb in block in timeout opt zammad vendor ruby lib ruby timeout rb in block in catch opt zammad vendor ruby lib ruby timeout rb in catch opt zammad vendor ruby lib ruby timeout rb in catch opt zammad vendor ruby lib ruby timeout rb in timeout opt zammad app models channel email parser rb in process opt zammad app models channel email parser rb in block in process unprocessable mails opt zammad app models channel email parser rb in glob opt zammad app models channel email parser rb in process unprocessable mails opt zammad vendor bundle ruby gems railties lib rails commands runner runner command rb in perform opt zammad vendor bundle ruby gems railties lib rails commands runner runner command rb in eval opt zammad vendor bundle ruby gems railties lib rails commands runner runner command rb in perform opt zammad vendor bundle ruby gems thor lib thor command rb in run opt zammad vendor bundle ruby gems thor lib thor invocation rb in invoke command opt zammad vendor bundle ruby gems thor lib thor rb in dispatch opt zammad vendor bundle ruby gems railties lib rails command base rb in perform opt zammad vendor bundle ruby gems railties lib rails command rb in invoke opt zammad vendor bundle ruby gems railties lib rails commands rb in opt zammad bin rails in require opt zammad bin rails in from opt zammad app models channel email parser rb in process from opt zammad app models channel email parser rb in block in process unprocessable mails from opt zammad app models channel email parser rb in glob from opt zammad app models channel email parser rb in process unprocessable mails from opt zammad vendor bundle ruby gems railties lib rails commands runner runner command rb in perform from opt zammad vendor bundle ruby gems railties lib rails commands runner runner command rb in eval from opt zammad vendor bundle ruby gems railties lib rails commands runner runner command rb in perform from opt zammad vendor bundle ruby gems thor lib thor command rb in run from opt zammad vendor bundle ruby gems thor lib thor invocation rb in invoke command from opt zammad vendor bundle ruby gems thor lib thor rb in dispatch from opt zammad vendor bundle ruby gems railties lib rails command base rb in perform from opt zammad vendor bundle ruby gems railties lib rails command rb in invoke from opt zammad vendor bundle ruby gems railties lib rails commands rb in from opt zammad bin rails in require from opt zammad bin rails in i can submit the e mails wich are not processed but not here what can we do to stop this from happening in the future yes i m sure this is a bug and no feature request or a general question
| 1
|
13,021
| 15,376,796,584
|
IssuesEvent
|
2021-03-02 16:22:02
|
geneontology/go-ontology
|
https://api.github.com/repos/geneontology/go-ontology
|
closed
|
missing parent :GO:0140418 effector-mediated modulation of host process by symbiont
|
multi-species process
|
GO:0140418 effector-mediated modulation of host process by symbiont
should be
is_a
GO:0044003 modulation by symbiont of host process
|
1.0
|
missing parent :GO:0140418 effector-mediated modulation of host process by symbiont - GO:0140418 effector-mediated modulation of host process by symbiont
should be
is_a
GO:0044003 modulation by symbiont of host process
|
process
|
missing parent go effector mediated modulation of host process by symbiont go effector mediated modulation of host process by symbiont should be is a go modulation by symbiont of host process
| 1
|
84,372
| 16,487,558,395
|
IssuesEvent
|
2021-05-24 20:27:23
|
bpxe/bpxe
|
https://api.github.com/repos/bpxe/bpxe
|
closed
|
Mock clock leaves lock taken
|
code review
|
This pattern repeats:
https://github.com/bpxe/bpxe/blob/0828554a14ebbdc2c38b7564443c0036b0f05ab0/pkg/clock/mock.go#L53-L59
```
m.Lock()
ch := make(chan time.Time, 1)
if duration.Nanoseconds() <= 0 {
ch <- m.now
close(ch)
return ch
}
```
This leaves the lock taken. Should use
```
m.Lock()
defer m.Unlock()
```
|
1.0
|
Mock clock leaves lock taken - This pattern repeats:
https://github.com/bpxe/bpxe/blob/0828554a14ebbdc2c38b7564443c0036b0f05ab0/pkg/clock/mock.go#L53-L59
```
m.Lock()
ch := make(chan time.Time, 1)
if duration.Nanoseconds() <= 0 {
ch <- m.now
close(ch)
return ch
}
```
This leaves the lock taken. Should use
```
m.Lock()
defer m.Unlock()
```
|
non_process
|
mock clock leaves lock taken this pattern repeats m lock ch make chan time time if duration nanoseconds ch m now close ch return ch this leaves the lock taken should use m lock defer m unlock
| 0
|
12,824
| 15,210,170,044
|
IssuesEvent
|
2021-02-17 06:56:53
|
gfx-rs/naga
|
https://api.github.com/repos/gfx-rs/naga
|
opened
|
Move global usage flags to the Analysis
|
area: middle area: processing kind: refactor
|
This is one of the things that belong to "artifacts" of the IR analysis. We used to keep it in the IR itself, but we can now move it to `FunctionInfo` instead.
The tricky part here is that figuring out the global use may require recursive descent into the expression tree, while currently Analysis is only produced by a linear scan.
|
1.0
|
Move global usage flags to the Analysis - This is one of the things that belong to "artifacts" of the IR analysis. We used to keep it in the IR itself, but we can now move it to `FunctionInfo` instead.
The tricky part here is that figuring out the global use may require recursive descent into the expression tree, while currently Analysis is only produced by a linear scan.
|
process
|
move global usage flags to the analysis this is one of the things that belong to artifacts of the ir analysis we used to keep it in the ir itself but we can now move it to functioninfo instead the tricky part here is that figuring out the global use may require recursive descent into the expression tree while currently analysis is only produced by a linear scan
| 1
|
249,744
| 26,968,544,072
|
IssuesEvent
|
2023-02-09 01:27:43
|
monch1962/test-data-mgmt-spike
|
https://api.github.com/repos/monch1962/test-data-mgmt-spike
|
opened
|
CVE-2020-25659 (Medium) detected in cryptography-2.9.2-cp27-cp27mu-manylinux2010_x86_64.whl
|
security vulnerability
|
## CVE-2020-25659 - Medium Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>cryptography-2.9.2-cp27-cp27mu-manylinux2010_x86_64.whl</b></p></summary>
<p>cryptography is a package which provides cryptographic recipes and primitives to Python developers.</p>
<p>Library home page: <a href="https://files.pythonhosted.org/packages/45/f9/ee6878ab822eef403a4282c8ce80d56e3121c9576a6544377df809363b50/cryptography-2.9.2-cp27-cp27mu-manylinux2010_x86_64.whl">https://files.pythonhosted.org/packages/45/f9/ee6878ab822eef403a4282c8ce80d56e3121c9576a6544377df809363b50/cryptography-2.9.2-cp27-cp27mu-manylinux2010_x86_64.whl</a></p>
<p>Path to dependency file: /requirements.txt</p>
<p>Path to vulnerable library: /requirements.txt</p>
<p>
Dependency Hierarchy:
- snowflake_sqlalchemy-1.2.4-py2.py3-none-any.whl (Root Library)
- snowflake-connector-python-2.1.3.tar.gz
- :x: **cryptography-2.9.2-cp27-cp27mu-manylinux2010_x86_64.whl** (Vulnerable Library)
<p>Found in base branch: <b>main</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
python-cryptography 3.2 is vulnerable to Bleichenbacher timing attacks in the RSA decryption API, via timed processing of valid PKCS#1 v1.5 ciphertext.
<p>Publish Date: 2021-01-11
<p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2020-25659>CVE-2020-25659</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>5.9</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: High
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: High
- Integrity Impact: None
- Availability Impact: None
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://github.com/pyca/cryptography/security/advisories/GHSA-hggm-jpg3-v476">https://github.com/pyca/cryptography/security/advisories/GHSA-hggm-jpg3-v476</a></p>
<p>Release Date: 2021-01-11</p>
<p>Fix Resolution (cryptography): 3.2</p>
<p>Direct dependency fix Resolution (snowflake-sqlalchemy): 1.2.5</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with Mend [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
|
True
|
CVE-2020-25659 (Medium) detected in cryptography-2.9.2-cp27-cp27mu-manylinux2010_x86_64.whl - ## CVE-2020-25659 - Medium Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>cryptography-2.9.2-cp27-cp27mu-manylinux2010_x86_64.whl</b></p></summary>
<p>cryptography is a package which provides cryptographic recipes and primitives to Python developers.</p>
<p>Library home page: <a href="https://files.pythonhosted.org/packages/45/f9/ee6878ab822eef403a4282c8ce80d56e3121c9576a6544377df809363b50/cryptography-2.9.2-cp27-cp27mu-manylinux2010_x86_64.whl">https://files.pythonhosted.org/packages/45/f9/ee6878ab822eef403a4282c8ce80d56e3121c9576a6544377df809363b50/cryptography-2.9.2-cp27-cp27mu-manylinux2010_x86_64.whl</a></p>
<p>Path to dependency file: /requirements.txt</p>
<p>Path to vulnerable library: /requirements.txt</p>
<p>
Dependency Hierarchy:
- snowflake_sqlalchemy-1.2.4-py2.py3-none-any.whl (Root Library)
- snowflake-connector-python-2.1.3.tar.gz
- :x: **cryptography-2.9.2-cp27-cp27mu-manylinux2010_x86_64.whl** (Vulnerable Library)
<p>Found in base branch: <b>main</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
python-cryptography 3.2 is vulnerable to Bleichenbacher timing attacks in the RSA decryption API, via timed processing of valid PKCS#1 v1.5 ciphertext.
<p>Publish Date: 2021-01-11
<p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2020-25659>CVE-2020-25659</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>5.9</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: High
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: High
- Integrity Impact: None
- Availability Impact: None
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://github.com/pyca/cryptography/security/advisories/GHSA-hggm-jpg3-v476">https://github.com/pyca/cryptography/security/advisories/GHSA-hggm-jpg3-v476</a></p>
<p>Release Date: 2021-01-11</p>
<p>Fix Resolution (cryptography): 3.2</p>
<p>Direct dependency fix Resolution (snowflake-sqlalchemy): 1.2.5</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with Mend [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
|
non_process
|
cve medium detected in cryptography whl cve medium severity vulnerability vulnerable library cryptography whl cryptography is a package which provides cryptographic recipes and primitives to python developers library home page a href path to dependency file requirements txt path to vulnerable library requirements txt dependency hierarchy snowflake sqlalchemy none any whl root library snowflake connector python tar gz x cryptography whl vulnerable library found in base branch main vulnerability details python cryptography is vulnerable to bleichenbacher timing attacks in the rsa decryption api via timed processing of valid pkcs ciphertext publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity high privileges required none user interaction none scope unchanged impact metrics confidentiality impact high integrity impact none availability impact none for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution cryptography direct dependency fix resolution snowflake sqlalchemy step up your open source security game with mend
| 0
|
416,975
| 28,107,982,409
|
IssuesEvent
|
2023-03-31 03:32:31
|
FastCampus-MoReturn/final-project-Be
|
https://api.github.com/repos/FastCampus-MoReturn/final-project-Be
|
closed
|
Jenkins CICD setting
|
For: Backend For: CI/CD Priority: Low Status: Available Type: Documentation
|
## Title
Swagger 문서 배포를 통한 Api 명세서 공유
## Description
Jenkins를 활용하여 api 명세서의 변동사항을 실시간으로 배포하여 확인가능하게 진행
## Tasks
- [x] Jenkins init
## References
<!--
-[link text](link address)
-->
|
1.0
|
Jenkins CICD setting - ## Title
Swagger 문서 배포를 통한 Api 명세서 공유
## Description
Jenkins를 활용하여 api 명세서의 변동사항을 실시간으로 배포하여 확인가능하게 진행
## Tasks
- [x] Jenkins init
## References
<!--
-[link text](link address)
-->
|
non_process
|
jenkins cicd setting title swagger 문서 배포를 통한 api 명세서 공유 description jenkins를 활용하여 api 명세서의 변동사항을 실시간으로 배포하여 확인가능하게 진행 tasks jenkins init references link address
| 0
|
6,456
| 9,546,553,131
|
IssuesEvent
|
2019-05-01 20:17:13
|
openopps/openopps-platform
|
https://api.github.com/repos/openopps/openopps-platform
|
closed
|
Department of State: Education & Transcript
|
Apply Process Approved Requirements Ready State Dept.
|
Who: Student Applicant
What: Education and Transcript Information
Why: The student is required to provide education and transcript information
A/C
- There will be a header "Education & Transcripts" (Bold)
- "All fields are required" (In the right margin)
- There will be a header "Education"
- Under the header will be a statement- "Please tell us about your current academic standing and list any other education that you have completed to date."
- There will be 3 questions with Yes and No radio buttons
- All Questions will be blank (no default)
- Are you currently enrolled (part-time or full-time) or accepted for enrollment in an accredited college or university?
- Will you be, at a minimum, a college or university junior (i.e. have completed 60 or more undergraduate semester credit hours or 90 or more undergraduate quarter credit hours) by the start of the intern session for which you are applying?
- Will you continue your education after this internship has been completed?
- There will be a 4th question with a text box
- What is your cumulative GPA on a 4.0 scale?
- This box will only allow numbers up to a 4.0
- If the user enters 0 to 2.99 they will be eliminated from the process and receive the ineligible due to GPA message #2874
- There will be a card for each degree with the following information
- A +sign that will expand the card to expose more information
- The type of degree (Master's Degree, Bachelor's Degree) which will be a link to the user's USAJOBS profile
- The name of the college (Bold)
- The date of the degree
- The "+ Add education" link will take the user to a blank "Education and Transctipt" page where the user can enter education information.
- https://opm.invisionapp.com/d/main/#/console/15360465/319289342/preview
- "All fields are required unless otherwise noted" in the right margin
- There will be a warning box in the right rail:
- You must only list degrees from accredited schools or other education programs
- Look up your school (this is a link) at the U.S. Department of Education. This link will open in a new window and link to:
Note: Any new education added during the application process for Open Opps will only be stored on Open Opps and will not be udpated in the USAJOBS profile.
Invision Mock: https://opm.invisionapp.com/d/main/#/console/15360465/319289334/preview
Public Link: https://opm.invisionapp.com/share/ZEPNZR09Q54
|
1.0
|
Department of State: Education & Transcript - Who: Student Applicant
What: Education and Transcript Information
Why: The student is required to provide education and transcript information
A/C
- There will be a header "Education & Transcripts" (Bold)
- "All fields are required" (In the right margin)
- There will be a header "Education"
- Under the header will be a statement- "Please tell us about your current academic standing and list any other education that you have completed to date."
- There will be 3 questions with Yes and No radio buttons
- All Questions will be blank (no default)
- Are you currently enrolled (part-time or full-time) or accepted for enrollment in an accredited college or university?
- Will you be, at a minimum, a college or university junior (i.e. have completed 60 or more undergraduate semester credit hours or 90 or more undergraduate quarter credit hours) by the start of the intern session for which you are applying?
- Will you continue your education after this internship has been completed?
- There will be a 4th question with a text box
- What is your cumulative GPA on a 4.0 scale?
- This box will only allow numbers up to a 4.0
- If the user enters 0 to 2.99 they will be eliminated from the process and receive the ineligible due to GPA message #2874
- There will be a card for each degree with the following information
- A +sign that will expand the card to expose more information
- The type of degree (Master's Degree, Bachelor's Degree) which will be a link to the user's USAJOBS profile
- The name of the college (Bold)
- The date of the degree
- The "+ Add education" link will take the user to a blank "Education and Transctipt" page where the user can enter education information.
- https://opm.invisionapp.com/d/main/#/console/15360465/319289342/preview
- "All fields are required unless otherwise noted" in the right margin
- There will be a warning box in the right rail:
- You must only list degrees from accredited schools or other education programs
- Look up your school (this is a link) at the U.S. Department of Education. This link will open in a new window and link to:
Note: Any new education added during the application process for Open Opps will only be stored on Open Opps and will not be udpated in the USAJOBS profile.
Invision Mock: https://opm.invisionapp.com/d/main/#/console/15360465/319289334/preview
Public Link: https://opm.invisionapp.com/share/ZEPNZR09Q54
|
process
|
department of state education transcript who student applicant what education and transcript information why the student is required to provide education and transcript information a c there will be a header education transcripts bold all fields are required in the right margin there will be a header education under the header will be a statement please tell us about your current academic standing and list any other education that you have completed to date there will be questions with yes and no radio buttons all questions will be blank no default are you currently enrolled part time or full time or accepted for enrollment in an accredited college or university will you be at a minimum a college or university junior i e have completed or more undergraduate semester credit hours or or more undergraduate quarter credit hours by the start of the intern session for which you are applying will you continue your education after this internship has been completed there will be a question with a text box what is your cumulative gpa on a scale this box will only allow numbers up to a if the user enters to they will be eliminated from the process and receive the ineligible due to gpa message there will be a card for each degree with the following information a sign that will expand the card to expose more information the type of degree master s degree bachelor s degree which will be a link to the user s usajobs profile the name of the college bold the date of the degree the add education link will take the user to a blank education and transctipt page where the user can enter education information all fields are required unless otherwise noted in the right margin there will be a warning box in the right rail you must only list degrees from accredited schools or other education programs look up your school this is a link at the u s department of education this link will open in a new window and link to note any new education added during the application process for open opps will only be stored on open opps and will not be udpated in the usajobs profile invision mock public link
| 1
|
19,763
| 26,138,112,333
|
IssuesEvent
|
2022-12-29 14:41:16
|
vesoft-inc/nebula
|
https://api.github.com/repos/vesoft-inc/nebula
|
reopened
|
Incorrect result on edge list join query
|
type/bug severity/major find/automation affects/master process/fixed
|
**Please check the FAQ documentation before raising an issue**
<!-- Please check the [FAQ](https://docs.nebula-graph.com.cn/master/20.appendix/0.FAQ/) documentation and old issues before raising an issue in case someone has asked the same question that you are asking. -->
**Describe the bug (__required__)**
Look at the query below:
```txt
(root@nebula) [gdlancer]> match (v1)-[e*1..2]->(v2) where id(v1) in [6] match (v2)-[e*1..2]->(v1) return count(*)
+----------+
| count(*) |
+----------+
| 0 |
+----------+
Got 1 rows (time spent 2.736ms/16.574125ms)
Thu, 29 Dec 2022 14:03:49 CST
```
We can see that Nebula return no result, if we change the return clause into `return size(e)`, it shows that there are 6 rows in the result:
```txt
(root@nebula) [gdlancer]> match (v1)-[e*1..2]->(v2) where id(v1) in [6] match (v2)-[e*1..2]->(v1) return size(e)
+---------+
| size(e) |
+---------+
| 2 |
| 2 |
| 2 |
| 2 |
| 2 |
| 2 |
+---------+
Got 6 rows (time spent 2.583ms/18.1315ms)
Thu, 29 Dec 2022 14:05:48 CST
```
in contrast the same query on the same dataset in Neo4j return 6 rows:
```txt
$ match (v1)-[e*1..2]->(v2) where v1.id in [6] match (v2)-[e*1..2]->(v1) return count(*)
╒══════════╕
│"count(*)"│
╞══════════╡
│6 │
└──────────┘
```
<!-- A clear and concise description of what the bug is. -->
**Your Environments (__required__)**
* OS: `uname -a`
* Compiler: `g++ --version` or `clang++ --version`
* CPU: `lscpu`
* Commit id (e.g. `a3ffc7d8`) 967a8c9e0 (community edition)
**How To Reproduce(__required__)**
Steps to reproduce the behavior:
1. Step 1
2. Step 2
3. Step 3
**Expected behavior**
<!-- A clear and concise description of what you expected to happen. -->
**Additional context**
<!-- Provide logs and configs, or any other context to trace the problem. -->
|
1.0
|
Incorrect result on edge list join query - **Please check the FAQ documentation before raising an issue**
<!-- Please check the [FAQ](https://docs.nebula-graph.com.cn/master/20.appendix/0.FAQ/) documentation and old issues before raising an issue in case someone has asked the same question that you are asking. -->
**Describe the bug (__required__)**
Look at the query below:
```txt
(root@nebula) [gdlancer]> match (v1)-[e*1..2]->(v2) where id(v1) in [6] match (v2)-[e*1..2]->(v1) return count(*)
+----------+
| count(*) |
+----------+
| 0 |
+----------+
Got 1 rows (time spent 2.736ms/16.574125ms)
Thu, 29 Dec 2022 14:03:49 CST
```
We can see that Nebula return no result, if we change the return clause into `return size(e)`, it shows that there are 6 rows in the result:
```txt
(root@nebula) [gdlancer]> match (v1)-[e*1..2]->(v2) where id(v1) in [6] match (v2)-[e*1..2]->(v1) return size(e)
+---------+
| size(e) |
+---------+
| 2 |
| 2 |
| 2 |
| 2 |
| 2 |
| 2 |
+---------+
Got 6 rows (time spent 2.583ms/18.1315ms)
Thu, 29 Dec 2022 14:05:48 CST
```
in contrast the same query on the same dataset in Neo4j return 6 rows:
```txt
$ match (v1)-[e*1..2]->(v2) where v1.id in [6] match (v2)-[e*1..2]->(v1) return count(*)
╒══════════╕
│"count(*)"│
╞══════════╡
│6 │
└──────────┘
```
<!-- A clear and concise description of what the bug is. -->
**Your Environments (__required__)**
* OS: `uname -a`
* Compiler: `g++ --version` or `clang++ --version`
* CPU: `lscpu`
* Commit id (e.g. `a3ffc7d8`) 967a8c9e0 (community edition)
**How To Reproduce(__required__)**
Steps to reproduce the behavior:
1. Step 1
2. Step 2
3. Step 3
**Expected behavior**
<!-- A clear and concise description of what you expected to happen. -->
**Additional context**
<!-- Provide logs and configs, or any other context to trace the problem. -->
|
process
|
incorrect result on edge list join query please check the faq documentation before raising an issue describe the bug required look at the query below txt root nebula match where id in match return count count got rows time spent thu dec cst we can see that nebula return no result if we change the return clause into return size e it shows that there are rows in the result txt root nebula match where id in match return size e size e got rows time spent thu dec cst in contrast the same query on the same dataset in return rows txt match where id in match return count ╒══════════╕ │ count │ ╞══════════╡ │ │ └──────────┘ your environments required os uname a compiler g version or clang version cpu lscpu commit id e g community edition how to reproduce required steps to reproduce the behavior step step step expected behavior additional context
| 1
|
14,657
| 17,778,387,764
|
IssuesEvent
|
2021-08-30 22:49:58
|
esmero/strawberryfield
|
https://api.github.com/repos/esmero/strawberryfield
|
closed
|
Bug: identify is not running on Draft ADOs
|
bug JSON Postprocessors
|
# Bug
.. and because of that, since the as:structure does get filled up, it never runs, even after publishing. This feels like a regression or a bad mad If() condition. Needs fix before 1.0.0-RC1 is made public
@dmer @alliomeria will fix today
|
1.0
|
Bug: identify is not running on Draft ADOs - # Bug
.. and because of that, since the as:structure does get filled up, it never runs, even after publishing. This feels like a regression or a bad mad If() condition. Needs fix before 1.0.0-RC1 is made public
@dmer @alliomeria will fix today
|
process
|
bug identify is not running on draft ados bug and because of that since the as structure does get filled up it never runs even after publishing this feels like a regression or a bad mad if condition needs fix before is made public dmer alliomeria will fix today
| 1
|
2,193
| 5,038,422,761
|
IssuesEvent
|
2016-12-18 08:03:40
|
AllenFang/react-bootstrap-table
|
https://api.github.com/repos/AllenFang/react-bootstrap-table
|
closed
|
Layout error while expanding row in table has hidden column.
|
bug inprocess
|
This error occurs when we have a hidden column and expand row simultaneously.
In ExpandRow example, I add a hidden property in a Table Header Column:
``<TableHeaderColumn dataField='price' hidden>``
when we expand a row, hidden column will become a empty block just like the screenshot below:

|
1.0
|
Layout error while expanding row in table has hidden column. - This error occurs when we have a hidden column and expand row simultaneously.
In ExpandRow example, I add a hidden property in a Table Header Column:
``<TableHeaderColumn dataField='price' hidden>``
when we expand a row, hidden column will become a empty block just like the screenshot below:

|
process
|
layout error while expanding row in table has hidden column this error occurs when we have a hidden column and expand row simultaneously in expandrow example i add a hidden property in a table header column when we expand a row hidden column will become a empty block just like the screenshot below
| 1
|
4,651
| 7,495,777,928
|
IssuesEvent
|
2018-04-08 01:13:34
|
gkiar/reading
|
https://api.github.com/repos/gkiar/reading
|
closed
|
Paper: newpaper
|
neuroscience processing reproducibility stats to read
|
URL: [http://testurl.io](http://testurl.io)
## This paper does...
test note
## This paper does not...
test note
## Other comments?
it works!
|
1.0
|
Paper: newpaper - URL: [http://testurl.io](http://testurl.io)
## This paper does...
test note
## This paper does not...
test note
## Other comments?
it works!
|
process
|
paper newpaper url this paper does test note this paper does not test note other comments it works
| 1
|
55,162
| 13,966,034,997
|
IssuesEvent
|
2020-10-26 01:03:24
|
RG4421/kafka-service-interface
|
https://api.github.com/repos/RG4421/kafka-service-interface
|
closed
|
CVE-2020-9488 (Low) detected in log4j-1.2.17.jar - autoclosed
|
security vulnerability
|
## CVE-2020-9488 - Low Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>log4j-1.2.17.jar</b></p></summary>
<p>Apache Log4j 1.2</p>
<p>Path to dependency file: kafka-service-interface/pom.xml</p>
<p>Path to vulnerable library: canner/.m2/repository/log4j/log4j/1.2.17/log4j-1.2.17.jar</p>
<p>
Dependency Hierarchy:
- :x: **log4j-1.2.17.jar** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/RG4421/kafka-service-interface/commit/bfbb0ebe0bda1730284f1eae2783153ddb4a5f8a">bfbb0ebe0bda1730284f1eae2783153ddb4a5f8a</a></p>
<p>Found in base branch: <b>master</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/low_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
Improper validation of certificate with host mismatch in Apache Log4j SMTP appender. This could allow an SMTPS connection to be intercepted by a man-in-the-middle attack which could leak any log messages sent through that appender.
<p>Publish Date: 2020-04-27
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-9488>CVE-2020-9488</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>3.7</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: High
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: Low
- Integrity Impact: None
- Availability Impact: None
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://issues.apache.org/jira/browse/LOG4J2-2819">https://issues.apache.org/jira/browse/LOG4J2-2819</a></p>
<p>Release Date: 2020-04-27</p>
<p>Fix Resolution: org.apache.logging.log4j:log4j-core:2.13.2</p>
</p>
</details>
<p></p>
***
:rescue_worker_helmet: Automatic Remediation is available for this issue
<!-- <REMEDIATE>{"isOpenPROnVulnerability":true,"isPackageBased":true,"isDefaultBranch":true,"packages":[{"packageType":"Java","groupId":"log4j","packageName":"log4j","packageVersion":"1.2.17","isTransitiveDependency":false,"dependencyTree":"log4j:log4j:1.2.17","isMinimumFixVersionAvailable":true,"minimumFixVersion":"org.apache.logging.log4j:log4j-core:2.13.2"}],"vulnerabilityIdentifier":"CVE-2020-9488","vulnerabilityDetails":"Improper validation of certificate with host mismatch in Apache Log4j SMTP appender. This could allow an SMTPS connection to be intercepted by a man-in-the-middle attack which could leak any log messages sent through that appender.","vulnerabilityUrl":"https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-9488","cvss3Severity":"low","cvss3Score":"3.7","cvss3Metrics":{"A":"None","AC":"High","PR":"None","S":"Unchanged","C":"Low","UI":"None","AV":"Network","I":"None"},"extraData":{}}</REMEDIATE> -->
|
True
|
CVE-2020-9488 (Low) detected in log4j-1.2.17.jar - autoclosed - ## CVE-2020-9488 - Low Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>log4j-1.2.17.jar</b></p></summary>
<p>Apache Log4j 1.2</p>
<p>Path to dependency file: kafka-service-interface/pom.xml</p>
<p>Path to vulnerable library: canner/.m2/repository/log4j/log4j/1.2.17/log4j-1.2.17.jar</p>
<p>
Dependency Hierarchy:
- :x: **log4j-1.2.17.jar** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/RG4421/kafka-service-interface/commit/bfbb0ebe0bda1730284f1eae2783153ddb4a5f8a">bfbb0ebe0bda1730284f1eae2783153ddb4a5f8a</a></p>
<p>Found in base branch: <b>master</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/low_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
Improper validation of certificate with host mismatch in Apache Log4j SMTP appender. This could allow an SMTPS connection to be intercepted by a man-in-the-middle attack which could leak any log messages sent through that appender.
<p>Publish Date: 2020-04-27
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-9488>CVE-2020-9488</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>3.7</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: High
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: Low
- Integrity Impact: None
- Availability Impact: None
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://issues.apache.org/jira/browse/LOG4J2-2819">https://issues.apache.org/jira/browse/LOG4J2-2819</a></p>
<p>Release Date: 2020-04-27</p>
<p>Fix Resolution: org.apache.logging.log4j:log4j-core:2.13.2</p>
</p>
</details>
<p></p>
***
:rescue_worker_helmet: Automatic Remediation is available for this issue
<!-- <REMEDIATE>{"isOpenPROnVulnerability":true,"isPackageBased":true,"isDefaultBranch":true,"packages":[{"packageType":"Java","groupId":"log4j","packageName":"log4j","packageVersion":"1.2.17","isTransitiveDependency":false,"dependencyTree":"log4j:log4j:1.2.17","isMinimumFixVersionAvailable":true,"minimumFixVersion":"org.apache.logging.log4j:log4j-core:2.13.2"}],"vulnerabilityIdentifier":"CVE-2020-9488","vulnerabilityDetails":"Improper validation of certificate with host mismatch in Apache Log4j SMTP appender. This could allow an SMTPS connection to be intercepted by a man-in-the-middle attack which could leak any log messages sent through that appender.","vulnerabilityUrl":"https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-9488","cvss3Severity":"low","cvss3Score":"3.7","cvss3Metrics":{"A":"None","AC":"High","PR":"None","S":"Unchanged","C":"Low","UI":"None","AV":"Network","I":"None"},"extraData":{}}</REMEDIATE> -->
|
non_process
|
cve low detected in jar autoclosed cve low severity vulnerability vulnerable library jar apache path to dependency file kafka service interface pom xml path to vulnerable library canner repository jar dependency hierarchy x jar vulnerable library found in head commit a href found in base branch master vulnerability details improper validation of certificate with host mismatch in apache smtp appender this could allow an smtps connection to be intercepted by a man in the middle attack which could leak any log messages sent through that appender publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity high privileges required none user interaction none scope unchanged impact metrics confidentiality impact low integrity impact none availability impact none for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution org apache logging core rescue worker helmet automatic remediation is available for this issue isopenpronvulnerability true ispackagebased true isdefaultbranch true packages vulnerabilityidentifier cve vulnerabilitydetails improper validation of certificate with host mismatch in apache smtp appender this could allow an smtps connection to be intercepted by a man in the middle attack which could leak any log messages sent through that appender vulnerabilityurl
| 0
|
196,127
| 14,813,773,603
|
IssuesEvent
|
2021-01-14 02:57:35
|
SHUReeducation/autoAPI
|
https://api.github.com/repos/SHUReeducation/autoAPI
|
closed
|
复杂查询测试
|
feature request medium medium priority test
|
<!--
下面的内容可以使用中文或者英文填写。
-->
<!--
You can fill the following things by using English or Chinese.
-->
<!--
注意:请不要要求我们支持Java,我们永远不会考虑支持Java。
-->
<!--
Warning: Never ask us to support Java, we'll never do that.
-->
**Describe the solution you'd like**
添加对复杂查询的测试。
**Additional context**
可以在 #56 找到添加功能测试的方法。
|
1.0
|
复杂查询测试 - <!--
下面的内容可以使用中文或者英文填写。
-->
<!--
You can fill the following things by using English or Chinese.
-->
<!--
注意:请不要要求我们支持Java,我们永远不会考虑支持Java。
-->
<!--
Warning: Never ask us to support Java, we'll never do that.
-->
**Describe the solution you'd like**
添加对复杂查询的测试。
**Additional context**
可以在 #56 找到添加功能测试的方法。
|
non_process
|
复杂查询测试 下面的内容可以使用中文或者英文填写。 you can fill the following things by using english or chinese 注意:请不要要求我们支持java,我们永远不会考虑支持java。 warning never ask us to support java we ll never do that describe the solution you d like 添加对复杂查询的测试。 additional context 可以在 找到添加功能测试的方法。
| 0
|
19,057
| 25,075,873,569
|
IssuesEvent
|
2022-11-07 15:27:16
|
Tencent/tdesign-miniprogram
|
https://api.github.com/repos/Tencent/tdesign-miniprogram
|
closed
|
t-calendar 日历组件
|
bug good first issue in process
|
### tdesign-miniprogram 版本
0.24.0-beta.0
### 重现链接
_No response_
### 重现步骤
开发者工具上,在日历弹出后,可以通过鼠标滚轮滚动进行日期的上下翻页查看,但是不能通过触摸滑动进行类似的操作,这样导致真机上无法进行日历的上下操作
### 期望结果
希望添加触摸事件的操作,真机上可以进行日历的上下操作
### 实际结果
_No response_
### 框架版本
原生小程序
### 浏览器版本
微信开发者工具 stable 1.06.2206090
### 系统版本
win10家庭中文版
### Node版本
16.13.1
### 补充说明
_No response_
|
1.0
|
t-calendar 日历组件 - ### tdesign-miniprogram 版本
0.24.0-beta.0
### 重现链接
_No response_
### 重现步骤
开发者工具上,在日历弹出后,可以通过鼠标滚轮滚动进行日期的上下翻页查看,但是不能通过触摸滑动进行类似的操作,这样导致真机上无法进行日历的上下操作
### 期望结果
希望添加触摸事件的操作,真机上可以进行日历的上下操作
### 实际结果
_No response_
### 框架版本
原生小程序
### 浏览器版本
微信开发者工具 stable 1.06.2206090
### 系统版本
win10家庭中文版
### Node版本
16.13.1
### 补充说明
_No response_
|
process
|
t calendar 日历组件 tdesign miniprogram 版本 beta 重现链接 no response 重现步骤 开发者工具上,在日历弹出后,可以通过鼠标滚轮滚动进行日期的上下翻页查看,但是不能通过触摸滑动进行类似的操作,这样导致真机上无法进行日历的上下操作 期望结果 希望添加触摸事件的操作,真机上可以进行日历的上下操作 实际结果 no response 框架版本 原生小程序 浏览器版本 微信开发者工具 stable 系统版本 node版本 补充说明 no response
| 1
|
97,765
| 12,261,054,363
|
IssuesEvent
|
2020-05-06 19:23:14
|
meloneminze/famly
|
https://api.github.com/repos/meloneminze/famly
|
closed
|
Create logo
|
design
|
### User Story
As a developer and as a brand I need a logo!
### Clarification
##To do
Create :
- [ ] ducks with a nicer pecker
## Criteria
- [ ] Use design prototype
- [ ] Use storybook to build the components
- [ ] different size
### Material | Images
Design:
https://xd.adobe.com/view/7f86dc69-703e-48db-70d4-8d3d07276926-09b7/
Links examples:

|
1.0
|
Create logo - ### User Story
As a developer and as a brand I need a logo!
### Clarification
##To do
Create :
- [ ] ducks with a nicer pecker
## Criteria
- [ ] Use design prototype
- [ ] Use storybook to build the components
- [ ] different size
### Material | Images
Design:
https://xd.adobe.com/view/7f86dc69-703e-48db-70d4-8d3d07276926-09b7/
Links examples:

|
non_process
|
create logo user story as a developer and as a brand i need a logo clarification to do create ducks with a nicer pecker criteria use design prototype use storybook to build the components different size material images design links examples
| 0
|
363,989
| 10,757,622,847
|
IssuesEvent
|
2019-10-31 13:37:27
|
infor-design/enterprise
|
https://api.github.com/repos/infor-design/enterprise
|
closed
|
Toolbar contrast is changing in iOS devices
|
[1] priority: minor status: cant reproduce type: bug :bug:
|
<!-- Please be aware that this is a publicly visible bug report. Do not post any credentials, screenshots with proprietary information, or anything you think shouldn't be visible to the world. If private information is required to be shared for a quality bug report, please email one of the [code owners](https://github.com/infor-design/enterprise/blob/master/.github/CODEOWNERS) directly. -->
**Describe the bug**
Toolbar contrast is changing in iOS devices
**To Reproduce**
Steps to reproduce the behavior:
1. Open iOS Device
2, Go to http://master-enterprise.demo.design.infor.com/components/toolbar/example-full-toolbar.html?
3. Notice that there is a difference in contrast in the label in the toolbar
**Expected behavior**
Color and contrast should be consistent
**Version**
Beta 4.23
**Screenshots**
If applicable, add screenshots to help explain your problem.

**Platform**
iOS Mobile Devices
Tested in iPhone XR
|
1.0
|
Toolbar contrast is changing in iOS devices - <!-- Please be aware that this is a publicly visible bug report. Do not post any credentials, screenshots with proprietary information, or anything you think shouldn't be visible to the world. If private information is required to be shared for a quality bug report, please email one of the [code owners](https://github.com/infor-design/enterprise/blob/master/.github/CODEOWNERS) directly. -->
**Describe the bug**
Toolbar contrast is changing in iOS devices
**To Reproduce**
Steps to reproduce the behavior:
1. Open iOS Device
2, Go to http://master-enterprise.demo.design.infor.com/components/toolbar/example-full-toolbar.html?
3. Notice that there is a difference in contrast in the label in the toolbar
**Expected behavior**
Color and contrast should be consistent
**Version**
Beta 4.23
**Screenshots**
If applicable, add screenshots to help explain your problem.

**Platform**
iOS Mobile Devices
Tested in iPhone XR
|
non_process
|
toolbar contrast is changing in ios devices describe the bug toolbar contrast is changing in ios devices to reproduce steps to reproduce the behavior open ios device go to notice that there is a difference in contrast in the label in the toolbar expected behavior color and contrast should be consistent version beta screenshots if applicable add screenshots to help explain your problem platform ios mobile devices tested in iphone xr
| 0
|
207,041
| 7,123,990,058
|
IssuesEvent
|
2018-01-19 17:09:58
|
carbon-design-system/carbon-components
|
https://api.github.com/repos/carbon-design-system/carbon-components
|
closed
|
Interesting “double outinline” on DataTable
|
bug priority: high
|
`.bx--overflow-menu:focus` has a box-shadow and the child svg has one as well `.bx--data-table-v2 .bx--overflow-menu:focus .bx--overflow-menu__icon`

|
1.0
|
Interesting “double outinline” on DataTable - `.bx--overflow-menu:focus` has a box-shadow and the child svg has one as well `.bx--data-table-v2 .bx--overflow-menu:focus .bx--overflow-menu__icon`

|
non_process
|
interesting “double outinline” on datatable bx overflow menu focus has a box shadow and the child svg has one as well bx data table bx overflow menu focus bx overflow menu icon
| 0
|
19,463
| 25,758,410,415
|
IssuesEvent
|
2022-12-08 18:15:43
|
mdsreq-fga-unb/2022.2-Receitalista
|
https://api.github.com/repos/mdsreq-fga-unb/2022.2-Receitalista
|
closed
|
Processo de Desenvolvimento de Software
|
Processo
|
O que o processo descrito tem a ver com o projeto de vcs? que características são compatíveis ou incompatíveis, por exemplo?
Lembrem-se do CASOS que foram avaliados em sala.
|
1.0
|
Processo de Desenvolvimento de Software - O que o processo descrito tem a ver com o projeto de vcs? que características são compatíveis ou incompatíveis, por exemplo?
Lembrem-se do CASOS que foram avaliados em sala.
|
process
|
processo de desenvolvimento de software o que o processo descrito tem a ver com o projeto de vcs que características são compatíveis ou incompatíveis por exemplo lembrem se do casos que foram avaliados em sala
| 1
|
1,713
| 4,351,896,400
|
IssuesEvent
|
2016-08-01 02:44:00
|
VietOpenCPS/opencps
|
https://api.github.com/repos/VietOpenCPS/opencps
|
closed
|
Sau khi tạo mới mẫu hồ sơ, nhấn nút lưu thì hệ thống không giữ nguyên màn hình
|
bug fixed service_process
|
Sau khi tạo mới mẫu hồ sơ, nhấn nút lưu thì hệ thống không giữ nguyên màn hình (hiện tại đang quay lại danh sách mẫu hồ sơ)
|
1.0
|
Sau khi tạo mới mẫu hồ sơ, nhấn nút lưu thì hệ thống không giữ nguyên màn hình - Sau khi tạo mới mẫu hồ sơ, nhấn nút lưu thì hệ thống không giữ nguyên màn hình (hiện tại đang quay lại danh sách mẫu hồ sơ)
|
process
|
sau khi tạo mới mẫu hồ sơ nhấn nút lưu thì hệ thống không giữ nguyên màn hình sau khi tạo mới mẫu hồ sơ nhấn nút lưu thì hệ thống không giữ nguyên màn hình hiện tại đang quay lại danh sách mẫu hồ sơ
| 1
|
342,863
| 24,759,505,416
|
IssuesEvent
|
2022-10-21 21:36:24
|
OpenZeppelin/nile
|
https://api.github.com/repos/OpenZeppelin/nile
|
closed
|
DOC: Update CONTRIBUTING guide to describe project setup
|
documentation
|
This is a documentation related issue. I think it would be great to have the steps for setting up the project before contributing:
Ex: Should dependencies be installed through `pip install .` or `python3 setup.py develop`. Tox must be installed manually after this? Etc...
|
1.0
|
DOC: Update CONTRIBUTING guide to describe project setup - This is a documentation related issue. I think it would be great to have the steps for setting up the project before contributing:
Ex: Should dependencies be installed through `pip install .` or `python3 setup.py develop`. Tox must be installed manually after this? Etc...
|
non_process
|
doc update contributing guide to describe project setup this is a documentation related issue i think it would be great to have the steps for setting up the project before contributing ex should dependencies be installed through pip install or setup py develop tox must be installed manually after this etc
| 0
|
392,074
| 26,922,954,730
|
IssuesEvent
|
2023-02-07 11:49:22
|
serenity-js/serenity-js
|
https://api.github.com/repos/serenity-js/serenity-js
|
closed
|
Serenity/JS 3.0 - protractor-cucumber example broken
|
documentation good-first-issue
|
Hello all,
I have a similar problem like #954, but the error comes from protractor-cucumber example.
To reproduce:
- Start with a clean repo, by downloading the latest build.
- npm i to install all dependencies.
- npm run test in folder 'serenity-js/examples/protractor-cucumber'
- We'll get the following error:
```
[test:acceptance] > @examples/protractor-cucumber@3.0.0 test:acceptance /Users/ruwang/workcenter/serenity-js_v3/examples/protractor-cucumber
[test:acceptance] > protractor ./protractor.conf.js
[test:acceptance]
[test:acceptance] [18:45:33] I/launcher - Running 1 instances of WebDriver
[test:acceptance] [18:45:33] I/direct - Using ChromeDriver directly...
[test:acceptance] [18:45:34] E/launcher - Error: /Users/ruwang/workcenter/serenity-js_v3/examples/protractor-cucumber/features/support/setup.ts:1
[test:acceptance] import { setDefaultTimeout } from '@cucumber/cucumber';
[test:acceptance] ^^^^^^
[test:acceptance]
[test:acceptance] SyntaxError: Cannot use import statement outside a module
[test:acceptance] at wrapSafe (internal/modules/cjs/loader.js:979:16)
[test:acceptance] at Module._compile (internal/modules/cjs/loader.js:1027:27)
[test:acceptance] at Object.Module._extensions..js (internal/modules/cjs/loader.js:1092:10)
[test:acceptance] at Module.load (internal/modules/cjs/loader.js:928:32)
[test:acceptance] at Function.Module._load (internal/modules/cjs/loader.js:769:14)
[test:acceptance] at Module.require (internal/modules/cjs/loader.js:952:19)
[test:acceptance] at require (internal/modules/cjs/helpers.js:88:18)
[test:acceptance] at /Users/ruwang/workcenter/serenity-js_v3/examples/protractor-cucumber/node_modules/@cucumber/cucumber/src/api/support.ts:28:30
[test:acceptance] at Array.map (<anonymous>)
[test:acceptance] at getSupportCodeLibrary (/Users/ruwang/workcenter/serenity-js_v3/examples/protractor-cucumber/node_modules/@cucumber/cucumber/src/api/support.ts:28:16)
[test:acceptance] [18:45:34] E/launcher - Process exited with error code 100
[test:acceptance] npm ERR! code ELIFECYCLE
[test:acceptance] npm ERR! errno 100
[test:acceptance] npm ERR! @examples/protractor-cucumber@3.0.0 test:acceptance: `protractor ./protractor.conf.js`
[test:acceptance] npm ERR! Exit status 100
[test:acceptance] npm ERR!
[test:acceptance] npm ERR! Failed at the @examples/protractor-cucumber@3.0.0 test:acceptance script.
[test:acceptance] npm ERR! This is probably not a problem with npm. There is likely additional logging output above.
[test:acceptance]
[test:acceptance] npm ERR! A complete log of this run can be found in:
[test:acceptance] npm ERR! /Users/ruwang/.npm/_logs/2023-02-03T23_45_34_563Z-debug.log
[failsafe] Script 'test:acceptance' exited with code 100
```
Indeed, in my personal project with 2.33.10, it doesn't have the same problem. All is fine.
What's more, I tried the 'examples/protractor-mocha' and 'examples/protractor-jasmine' etc, all is fine as well.
I'm running the test on Mac OS Ventura 13.1, Node v14.15.3.
Please let me know if anything I can do on my end.
Thank you so much!
|
1.0
|
Serenity/JS 3.0 - protractor-cucumber example broken - Hello all,
I have a similar problem like #954, but the error comes from protractor-cucumber example.
To reproduce:
- Start with a clean repo, by downloading the latest build.
- npm i to install all dependencies.
- npm run test in folder 'serenity-js/examples/protractor-cucumber'
- We'll get the following error:
```
[test:acceptance] > @examples/protractor-cucumber@3.0.0 test:acceptance /Users/ruwang/workcenter/serenity-js_v3/examples/protractor-cucumber
[test:acceptance] > protractor ./protractor.conf.js
[test:acceptance]
[test:acceptance] [18:45:33] I/launcher - Running 1 instances of WebDriver
[test:acceptance] [18:45:33] I/direct - Using ChromeDriver directly...
[test:acceptance] [18:45:34] E/launcher - Error: /Users/ruwang/workcenter/serenity-js_v3/examples/protractor-cucumber/features/support/setup.ts:1
[test:acceptance] import { setDefaultTimeout } from '@cucumber/cucumber';
[test:acceptance] ^^^^^^
[test:acceptance]
[test:acceptance] SyntaxError: Cannot use import statement outside a module
[test:acceptance] at wrapSafe (internal/modules/cjs/loader.js:979:16)
[test:acceptance] at Module._compile (internal/modules/cjs/loader.js:1027:27)
[test:acceptance] at Object.Module._extensions..js (internal/modules/cjs/loader.js:1092:10)
[test:acceptance] at Module.load (internal/modules/cjs/loader.js:928:32)
[test:acceptance] at Function.Module._load (internal/modules/cjs/loader.js:769:14)
[test:acceptance] at Module.require (internal/modules/cjs/loader.js:952:19)
[test:acceptance] at require (internal/modules/cjs/helpers.js:88:18)
[test:acceptance] at /Users/ruwang/workcenter/serenity-js_v3/examples/protractor-cucumber/node_modules/@cucumber/cucumber/src/api/support.ts:28:30
[test:acceptance] at Array.map (<anonymous>)
[test:acceptance] at getSupportCodeLibrary (/Users/ruwang/workcenter/serenity-js_v3/examples/protractor-cucumber/node_modules/@cucumber/cucumber/src/api/support.ts:28:16)
[test:acceptance] [18:45:34] E/launcher - Process exited with error code 100
[test:acceptance] npm ERR! code ELIFECYCLE
[test:acceptance] npm ERR! errno 100
[test:acceptance] npm ERR! @examples/protractor-cucumber@3.0.0 test:acceptance: `protractor ./protractor.conf.js`
[test:acceptance] npm ERR! Exit status 100
[test:acceptance] npm ERR!
[test:acceptance] npm ERR! Failed at the @examples/protractor-cucumber@3.0.0 test:acceptance script.
[test:acceptance] npm ERR! This is probably not a problem with npm. There is likely additional logging output above.
[test:acceptance]
[test:acceptance] npm ERR! A complete log of this run can be found in:
[test:acceptance] npm ERR! /Users/ruwang/.npm/_logs/2023-02-03T23_45_34_563Z-debug.log
[failsafe] Script 'test:acceptance' exited with code 100
```
Indeed, in my personal project with 2.33.10, it doesn't have the same problem. All is fine.
What's more, I tried the 'examples/protractor-mocha' and 'examples/protractor-jasmine' etc, all is fine as well.
I'm running the test on Mac OS Ventura 13.1, Node v14.15.3.
Please let me know if anything I can do on my end.
Thank you so much!
|
non_process
|
serenity js protractor cucumber example broken hello all i have a similar problem like but the error comes from protractor cucumber example to reproduce start with a clean repo by downloading the latest build npm i to install all dependencies npm run test in folder serenity js examples protractor cucumber we ll get the following error examples protractor cucumber test acceptance users ruwang workcenter serenity js examples protractor cucumber protractor protractor conf js i launcher running instances of webdriver i direct using chromedriver directly e launcher error users ruwang workcenter serenity js examples protractor cucumber features support setup ts import setdefaulttimeout from cucumber cucumber syntaxerror cannot use import statement outside a module at wrapsafe internal modules cjs loader js at module compile internal modules cjs loader js at object module extensions js internal modules cjs loader js at module load internal modules cjs loader js at function module load internal modules cjs loader js at module require internal modules cjs loader js at require internal modules cjs helpers js at users ruwang workcenter serenity js examples protractor cucumber node modules cucumber cucumber src api support ts at array map at getsupportcodelibrary users ruwang workcenter serenity js examples protractor cucumber node modules cucumber cucumber src api support ts e launcher process exited with error code npm err code elifecycle npm err errno npm err examples protractor cucumber test acceptance protractor protractor conf js npm err exit status npm err npm err failed at the examples protractor cucumber test acceptance script npm err this is probably not a problem with npm there is likely additional logging output above npm err a complete log of this run can be found in npm err users ruwang npm logs debug log script test acceptance exited with code indeed in my personal project with it doesn t have the same problem all is fine what s more i tried the examples protractor mocha and examples protractor jasmine etc all is fine as well i m running the test on mac os ventura node please let me know if anything i can do on my end thank you so much
| 0
|
3,291
| 6,384,795,269
|
IssuesEvent
|
2017-08-03 06:36:35
|
rubberduck-vba/Rubberduck
|
https://api.github.com/repos/rubberduck-vba/Rubberduck
|
closed
|
False Inspection when procedure scoped variable name matches class member name
|
bug feature-inspections parse-tree-processing
|
This is a resolver issue that is exposed as a false positive inspection result for _Return Value for member 'x' is never assigned_.
The following code produces 2 false-positive inspection results:
- Return value for member 'Foo' is never assigned
- Return value for member 'Bar' is never assigned
- Return value for member 'Fizz' is never assigned
If you rename the variables in `Fizz`, so they don't match the names of the properties, then the inspections are not present.
``` vb
Option Explicit
Public Property Get Foo() As Long
Foo = 5
End Property
Public Property Get Bar() As Long
Bar = 6
End Property
Public Function Fizz() As Long
Fizz = 2
End Function
Public Sub Buzz()
Dim Foo As Long
Dim Bar As Long
Dim Fizz As Long
Foo = 7
Bar = 8
Fizz = 9
Debug.Print Foo, Bar, Fizz '7 8 9
End Sub
```
|
1.0
|
False Inspection when procedure scoped variable name matches class member name - This is a resolver issue that is exposed as a false positive inspection result for _Return Value for member 'x' is never assigned_.
The following code produces 2 false-positive inspection results:
- Return value for member 'Foo' is never assigned
- Return value for member 'Bar' is never assigned
- Return value for member 'Fizz' is never assigned
If you rename the variables in `Fizz`, so they don't match the names of the properties, then the inspections are not present.
``` vb
Option Explicit
Public Property Get Foo() As Long
Foo = 5
End Property
Public Property Get Bar() As Long
Bar = 6
End Property
Public Function Fizz() As Long
Fizz = 2
End Function
Public Sub Buzz()
Dim Foo As Long
Dim Bar As Long
Dim Fizz As Long
Foo = 7
Bar = 8
Fizz = 9
Debug.Print Foo, Bar, Fizz '7 8 9
End Sub
```
|
process
|
false inspection when procedure scoped variable name matches class member name this is a resolver issue that is exposed as a false positive inspection result for return value for member x is never assigned the following code produces false positive inspection results return value for member foo is never assigned return value for member bar is never assigned return value for member fizz is never assigned if you rename the variables in fizz so they don t match the names of the properties then the inspections are not present vb option explicit public property get foo as long foo end property public property get bar as long bar end property public function fizz as long fizz end function public sub buzz dim foo as long dim bar as long dim fizz as long foo bar fizz debug print foo bar fizz end sub
| 1
|
13,781
| 16,540,558,068
|
IssuesEvent
|
2021-05-27 16:16:00
|
NixOS/nixpkgs
|
https://api.github.com/repos/NixOS/nixpkgs
|
closed
|
21.05 Feature Freeze
|
6.topic: release process
|
Pinging all language, framework, and ecosystem owners to consolidate feature freeze items for the 21.05 release.
Please mention any items you see blocking the 21.05 release in your given domains. The branch off date will be the on the 21st of May. So there is still some time to address these items.
Nix/nix-cli ecosystem: @edolstra @grahamc @nbp @Profpatsch
Mobile: @samueldr
Nixos Modules / internals : @Infinisil @matthewbauer @Ericson2314 @orivej
Nixos tests: @tfc
Marketing: @garbas
Emacs: @adisbladis
Erlang: @gleber @NixOS/beam
Go: @kalbasit @Mic92 @zowoq
Haskell: @NixOS/haskell
Python: @FRidh
Perl: @stigtsp
php: @NixOS/php
Ruby: @alyssais
rust: @bhipple @Mic92 @andir @LnL7
Darwin: @NixOS/darwin-maintainers
bazel: @mboes
blockchains @mmahut
podman: @NixOS/podman
Gnome: @jtojnar @NixOS/gnome
Qt / KDE: @ttuegel @NixOS/qt-kde
Postgres: @thoughtpolice
in case I forgot anyone: @NixOS/nixpkgs-committers
Anyone is free to propose potential blockers, but I would ask that you remember that this is a volunteer organization. Unless someone is likely to "pick up" the work and address the concern in the coming weeks, please only state critical issues. Or if anyone is active in a given ecosystem and I did not mention them, then they are free to state that there's unlikely to be any concerns as well.
|
1.0
|
21.05 Feature Freeze - Pinging all language, framework, and ecosystem owners to consolidate feature freeze items for the 21.05 release.
Please mention any items you see blocking the 21.05 release in your given domains. The branch off date will be the on the 21st of May. So there is still some time to address these items.
Nix/nix-cli ecosystem: @edolstra @grahamc @nbp @Profpatsch
Mobile: @samueldr
Nixos Modules / internals : @Infinisil @matthewbauer @Ericson2314 @orivej
Nixos tests: @tfc
Marketing: @garbas
Emacs: @adisbladis
Erlang: @gleber @NixOS/beam
Go: @kalbasit @Mic92 @zowoq
Haskell: @NixOS/haskell
Python: @FRidh
Perl: @stigtsp
php: @NixOS/php
Ruby: @alyssais
rust: @bhipple @Mic92 @andir @LnL7
Darwin: @NixOS/darwin-maintainers
bazel: @mboes
blockchains @mmahut
podman: @NixOS/podman
Gnome: @jtojnar @NixOS/gnome
Qt / KDE: @ttuegel @NixOS/qt-kde
Postgres: @thoughtpolice
in case I forgot anyone: @NixOS/nixpkgs-committers
Anyone is free to propose potential blockers, but I would ask that you remember that this is a volunteer organization. Unless someone is likely to "pick up" the work and address the concern in the coming weeks, please only state critical issues. Or if anyone is active in a given ecosystem and I did not mention them, then they are free to state that there's unlikely to be any concerns as well.
|
process
|
feature freeze pinging all language framework and ecosystem owners to consolidate feature freeze items for the release please mention any items you see blocking the release in your given domains the branch off date will be the on the of may so there is still some time to address these items nix nix cli ecosystem edolstra grahamc nbp profpatsch mobile samueldr nixos modules internals infinisil matthewbauer orivej nixos tests tfc marketing garbas emacs adisbladis erlang gleber nixos beam go kalbasit zowoq haskell nixos haskell python fridh perl stigtsp php nixos php ruby alyssais rust bhipple andir darwin nixos darwin maintainers bazel mboes blockchains mmahut podman nixos podman gnome jtojnar nixos gnome qt kde ttuegel nixos qt kde postgres thoughtpolice in case i forgot anyone nixos nixpkgs committers anyone is free to propose potential blockers but i would ask that you remember that this is a volunteer organization unless someone is likely to pick up the work and address the concern in the coming weeks please only state critical issues or if anyone is active in a given ecosystem and i did not mention them then they are free to state that there s unlikely to be any concerns as well
| 1
|
102,213
| 31,862,163,660
|
IssuesEvent
|
2023-09-15 11:45:41
|
PaddlePaddle/Paddle
|
https://api.github.com/repos/PaddlePaddle/Paddle
|
closed
|
编译paddle2.5失败
|
status/new-issue type/build
|
### 问题描述 Issue Description
问题(1): 官方没有paddle2.5.1+cuda12.0在python3.7下的whl包,能帮忙提供吗?感谢!
问题(2)如图,在python3.7中编译paddle release/2.5报错,取掉-DCUDA_ARCH_NAME=All则ok; 但是我需要编译出python37下通用版本的whl, 所以我理解CUDA_ARCH_NAME是必要的?麻烦帮忙看下,感谢!
cmake .. -DPY_VERSION=3.7 -DWITH_GPU=ON -DCUDA_ARCH_NAME=All
编译环境:docker pull paddlepaddle/paddle:2.5.1-gpu-cuda12.0-cudnn8.9-trt8.6

### 版本&环境信息 Version & Environment Information
paddle relase/2.5 cuda12.0
用的官方环境:paddlepaddle/paddle:2.5.1-gpu-cuda12.0-cudnn8.9-trt8.6
|
1.0
|
编译paddle2.5失败 - ### 问题描述 Issue Description
问题(1): 官方没有paddle2.5.1+cuda12.0在python3.7下的whl包,能帮忙提供吗?感谢!
问题(2)如图,在python3.7中编译paddle release/2.5报错,取掉-DCUDA_ARCH_NAME=All则ok; 但是我需要编译出python37下通用版本的whl, 所以我理解CUDA_ARCH_NAME是必要的?麻烦帮忙看下,感谢!
cmake .. -DPY_VERSION=3.7 -DWITH_GPU=ON -DCUDA_ARCH_NAME=All
编译环境:docker pull paddlepaddle/paddle:2.5.1-gpu-cuda12.0-cudnn8.9-trt8.6

### 版本&环境信息 Version & Environment Information
paddle relase/2.5 cuda12.0
用的官方环境:paddlepaddle/paddle:2.5.1-gpu-cuda12.0-cudnn8.9-trt8.6
|
non_process
|
问题描述 issue description 问题 ,能帮忙提供吗?感谢! 问题 如图, release ,取掉 dcuda arch name all则ok , 所以我理解cuda arch name是必要的?麻烦帮忙看下,感谢! cmake dpy version dwith gpu on dcuda arch name all 编译环境:docker pull paddlepaddle paddle gpu 版本 环境信息 version environment information paddle relase 用的官方环境:paddlepaddle paddle gpu
| 0
|
68,997
| 3,294,830,807
|
IssuesEvent
|
2015-10-31 11:51:47
|
RedMatterAxe/My-Tycoon
|
https://api.github.com/repos/RedMatterAxe/My-Tycoon
|
closed
|
Player should be able to move on from Story manually
|
bug Fix Priority: MEDIUM
|
Player should be able to move on from Story manually rather than it being on a timer.
|
1.0
|
Player should be able to move on from Story manually - Player should be able to move on from Story manually rather than it being on a timer.
|
non_process
|
player should be able to move on from story manually player should be able to move on from story manually rather than it being on a timer
| 0
|
9,933
| 12,969,983,067
|
IssuesEvent
|
2020-07-21 08:38:30
|
prisma/prisma
|
https://api.github.com/repos/prisma/prisma
|
opened
|
Switch Client to use Unix Domain Sockets
|
kind/feature process/candidate topic: internal topic: performance
|
Now that [we switched our HTTP client to undici](https://github.com/prisma/prisma/issues/2890) we can also start supporting UDS.
This should be able to be controlled with a flag to Prisma Client, so we can first test it in isolation and then later switch back to the old implementation to compare the performance or allow people to fall back to the proven implementation if there happens to be an incompatibility or problem. (The parameter does not have to be documented and officially be part of the API).
|
1.0
|
Switch Client to use Unix Domain Sockets - Now that [we switched our HTTP client to undici](https://github.com/prisma/prisma/issues/2890) we can also start supporting UDS.
This should be able to be controlled with a flag to Prisma Client, so we can first test it in isolation and then later switch back to the old implementation to compare the performance or allow people to fall back to the proven implementation if there happens to be an incompatibility or problem. (The parameter does not have to be documented and officially be part of the API).
|
process
|
switch client to use unix domain sockets now that we can also start supporting uds this should be able to be controlled with a flag to prisma client so we can first test it in isolation and then later switch back to the old implementation to compare the performance or allow people to fall back to the proven implementation if there happens to be an incompatibility or problem the parameter does not have to be documented and officially be part of the api
| 1
|
16,313
| 20,968,369,515
|
IssuesEvent
|
2022-03-28 09:02:34
|
DevExpress/testcafe-hammerhead
|
https://api.github.com/repos/DevExpress/testcafe-hammerhead
|
closed
|
Error: "TypeError: window.location.toString is not a function"
|
TYPE: bug AREA: client SYSTEM: event simulation FREQUENCY: level 2 SYSTEM: iframe processing
|
The issue has been reproduced here:
https://github.com/jsnanigans/hammerhead_testcafe_error
### What is your Test Scenario?
Speculation: A iframe is focused, the iframe sends a message to the main window, the main window closes the iframe before testcafe can focus the main window.
> A Iframe is opened where the user enters some data and clicks on submit, then the iframe is redirected with 307 to where some javascript sends a postMessage to our host window which closes the iframe. After the iframe is closed and this error occurs something in the redux dispatch or in the react core breaks and the state is not updated/the app is not re-rendered.
> After some debugging we found that it might have something to do with hammerhead.js internals and some focus method.
> Important things to note here are that this happens if the iframe has a src (like in this example) or has no src attribute (like in our application)
### What is the Current behavior?
error is thrown and prevents react from rendering
### What is the Expected behavior?
the main window is focused in testcafe and continues as usual
### What is your web application and your TestCafe test code?
The issue has been reproduced here:
https://github.com/jsnanigans/hammerhead_testcafe_error
### Steps to Reproduce:
1. clone repo: https://github.com/jsnanigans/hammerhead_testcafe_error
1. run npm install
1. run npm start to start the local server on port 3000.
1. run testcafe -c 1 chrome tests/Sample.ts --skip-js-errors to start testcafe.
1. when the test stops for debugging, open the console in the browser, there you will see the error.
### Your Environment details:
* testcafe version: 1.6.1
* node.js version: v12.11.1
* command-line arguments: `testcafe -c 1 chrome tests/Sample.ts --skip-js-errors`
* browser name and version: Google Chrome Version 78.0.3904.87 (Official Build) (64-bit)
* platform and version: macOS 10.14.6 (18G103)
|
1.0
|
Error: "TypeError: window.location.toString is not a function" - The issue has been reproduced here:
https://github.com/jsnanigans/hammerhead_testcafe_error
### What is your Test Scenario?
Speculation: A iframe is focused, the iframe sends a message to the main window, the main window closes the iframe before testcafe can focus the main window.
> A Iframe is opened where the user enters some data and clicks on submit, then the iframe is redirected with 307 to where some javascript sends a postMessage to our host window which closes the iframe. After the iframe is closed and this error occurs something in the redux dispatch or in the react core breaks and the state is not updated/the app is not re-rendered.
> After some debugging we found that it might have something to do with hammerhead.js internals and some focus method.
> Important things to note here are that this happens if the iframe has a src (like in this example) or has no src attribute (like in our application)
### What is the Current behavior?
error is thrown and prevents react from rendering
### What is the Expected behavior?
the main window is focused in testcafe and continues as usual
### What is your web application and your TestCafe test code?
The issue has been reproduced here:
https://github.com/jsnanigans/hammerhead_testcafe_error
### Steps to Reproduce:
1. clone repo: https://github.com/jsnanigans/hammerhead_testcafe_error
1. run npm install
1. run npm start to start the local server on port 3000.
1. run testcafe -c 1 chrome tests/Sample.ts --skip-js-errors to start testcafe.
1. when the test stops for debugging, open the console in the browser, there you will see the error.
### Your Environment details:
* testcafe version: 1.6.1
* node.js version: v12.11.1
* command-line arguments: `testcafe -c 1 chrome tests/Sample.ts --skip-js-errors`
* browser name and version: Google Chrome Version 78.0.3904.87 (Official Build) (64-bit)
* platform and version: macOS 10.14.6 (18G103)
|
process
|
error typeerror window location tostring is not a function the issue has been reproduced here what is your test scenario speculation a iframe is focused the iframe sends a message to the main window the main window closes the iframe before testcafe can focus the main window a iframe is opened where the user enters some data and clicks on submit then the iframe is redirected with to where some javascript sends a postmessage to our host window which closes the iframe after the iframe is closed and this error occurs something in the redux dispatch or in the react core breaks and the state is not updated the app is not re rendered after some debugging we found that it might have something to do with hammerhead js internals and some focus method important things to note here are that this happens if the iframe has a src like in this example or has no src attribute like in our application what is the current behavior error is thrown and prevents react from rendering what is the expected behavior the main window is focused in testcafe and continues as usual what is your web application and your testcafe test code the issue has been reproduced here steps to reproduce clone repo run npm install run npm start to start the local server on port run testcafe c chrome tests sample ts skip js errors to start testcafe when the test stops for debugging open the console in the browser there you will see the error your environment details testcafe version node js version command line arguments testcafe c chrome tests sample ts skip js errors browser name and version google chrome version official build bit platform and version macos
| 1
|
9,309
| 12,322,752,013
|
IssuesEvent
|
2020-05-13 10:54:24
|
Torbjornsson/DATX05-Master_Thesis
|
https://api.github.com/repos/Torbjornsson/DATX05-Master_Thesis
|
closed
|
Process/Testing
|
Section: Process
|
- [x] Technical solutions (Hangouts etc.)
- [x] Planning the tests
- [x] Consent form
- [x] Changes from plan bcz corona
- [x] The final test plan
- [x] "Pilot"-testing
- [x] How the tests went
- [x] Possible outliers (?)
|
1.0
|
Process/Testing - - [x] Technical solutions (Hangouts etc.)
- [x] Planning the tests
- [x] Consent form
- [x] Changes from plan bcz corona
- [x] The final test plan
- [x] "Pilot"-testing
- [x] How the tests went
- [x] Possible outliers (?)
|
process
|
process testing technical solutions hangouts etc planning the tests consent form changes from plan bcz corona the final test plan pilot testing how the tests went possible outliers
| 1
|
19,379
| 25,516,409,375
|
IssuesEvent
|
2022-11-28 16:40:22
|
carbon-design-system/carbon-platform
|
https://api.github.com/repos/carbon-design-system/carbon-platform
|
opened
|
rmdx: track usages of column offset and/or span
|
role: dev 🤖 type: enhancement 💡 service: rmdx-processing 🖨️
|
**Summary**
We don't want to support rmdx `Column` spans and offsets long-term. To do this, we need to be aware of when they are used. We should log a warning log when one is encountered and set up a notification to the slack channel for it.
|
1.0
|
rmdx: track usages of column offset and/or span - **Summary**
We don't want to support rmdx `Column` spans and offsets long-term. To do this, we need to be aware of when they are used. We should log a warning log when one is encountered and set up a notification to the slack channel for it.
|
process
|
rmdx track usages of column offset and or span summary we don t want to support rmdx column spans and offsets long term to do this we need to be aware of when they are used we should log a warning log when one is encountered and set up a notification to the slack channel for it
| 1
|
18,312
| 24,424,050,254
|
IssuesEvent
|
2022-10-05 23:56:22
|
googleapis/gapic-generator-python
|
https://api.github.com/repos/googleapis/gapic-generator-python
|
opened
|
test `x-goog-request-params` over REST
|
type: process priority: p2
|
If the new annotation `google.api.routing` ([AIP 4222](https://google.aip.dev/client-libraries/4222)) is used ([example](https://github.com/googleapis/googleapis/blob/7b5a467b978ff2dde6cd34717ebad5728d05f2bb/google/bigtable/v2/bigtable.proto#L66)), then I believe we need to include `x-goog-request-params`, with a value as specified in that annotation, over REST.
If no `google.api.routing` are presented, then it sounds like we could omit `x-goog-request-params` because the information would duplicate the URL.
I'm not sure yet whether the Google use cases make this a P1. More discussion in the internal email thread.
This issue is split off from #1444, which now focuses on the gRPC use case (though that fix will also be relevant to the implementation here).
|
1.0
|
test `x-goog-request-params` over REST - If the new annotation `google.api.routing` ([AIP 4222](https://google.aip.dev/client-libraries/4222)) is used ([example](https://github.com/googleapis/googleapis/blob/7b5a467b978ff2dde6cd34717ebad5728d05f2bb/google/bigtable/v2/bigtable.proto#L66)), then I believe we need to include `x-goog-request-params`, with a value as specified in that annotation, over REST.
If no `google.api.routing` are presented, then it sounds like we could omit `x-goog-request-params` because the information would duplicate the URL.
I'm not sure yet whether the Google use cases make this a P1. More discussion in the internal email thread.
This issue is split off from #1444, which now focuses on the gRPC use case (though that fix will also be relevant to the implementation here).
|
process
|
test x goog request params over rest if the new annotation google api routing is used then i believe we need to include x goog request params with a value as specified in that annotation over rest if no google api routing are presented then it sounds like we could omit x goog request params because the information would duplicate the url i m not sure yet whether the google use cases make this a more discussion in the internal email thread this issue is split off from which now focuses on the grpc use case though that fix will also be relevant to the implementation here
| 1
|
1,008
| 3,475,340,717
|
IssuesEvent
|
2015-12-25 14:27:02
|
NuCivic/dkan
|
https://api.github.com/repos/NuCivic/dkan
|
closed
|
Bug: Dataset search includes non-dataset content types
|
bug Processed/JIRA
|
## Description
Dataset search includes all content types.
## Steps to Reproduce
1. Create a Data Dashboard
2. Go to /datasets
3. See it listed
###
Search API's new bundle filter seems to fail randomly. A solution would be to update the search index to index content type dataset view to only allow datasets
|
1.0
|
Bug: Dataset search includes non-dataset content types - ## Description
Dataset search includes all content types.
## Steps to Reproduce
1. Create a Data Dashboard
2. Go to /datasets
3. See it listed
###
Search API's new bundle filter seems to fail randomly. A solution would be to update the search index to index content type dataset view to only allow datasets
|
process
|
bug dataset search includes non dataset content types description dataset search includes all content types steps to reproduce create a data dashboard go to datasets see it listed search api s new bundle filter seems to fail randomly a solution would be to update the search index to index content type dataset view to only allow datasets
| 1
|
17,193
| 22,772,380,380
|
IssuesEvent
|
2022-07-08 11:17:12
|
geneontology/go-ontology
|
https://api.github.com/repos/geneontology/go-ontology
|
closed
|
NTR 'suppression by virus of host interferon-mediated signaling pathway'
|
New term request multi-species process
|
Hello,
The @geneontology/multiorganism-working-group would like to create a sibling term to 'GO:0039502 suppression by virus of host type I interferon-mediated signaling pathway', suppression by virus of host type I Iinterferon-mediated signaling pathway'
and a parent term, 'suppression by virus of host interferon-mediated signaling pathway'
Thanks, Pascale
|
1.0
|
NTR 'suppression by virus of host interferon-mediated signaling pathway' - Hello,
The @geneontology/multiorganism-working-group would like to create a sibling term to 'GO:0039502 suppression by virus of host type I interferon-mediated signaling pathway', suppression by virus of host type I Iinterferon-mediated signaling pathway'
and a parent term, 'suppression by virus of host interferon-mediated signaling pathway'
Thanks, Pascale
|
process
|
ntr suppression by virus of host interferon mediated signaling pathway hello the geneontology multiorganism working group would like to create a sibling term to go suppression by virus of host type i interferon mediated signaling pathway suppression by virus of host type i iinterferon mediated signaling pathway and a parent term suppression by virus of host interferon mediated signaling pathway thanks pascale
| 1
|
9,291
| 12,306,079,316
|
IssuesEvent
|
2020-05-12 00:20:14
|
kubeflow/testing
|
https://api.github.com/repos/kubeflow/testing
|
opened
|
NFS is full - tests are failing
|
area/engprod kind/process priority/p0
|
NFS is full.
@Jeffwan said he would take care of this once we grant him permission;
PR to add him to the ci-team is pending in kubeflow/internal-acls#242
|
1.0
|
NFS is full - tests are failing - NFS is full.
@Jeffwan said he would take care of this once we grant him permission;
PR to add him to the ci-team is pending in kubeflow/internal-acls#242
|
process
|
nfs is full tests are failing nfs is full jeffwan said he would take care of this once we grant him permission pr to add him to the ci team is pending in kubeflow internal acls
| 1
|
7,748
| 10,864,298,898
|
IssuesEvent
|
2019-11-14 16:36:49
|
qgis/QGIS-Documentation
|
https://api.github.com/repos/qgis/QGIS-Documentation
|
closed
|
[FEATURE][processing] New algorithm "Overlap analysis"
|
3.8 Automatic new feature Easy Processing Alg
|
Original commit: https://github.com/qgis/QGIS/commit/2ec429cb890242d29e04908fa0eabe74bbb3a899 by nyalldawson
This algorithm calculates the area and percentage cover
by which features from an input layer are overlapped by
features from a selection of overlay layers.
New attributes are added to the output layer reporting
the total area of overlap and percentage of the input
feature overlapped by each of the selected overlay layers.
This is quite a common GIS task request, yet is full
of traps for inexperienced users, and the amount of
manual data work usually done by users to calculate
these figures can often lead to mistakes and inaccurate
results. It's nice to have a robust, fast, inbuilt
algorithm which allows this task to be done in a
single step without risk of human error.
|
1.0
|
[FEATURE][processing] New algorithm "Overlap analysis" - Original commit: https://github.com/qgis/QGIS/commit/2ec429cb890242d29e04908fa0eabe74bbb3a899 by nyalldawson
This algorithm calculates the area and percentage cover
by which features from an input layer are overlapped by
features from a selection of overlay layers.
New attributes are added to the output layer reporting
the total area of overlap and percentage of the input
feature overlapped by each of the selected overlay layers.
This is quite a common GIS task request, yet is full
of traps for inexperienced users, and the amount of
manual data work usually done by users to calculate
these figures can often lead to mistakes and inaccurate
results. It's nice to have a robust, fast, inbuilt
algorithm which allows this task to be done in a
single step without risk of human error.
|
process
|
new algorithm overlap analysis original commit by nyalldawson this algorithm calculates the area and percentage cover by which features from an input layer are overlapped by features from a selection of overlay layers new attributes are added to the output layer reporting the total area of overlap and percentage of the input feature overlapped by each of the selected overlay layers this is quite a common gis task request yet is full of traps for inexperienced users and the amount of manual data work usually done by users to calculate these figures can often lead to mistakes and inaccurate results it s nice to have a robust fast inbuilt algorithm which allows this task to be done in a single step without risk of human error
| 1
|
7,094
| 10,241,033,363
|
IssuesEvent
|
2019-08-19 22:36:49
|
toggl/mobileapp
|
https://api.github.com/repos/toggl/mobileapp
|
closed
|
Update docs/localization.md to reflect usage of Resources.resx & generalized text bindings
|
process
|
The part from Android works describe on #4142 works only for Xamarin.Forms, we've got to update the docs, explaining that we are supposed to set text similarly to what is done on iOS.
|
1.0
|
Update docs/localization.md to reflect usage of Resources.resx & generalized text bindings - The part from Android works describe on #4142 works only for Xamarin.Forms, we've got to update the docs, explaining that we are supposed to set text similarly to what is done on iOS.
|
process
|
update docs localization md to reflect usage of resources resx generalized text bindings the part from android works describe on works only for xamarin forms we ve got to update the docs explaining that we are supposed to set text similarly to what is done on ios
| 1
|
604
| 3,074,788,328
|
IssuesEvent
|
2015-08-20 09:36:40
|
sysown/proxysql-0.2
|
https://api.github.com/repos/sysown/proxysql-0.2
|
opened
|
Reduce memory footprint
|
CONNECTION POOL enhancement MYSQL QUERY PROCESSOR STATISTICS
|
## Why
While running benchmark, calls to malloc() and free() are often the bottleneck
## What
* [ ] reduce the amount of memory that is copied between client and server MySQL_Data_Stream
* [ ] reduce the amount of memory that is copied between server MySQL_Data_Stream and MySQL_Connection
* [ ] reduce the amount of memory that is copied to Query_Processor for query statistics
|
1.0
|
Reduce memory footprint - ## Why
While running benchmark, calls to malloc() and free() are often the bottleneck
## What
* [ ] reduce the amount of memory that is copied between client and server MySQL_Data_Stream
* [ ] reduce the amount of memory that is copied between server MySQL_Data_Stream and MySQL_Connection
* [ ] reduce the amount of memory that is copied to Query_Processor for query statistics
|
process
|
reduce memory footprint why while running benchmark calls to malloc and free are often the bottleneck what reduce the amount of memory that is copied between client and server mysql data stream reduce the amount of memory that is copied between server mysql data stream and mysql connection reduce the amount of memory that is copied to query processor for query statistics
| 1
|
70,540
| 7,190,583,905
|
IssuesEvent
|
2018-02-02 17:46:15
|
Esri/solutions-geoprocessing-toolbox
|
https://api.github.com/repos/Esri/solutions-geoprocessing-toolbox
|
closed
|
Build a script that allows users to pull out their templates tools
|
A-feature E-invalid T - unit test
|
**Problem:**
Right now the engineer must manually copy all of the parts of a GP tool for deployment in a template. This means the engineer must know which files belong with the toolbox. Sometimes the required files are not obvious (imported PY modules, output LYRX files, tooldata, etc.). This means the engineer may not be copying the right files, or copying the wrong ones. This means the toolbox will either not function, or will be bloated with extra files. What we need is something that will allow the engineer to deploy just one toolbox and all of it's required files from the entire repository.
**How:**
Each toolbox will have an associated INI file that acts as an inventory file, listing all of the parts that go with the toolbox.
This would include TBX/PYT toolbox, PY scripts, LYR/LYRX layer files, tool data, etc. From one centralized "deployment" script the inventory file would be read, the pieces found, and placed into a target folder. The inventory/deployment would exclude scratch data, intermediate data, test files, etc.
**What:**
The deployment would contain the following parts
- A main deployment script (Python) in .\solutions-geoprocessing-toolbox\utils\deploy
- A <template>Inventory.ini file for each toolbox, or template .\solutions-geoprocessing-toolbox\utils\deploy\inventories
- A unit test for the deployment script, that would 'deploy' several example templates to an intermediate folder, then check the contents against the inventory files.
┆Issue is synchronized with this [Asana task](https://app.asana.com/0/129342386639048/152131732600120)
|
1.0
|
Build a script that allows users to pull out their templates tools - **Problem:**
Right now the engineer must manually copy all of the parts of a GP tool for deployment in a template. This means the engineer must know which files belong with the toolbox. Sometimes the required files are not obvious (imported PY modules, output LYRX files, tooldata, etc.). This means the engineer may not be copying the right files, or copying the wrong ones. This means the toolbox will either not function, or will be bloated with extra files. What we need is something that will allow the engineer to deploy just one toolbox and all of it's required files from the entire repository.
**How:**
Each toolbox will have an associated INI file that acts as an inventory file, listing all of the parts that go with the toolbox.
This would include TBX/PYT toolbox, PY scripts, LYR/LYRX layer files, tool data, etc. From one centralized "deployment" script the inventory file would be read, the pieces found, and placed into a target folder. The inventory/deployment would exclude scratch data, intermediate data, test files, etc.
**What:**
The deployment would contain the following parts
- A main deployment script (Python) in .\solutions-geoprocessing-toolbox\utils\deploy
- A <template>Inventory.ini file for each toolbox, or template .\solutions-geoprocessing-toolbox\utils\deploy\inventories
- A unit test for the deployment script, that would 'deploy' several example templates to an intermediate folder, then check the contents against the inventory files.
┆Issue is synchronized with this [Asana task](https://app.asana.com/0/129342386639048/152131732600120)
|
non_process
|
build a script that allows users to pull out their templates tools problem right now the engineer must manually copy all of the parts of a gp tool for deployment in a template this means the engineer must know which files belong with the toolbox sometimes the required files are not obvious imported py modules output lyrx files tooldata etc this means the engineer may not be copying the right files or copying the wrong ones this means the toolbox will either not function or will be bloated with extra files what we need is something that will allow the engineer to deploy just one toolbox and all of it s required files from the entire repository how each toolbox will have an associated ini file that acts as an inventory file listing all of the parts that go with the toolbox this would include tbx pyt toolbox py scripts lyr lyrx layer files tool data etc from one centralized deployment script the inventory file would be read the pieces found and placed into a target folder the inventory deployment would exclude scratch data intermediate data test files etc what the deployment would contain the following parts a main deployment script python in solutions geoprocessing toolbox utils deploy a inventory ini file for each toolbox or template solutions geoprocessing toolbox utils deploy inventories a unit test for the deployment script that would deploy several example templates to an intermediate folder then check the contents against the inventory files ┆issue is synchronized with this
| 0
|
538,879
| 15,780,278,276
|
IssuesEvent
|
2021-04-01 09:45:23
|
geosolutions-it/MapStore2
|
https://api.github.com/repos/geosolutions-it/MapStore2
|
opened
|
The first GFI result visualized in Identify panel is not the expected one
|
C125-2020-AUSTROCONTROL-Map2Imp Priority: High bug
|
## Description
<!-- Add here a few sentences describing the bug. -->
We agreed on the solution, that the first element that is received will be selected initially. This is the case so far. Nevertheless as soon as all elements are loaded, the first item in the drop down list is automatically selected, leading to a sudden change in the info panel, in case the first element in list wasn’t the first element to be loaded. We would expect that the first loaded feature stays selected, until the user changes it.
## How to reproduce
<!-- A list of steps to reproduce the bug -->

*Expected Result*
<!-- Describe here the expected result -->
We would expect that the first loaded feature stays selected, until the user changes it.
*Current Result*
<!-- Describe here the current behavior -->
The first item in the drop down list is automatically selected even if it is not related to the first GFI respose loaded.
- [x] Not browser related
<details><summary> <b>Browser info</b> </summary>
<!-- If browser related, please compile the following table -->
<!-- If your browser is not in the list please add a new row to the table with the version -->
(use this site: <a href="https://www.whatsmybrowser.org/">https://www.whatsmybrowser.org/</a> for non expert users)
| Browser Affected | Version |
|---|---|
|Internet Explorer| |
|Edge| |
|Chrome| |
|Firefox| |
|Safari| |
</details>
## Other useful information
<!-- error stack trace, screenshot, videos, or link to repository code are welcome -->
|
1.0
|
The first GFI result visualized in Identify panel is not the expected one - ## Description
<!-- Add here a few sentences describing the bug. -->
We agreed on the solution, that the first element that is received will be selected initially. This is the case so far. Nevertheless as soon as all elements are loaded, the first item in the drop down list is automatically selected, leading to a sudden change in the info panel, in case the first element in list wasn’t the first element to be loaded. We would expect that the first loaded feature stays selected, until the user changes it.
## How to reproduce
<!-- A list of steps to reproduce the bug -->

*Expected Result*
<!-- Describe here the expected result -->
We would expect that the first loaded feature stays selected, until the user changes it.
*Current Result*
<!-- Describe here the current behavior -->
The first item in the drop down list is automatically selected even if it is not related to the first GFI respose loaded.
- [x] Not browser related
<details><summary> <b>Browser info</b> </summary>
<!-- If browser related, please compile the following table -->
<!-- If your browser is not in the list please add a new row to the table with the version -->
(use this site: <a href="https://www.whatsmybrowser.org/">https://www.whatsmybrowser.org/</a> for non expert users)
| Browser Affected | Version |
|---|---|
|Internet Explorer| |
|Edge| |
|Chrome| |
|Firefox| |
|Safari| |
</details>
## Other useful information
<!-- error stack trace, screenshot, videos, or link to repository code are welcome -->
|
non_process
|
the first gfi result visualized in identify panel is not the expected one description we agreed on the solution that the first element that is received will be selected initially this is the case so far nevertheless as soon as all elements are loaded the first item in the drop down list is automatically selected leading to a sudden change in the info panel in case the first element in list wasn’t the first element to be loaded we would expect that the first loaded feature stays selected until the user changes it how to reproduce expected result we would expect that the first loaded feature stays selected until the user changes it current result the first item in the drop down list is automatically selected even if it is not related to the first gfi respose loaded not browser related browser info use this site a href for non expert users browser affected version internet explorer edge chrome firefox safari other useful information
| 0
|
13,671
| 16,359,979,430
|
IssuesEvent
|
2021-05-14 07:56:36
|
crash1115/5e-training
|
https://api.github.com/repos/crash1115/5e-training
|
closed
|
Compatibility with Dice Tooltip.
|
Compatibility Request
|
Having this mod activated together with Dice Tooltip will still work, but when you put your mouse on the activity name, or the dice icon, you'll get errors in the console, because it seems to think that it's a weapon or something.
With the error coming up by hovering over both the icon and the name, the console quickly gets littered with errors.
Here is the error I'm getting.
> dicetooltip.js:171 Uncaught TypeError: Cannot read property 'hasAttack' of null
> at checkItemTooltip (dicetooltip.js:171)
> at HTMLDivElement.mouseenter (dicetooltip.js:54)
> at HTMLDivElement.handle (jquery.min.js:2)
> at HTMLDivElement.dispatch (jquery.min.js:2)
> at HTMLDivElement.v.handle (jquery.min.js:2)
So it might be an idea to check that up. :)
Other than that, thanks for a nice, practical and very helpful module! :)
|
True
|
Compatibility with Dice Tooltip. - Having this mod activated together with Dice Tooltip will still work, but when you put your mouse on the activity name, or the dice icon, you'll get errors in the console, because it seems to think that it's a weapon or something.
With the error coming up by hovering over both the icon and the name, the console quickly gets littered with errors.
Here is the error I'm getting.
> dicetooltip.js:171 Uncaught TypeError: Cannot read property 'hasAttack' of null
> at checkItemTooltip (dicetooltip.js:171)
> at HTMLDivElement.mouseenter (dicetooltip.js:54)
> at HTMLDivElement.handle (jquery.min.js:2)
> at HTMLDivElement.dispatch (jquery.min.js:2)
> at HTMLDivElement.v.handle (jquery.min.js:2)
So it might be an idea to check that up. :)
Other than that, thanks for a nice, practical and very helpful module! :)
|
non_process
|
compatibility with dice tooltip having this mod activated together with dice tooltip will still work but when you put your mouse on the activity name or the dice icon you ll get errors in the console because it seems to think that it s a weapon or something with the error coming up by hovering over both the icon and the name the console quickly gets littered with errors here is the error i m getting dicetooltip js uncaught typeerror cannot read property hasattack of null at checkitemtooltip dicetooltip js at htmldivelement mouseenter dicetooltip js at htmldivelement handle jquery min js at htmldivelement dispatch jquery min js at htmldivelement v handle jquery min js so it might be an idea to check that up other than that thanks for a nice practical and very helpful module
| 0
|
64,470
| 14,666,132,119
|
IssuesEvent
|
2020-12-29 15:40:16
|
jgeraigery/experian-java
|
https://api.github.com/repos/jgeraigery/experian-java
|
opened
|
CVE-2019-12814 (Medium) detected in jackson-databind-2.9.2.jar
|
security vulnerability
|
## CVE-2019-12814 - Medium Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>jackson-databind-2.9.2.jar</b></p></summary>
<p>General data-binding functionality for Jackson: works on core streaming API</p>
<p>Library home page: <a href="http://github.com/FasterXML/jackson">http://github.com/FasterXML/jackson</a></p>
<p>Path to dependency file: experian-java/MavenWorkspace/bis-services-lib/bis-services-base/pom.xml</p>
<p>Path to vulnerable library: canner/.m2/repository/com/fasterxml/jackson/core/jackson-databind/2.9.2/jackson-databind-2.9.2.jar</p>
<p>
Dependency Hierarchy:
- :x: **jackson-databind-2.9.2.jar** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/jgeraigery/experian-java/commit/d89dcb23dbf81afc230b102b366ac005def1fe39">d89dcb23dbf81afc230b102b366ac005def1fe39</a></p>
<p>Found in base branch: <b>master</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
A Polymorphic Typing issue was discovered in FasterXML jackson-databind 2.x through 2.9.9. When Default Typing is enabled (either globally or for a specific property) for an externally exposed JSON endpoint and the service has JDOM 1.x or 2.x jar in the classpath, an attacker can send a specifically crafted JSON message that allows them to read arbitrary local files on the server.
<p>Publish Date: 2019-06-19
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2019-12814>CVE-2019-12814</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>5.9</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: High
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: High
- Integrity Impact: None
- Availability Impact: None
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://github.com/FasterXML/jackson-databind/issues/2341">https://github.com/FasterXML/jackson-databind/issues/2341</a></p>
<p>Release Date: 2019-06-19</p>
<p>Fix Resolution: 2.7.9.6, 2.8.11.4, 2.9.9.1, 2.10.0</p>
</p>
</details>
<p></p>
***
:rescue_worker_helmet: Automatic Remediation is available for this issue
<!-- <REMEDIATE>{"isOpenPROnVulnerability":true,"isPackageBased":true,"isDefaultBranch":true,"packages":[{"packageType":"Java","groupId":"com.fasterxml.jackson.core","packageName":"jackson-databind","packageVersion":"2.9.2","isTransitiveDependency":false,"dependencyTree":"com.fasterxml.jackson.core:jackson-databind:2.9.2","isMinimumFixVersionAvailable":true,"minimumFixVersion":"2.7.9.6, 2.8.11.4, 2.9.9.1, 2.10.0"}],"vulnerabilityIdentifier":"CVE-2019-12814","vulnerabilityDetails":"A Polymorphic Typing issue was discovered in FasterXML jackson-databind 2.x through 2.9.9. When Default Typing is enabled (either globally or for a specific property) for an externally exposed JSON endpoint and the service has JDOM 1.x or 2.x jar in the classpath, an attacker can send a specifically crafted JSON message that allows them to read arbitrary local files on the server.","vulnerabilityUrl":"https://vuln.whitesourcesoftware.com/vulnerability/CVE-2019-12814","cvss3Severity":"medium","cvss3Score":"5.9","cvss3Metrics":{"A":"None","AC":"High","PR":"None","S":"Unchanged","C":"High","UI":"None","AV":"Network","I":"None"},"extraData":{}}</REMEDIATE> -->
|
True
|
CVE-2019-12814 (Medium) detected in jackson-databind-2.9.2.jar - ## CVE-2019-12814 - Medium Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>jackson-databind-2.9.2.jar</b></p></summary>
<p>General data-binding functionality for Jackson: works on core streaming API</p>
<p>Library home page: <a href="http://github.com/FasterXML/jackson">http://github.com/FasterXML/jackson</a></p>
<p>Path to dependency file: experian-java/MavenWorkspace/bis-services-lib/bis-services-base/pom.xml</p>
<p>Path to vulnerable library: canner/.m2/repository/com/fasterxml/jackson/core/jackson-databind/2.9.2/jackson-databind-2.9.2.jar</p>
<p>
Dependency Hierarchy:
- :x: **jackson-databind-2.9.2.jar** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/jgeraigery/experian-java/commit/d89dcb23dbf81afc230b102b366ac005def1fe39">d89dcb23dbf81afc230b102b366ac005def1fe39</a></p>
<p>Found in base branch: <b>master</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
A Polymorphic Typing issue was discovered in FasterXML jackson-databind 2.x through 2.9.9. When Default Typing is enabled (either globally or for a specific property) for an externally exposed JSON endpoint and the service has JDOM 1.x or 2.x jar in the classpath, an attacker can send a specifically crafted JSON message that allows them to read arbitrary local files on the server.
<p>Publish Date: 2019-06-19
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2019-12814>CVE-2019-12814</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>5.9</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: High
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: High
- Integrity Impact: None
- Availability Impact: None
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://github.com/FasterXML/jackson-databind/issues/2341">https://github.com/FasterXML/jackson-databind/issues/2341</a></p>
<p>Release Date: 2019-06-19</p>
<p>Fix Resolution: 2.7.9.6, 2.8.11.4, 2.9.9.1, 2.10.0</p>
</p>
</details>
<p></p>
***
:rescue_worker_helmet: Automatic Remediation is available for this issue
<!-- <REMEDIATE>{"isOpenPROnVulnerability":true,"isPackageBased":true,"isDefaultBranch":true,"packages":[{"packageType":"Java","groupId":"com.fasterxml.jackson.core","packageName":"jackson-databind","packageVersion":"2.9.2","isTransitiveDependency":false,"dependencyTree":"com.fasterxml.jackson.core:jackson-databind:2.9.2","isMinimumFixVersionAvailable":true,"minimumFixVersion":"2.7.9.6, 2.8.11.4, 2.9.9.1, 2.10.0"}],"vulnerabilityIdentifier":"CVE-2019-12814","vulnerabilityDetails":"A Polymorphic Typing issue was discovered in FasterXML jackson-databind 2.x through 2.9.9. When Default Typing is enabled (either globally or for a specific property) for an externally exposed JSON endpoint and the service has JDOM 1.x or 2.x jar in the classpath, an attacker can send a specifically crafted JSON message that allows them to read arbitrary local files on the server.","vulnerabilityUrl":"https://vuln.whitesourcesoftware.com/vulnerability/CVE-2019-12814","cvss3Severity":"medium","cvss3Score":"5.9","cvss3Metrics":{"A":"None","AC":"High","PR":"None","S":"Unchanged","C":"High","UI":"None","AV":"Network","I":"None"},"extraData":{}}</REMEDIATE> -->
|
non_process
|
cve medium detected in jackson databind jar cve medium severity vulnerability vulnerable library jackson databind jar general data binding functionality for jackson works on core streaming api library home page a href path to dependency file experian java mavenworkspace bis services lib bis services base pom xml path to vulnerable library canner repository com fasterxml jackson core jackson databind jackson databind jar dependency hierarchy x jackson databind jar vulnerable library found in head commit a href found in base branch master vulnerability details a polymorphic typing issue was discovered in fasterxml jackson databind x through when default typing is enabled either globally or for a specific property for an externally exposed json endpoint and the service has jdom x or x jar in the classpath an attacker can send a specifically crafted json message that allows them to read arbitrary local files on the server publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity high privileges required none user interaction none scope unchanged impact metrics confidentiality impact high integrity impact none availability impact none for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution rescue worker helmet automatic remediation is available for this issue isopenpronvulnerability true ispackagebased true isdefaultbranch true packages vulnerabilityidentifier cve vulnerabilitydetails a polymorphic typing issue was discovered in fasterxml jackson databind x through when default typing is enabled either globally or for a specific property for an externally exposed json endpoint and the service has jdom x or x jar in the classpath an attacker can send a specifically crafted json message that allows them to read arbitrary local files on the server vulnerabilityurl
| 0
|
6,969
| 10,119,975,973
|
IssuesEvent
|
2019-07-31 12:47:45
|
googleapis/google-cloud-python
|
https://api.github.com/repos/googleapis/google-cloud-python
|
closed
|
Synthesis failed for videointelligence
|
api: videointelligence autosynth failure type: process
|
Hello! Autosynth couldn't regenerate videointelligence. :broken_heart:
Here's the output from running `synth.py`:
```
Cloning into 'working_repo'...
Switched to branch 'autosynth-videointelligence'
Running synthtool
['/tmpfs/src/git/autosynth/env/bin/python3', '-m', 'synthtool', 'synth.py', '--']
synthtool > Executing /tmpfs/src/git/autosynth/working_repo/videointelligence/synth.py.
synthtool > Ensuring dependencies.
synthtool > Pulling artman image.
latest: Pulling from googleapis/artman
Digest: sha256:9aed6bbde54e26d2fcde7aa86d9f64c0278f741e58808c46573e488cbf6098f0
Status: Image is up to date for googleapis/artman:latest
synthtool > Cloning googleapis.
synthtool > Running generator for google/cloud/videointelligence/artman_videointelligence_v1beta1.yaml.
synthtool > Generated code into /home/kbuilder/.cache/synthtool/googleapis/artman-genfiles/python/video-intelligence-v1beta1.
synthtool > Copy: /home/kbuilder/.cache/synthtool/googleapis/google/cloud/videointelligence/v1beta1/video_intelligence.proto to /home/kbuilder/.cache/synthtool/googleapis/artman-genfiles/python/video-intelligence-v1beta1/google/cloud/videointelligence_v1beta1/proto/video_intelligence.proto
synthtool > Placed proto files into /home/kbuilder/.cache/synthtool/googleapis/artman-genfiles/python/video-intelligence-v1beta1/google/cloud/videointelligence_v1beta1/proto.
synthtool > Replaced 'google-cloud-video-intelligence' in google/cloud/videointelligence_v1beta1/gapic/video_intelligence_service_client.py.
synthtool > Running generator for google/cloud/videointelligence/artman_videointelligence_v1beta2.yaml.
synthtool > Failed executing docker run --name artman-docker --rm -i -e HOST_USER_ID=1000 -e HOST_GROUP_ID=1000 -e RUNNING_IN_ARTMAN_DOCKER=True -v /home/kbuilder/.cache/synthtool/googleapis:/home/kbuilder/.cache/synthtool/googleapis -v /home/kbuilder/.cache/synthtool/googleapis/artman-genfiles:/home/kbuilder/.cache/synthtool/googleapis/artman-genfiles -w /home/kbuilder/.cache/synthtool/googleapis googleapis/artman:latest /bin/bash -c artman --local --config google/cloud/videointelligence/artman_videointelligence_v1beta2.yaml generate python_gapic:
artman> Final args:
artman> api_name: video-intelligence
artman> api_version: v1beta2
artman> artifact_type: GAPIC
artman> aspect: ALL
artman> gapic_code_dir: /home/kbuilder/.cache/synthtool/googleapis/artman-genfiles/python/video-intelligence-v1beta2
artman> gapic_yaml: /home/kbuilder/.cache/synthtool/googleapis/google/cloud/videointelligence/v1beta2/videointelligence_gapic.yaml
artman> generator_args: null
artman> import_proto_path:
artman> - /home/kbuilder/.cache/synthtool/googleapis
artman> language: python
artman> organization_name: google-cloud
artman> output_dir: /home/kbuilder/.cache/synthtool/googleapis/artman-genfiles
artman> proto_deps:
artman> - name: google-common-protos
artman> proto_package: ''
artman> root_dir: /home/kbuilder/.cache/synthtool/googleapis
artman> service_yaml: /home/kbuilder/.cache/synthtool/googleapis/google/cloud/videointelligence/videointelligence_v1beta2.yaml
artman> src_proto_path:
artman> - /home/kbuilder/.cache/synthtool/googleapis/google/cloud/videointelligence/v1beta2
artman> toolkit_path: /toolkit
artman>
artman> Creating GapicClientPipeline.
artman.output >
WARNING: toplevel: (lint) control-presence: Service videointelligence.googleapis.com does not have control environment configured.
ERROR: /home/kbuilder/.cache/synthtool/googleapis/google/cloud/videointelligence/videointelligence_v1beta2.yaml:23: Cannot resolve additional TYPE_MESSAGE type 'google.cloud.videointelligence.v1p3beta1.AnnotateVideoProgress' specified in the config.
WARNING: toplevel: (lint) control-presence: Service videointelligence.googleapis.com does not have control environment configured.
ERROR: /home/kbuilder/.cache/synthtool/googleapis/google/cloud/videointelligence/videointelligence_v1beta2.yaml:23: Cannot resolve additional TYPE_MESSAGE type 'google.cloud.videointelligence.v1p3beta1.AnnotateVideoProgress' specified in the config.
artman> Traceback (most recent call last):
File "/artman/artman/cli/main.py", line 72, in main
engine.run()
File "/usr/local/lib/python3.5/dist-packages/taskflow/engines/action_engine/engine.py", line 159, in run
for _state in self.run_iter():
File "/usr/local/lib/python3.5/dist-packages/taskflow/engines/action_engine/engine.py", line 223, in run_iter
failure.Failure.reraise_if_any(it)
File "/usr/local/lib/python3.5/dist-packages/taskflow/types/failure.py", line 292, in reraise_if_any
failures[0].reraise()
File "/usr/local/lib/python3.5/dist-packages/taskflow/types/failure.py", line 299, in reraise
six.reraise(*self._exc_info)
File "/usr/local/lib/python3.5/dist-packages/six.py", line 693, in reraise
raise value
File "/usr/local/lib/python3.5/dist-packages/taskflow/engines/action_engine/executor.py", line 82, in _execute_task
result = task.execute(**arguments)
File "/artman/artman/tasks/gapic_tasks.py", line 139, in execute
task_utils.gapic_gen_task(toolkit_path, [gapic_artifact] + args))
File "/artman/artman/tasks/task_base.py", line 64, in exec_command
raise e
File "/artman/artman/tasks/task_base.py", line 56, in exec_command
output = subprocess.check_output(args, stderr=subprocess.STDOUT)
File "/usr/lib/python3.5/subprocess.py", line 626, in check_output
**kwargs).stdout
File "/usr/lib/python3.5/subprocess.py", line 708, in run
output=stdout, stderr=stderr)
subprocess.CalledProcessError: Command '['java', '-cp', '/toolkit/build/libs/gapic-generator-latest-fatjar.jar', 'com.google.api.codegen.GeneratorMain', 'LEGACY_GAPIC_AND_PACKAGE', '--descriptor_set=/home/kbuilder/.cache/synthtool/googleapis/artman-genfiles/google-cloud-video-intelligence-v1beta2_updated_py_docs.desc', '--package_yaml2=/home/kbuilder/.cache/synthtool/googleapis/artman-genfiles/python_google-cloud-video-intelligence-v1beta2_package2.yaml', '--output=/home/kbuilder/.cache/synthtool/googleapis/artman-genfiles/python/video-intelligence-v1beta2', '--language=python', '--service_yaml=/home/kbuilder/.cache/synthtool/googleapis/google/cloud/videointelligence/videointelligence_v1beta2.yaml', '--gapic_yaml=/home/kbuilder/.cache/synthtool/googleapis/google/cloud/videointelligence/v1beta2/videointelligence_gapic.yaml']' returned non-zero exit status 1
Traceback (most recent call last):
File "/home/kbuilder/.pyenv/versions/3.6.1/lib/python3.6/runpy.py", line 193, in _run_module_as_main
"__main__", mod_spec)
File "/home/kbuilder/.pyenv/versions/3.6.1/lib/python3.6/runpy.py", line 85, in _run_code
exec(code, run_globals)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/synthtool/__main__.py", line 87, in <module>
main()
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/click/core.py", line 764, in __call__
return self.main(*args, **kwargs)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/click/core.py", line 717, in main
rv = self.invoke(ctx)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/click/core.py", line 956, in invoke
return ctx.invoke(self.callback, **ctx.params)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/click/core.py", line 555, in invoke
return callback(*args, **kwargs)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/synthtool/__main__.py", line 79, in main
spec.loader.exec_module(synth_module) # type: ignore
File "<frozen importlib._bootstrap_external>", line 678, in exec_module
File "<frozen importlib._bootstrap>", line 205, in _call_with_frames_removed
File "/tmpfs/src/git/autosynth/working_repo/videointelligence/synth.py", line 34, in <module>
include_protos=True,
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/synthtool/gcp/gapic_generator.py", line 44, in py_library
return self._generate_code(service, version, "python", **kwargs)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/synthtool/gcp/gapic_generator.py", line 125, in _generate_code
generator_args=generator_args,
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/synthtool/gcp/artman.py", line 141, in run
shell.run(cmd, cwd=root_dir)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/synthtool/shell.py", line 39, in run
raise exc
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/synthtool/shell.py", line 33, in run
encoding="utf-8",
File "/home/kbuilder/.pyenv/versions/3.6.1/lib/python3.6/subprocess.py", line 418, in run
output=stdout, stderr=stderr)
subprocess.CalledProcessError: Command '['docker', 'run', '--name', 'artman-docker', '--rm', '-i', '-e', 'HOST_USER_ID=1000', '-e', 'HOST_GROUP_ID=1000', '-e', 'RUNNING_IN_ARTMAN_DOCKER=True', '-v', '/home/kbuilder/.cache/synthtool/googleapis:/home/kbuilder/.cache/synthtool/googleapis', '-v', '/home/kbuilder/.cache/synthtool/googleapis/artman-genfiles:/home/kbuilder/.cache/synthtool/googleapis/artman-genfiles', '-w', PosixPath('/home/kbuilder/.cache/synthtool/googleapis'), 'googleapis/artman:latest', '/bin/bash', '-c', 'artman --local --config google/cloud/videointelligence/artman_videointelligence_v1beta2.yaml generate python_gapic']' returned non-zero exit status 32.
synthtool > Cleaned up 1 temporary directories.
synthtool > Wrote metadata to synth.metadata.
Synthesis failed
```
Google internal developers can see the full log [here](https://sponge/176c03a2-cc54-43e2-8a4f-d7ab06a38862).
|
1.0
|
Synthesis failed for videointelligence - Hello! Autosynth couldn't regenerate videointelligence. :broken_heart:
Here's the output from running `synth.py`:
```
Cloning into 'working_repo'...
Switched to branch 'autosynth-videointelligence'
Running synthtool
['/tmpfs/src/git/autosynth/env/bin/python3', '-m', 'synthtool', 'synth.py', '--']
synthtool > Executing /tmpfs/src/git/autosynth/working_repo/videointelligence/synth.py.
synthtool > Ensuring dependencies.
synthtool > Pulling artman image.
latest: Pulling from googleapis/artman
Digest: sha256:9aed6bbde54e26d2fcde7aa86d9f64c0278f741e58808c46573e488cbf6098f0
Status: Image is up to date for googleapis/artman:latest
synthtool > Cloning googleapis.
synthtool > Running generator for google/cloud/videointelligence/artman_videointelligence_v1beta1.yaml.
synthtool > Generated code into /home/kbuilder/.cache/synthtool/googleapis/artman-genfiles/python/video-intelligence-v1beta1.
synthtool > Copy: /home/kbuilder/.cache/synthtool/googleapis/google/cloud/videointelligence/v1beta1/video_intelligence.proto to /home/kbuilder/.cache/synthtool/googleapis/artman-genfiles/python/video-intelligence-v1beta1/google/cloud/videointelligence_v1beta1/proto/video_intelligence.proto
synthtool > Placed proto files into /home/kbuilder/.cache/synthtool/googleapis/artman-genfiles/python/video-intelligence-v1beta1/google/cloud/videointelligence_v1beta1/proto.
synthtool > Replaced 'google-cloud-video-intelligence' in google/cloud/videointelligence_v1beta1/gapic/video_intelligence_service_client.py.
synthtool > Running generator for google/cloud/videointelligence/artman_videointelligence_v1beta2.yaml.
synthtool > Failed executing docker run --name artman-docker --rm -i -e HOST_USER_ID=1000 -e HOST_GROUP_ID=1000 -e RUNNING_IN_ARTMAN_DOCKER=True -v /home/kbuilder/.cache/synthtool/googleapis:/home/kbuilder/.cache/synthtool/googleapis -v /home/kbuilder/.cache/synthtool/googleapis/artman-genfiles:/home/kbuilder/.cache/synthtool/googleapis/artman-genfiles -w /home/kbuilder/.cache/synthtool/googleapis googleapis/artman:latest /bin/bash -c artman --local --config google/cloud/videointelligence/artman_videointelligence_v1beta2.yaml generate python_gapic:
artman> Final args:
artman> api_name: video-intelligence
artman> api_version: v1beta2
artman> artifact_type: GAPIC
artman> aspect: ALL
artman> gapic_code_dir: /home/kbuilder/.cache/synthtool/googleapis/artman-genfiles/python/video-intelligence-v1beta2
artman> gapic_yaml: /home/kbuilder/.cache/synthtool/googleapis/google/cloud/videointelligence/v1beta2/videointelligence_gapic.yaml
artman> generator_args: null
artman> import_proto_path:
artman> - /home/kbuilder/.cache/synthtool/googleapis
artman> language: python
artman> organization_name: google-cloud
artman> output_dir: /home/kbuilder/.cache/synthtool/googleapis/artman-genfiles
artman> proto_deps:
artman> - name: google-common-protos
artman> proto_package: ''
artman> root_dir: /home/kbuilder/.cache/synthtool/googleapis
artman> service_yaml: /home/kbuilder/.cache/synthtool/googleapis/google/cloud/videointelligence/videointelligence_v1beta2.yaml
artman> src_proto_path:
artman> - /home/kbuilder/.cache/synthtool/googleapis/google/cloud/videointelligence/v1beta2
artman> toolkit_path: /toolkit
artman>
artman> Creating GapicClientPipeline.
artman.output >
WARNING: toplevel: (lint) control-presence: Service videointelligence.googleapis.com does not have control environment configured.
ERROR: /home/kbuilder/.cache/synthtool/googleapis/google/cloud/videointelligence/videointelligence_v1beta2.yaml:23: Cannot resolve additional TYPE_MESSAGE type 'google.cloud.videointelligence.v1p3beta1.AnnotateVideoProgress' specified in the config.
WARNING: toplevel: (lint) control-presence: Service videointelligence.googleapis.com does not have control environment configured.
ERROR: /home/kbuilder/.cache/synthtool/googleapis/google/cloud/videointelligence/videointelligence_v1beta2.yaml:23: Cannot resolve additional TYPE_MESSAGE type 'google.cloud.videointelligence.v1p3beta1.AnnotateVideoProgress' specified in the config.
artman> Traceback (most recent call last):
File "/artman/artman/cli/main.py", line 72, in main
engine.run()
File "/usr/local/lib/python3.5/dist-packages/taskflow/engines/action_engine/engine.py", line 159, in run
for _state in self.run_iter():
File "/usr/local/lib/python3.5/dist-packages/taskflow/engines/action_engine/engine.py", line 223, in run_iter
failure.Failure.reraise_if_any(it)
File "/usr/local/lib/python3.5/dist-packages/taskflow/types/failure.py", line 292, in reraise_if_any
failures[0].reraise()
File "/usr/local/lib/python3.5/dist-packages/taskflow/types/failure.py", line 299, in reraise
six.reraise(*self._exc_info)
File "/usr/local/lib/python3.5/dist-packages/six.py", line 693, in reraise
raise value
File "/usr/local/lib/python3.5/dist-packages/taskflow/engines/action_engine/executor.py", line 82, in _execute_task
result = task.execute(**arguments)
File "/artman/artman/tasks/gapic_tasks.py", line 139, in execute
task_utils.gapic_gen_task(toolkit_path, [gapic_artifact] + args))
File "/artman/artman/tasks/task_base.py", line 64, in exec_command
raise e
File "/artman/artman/tasks/task_base.py", line 56, in exec_command
output = subprocess.check_output(args, stderr=subprocess.STDOUT)
File "/usr/lib/python3.5/subprocess.py", line 626, in check_output
**kwargs).stdout
File "/usr/lib/python3.5/subprocess.py", line 708, in run
output=stdout, stderr=stderr)
subprocess.CalledProcessError: Command '['java', '-cp', '/toolkit/build/libs/gapic-generator-latest-fatjar.jar', 'com.google.api.codegen.GeneratorMain', 'LEGACY_GAPIC_AND_PACKAGE', '--descriptor_set=/home/kbuilder/.cache/synthtool/googleapis/artman-genfiles/google-cloud-video-intelligence-v1beta2_updated_py_docs.desc', '--package_yaml2=/home/kbuilder/.cache/synthtool/googleapis/artman-genfiles/python_google-cloud-video-intelligence-v1beta2_package2.yaml', '--output=/home/kbuilder/.cache/synthtool/googleapis/artman-genfiles/python/video-intelligence-v1beta2', '--language=python', '--service_yaml=/home/kbuilder/.cache/synthtool/googleapis/google/cloud/videointelligence/videointelligence_v1beta2.yaml', '--gapic_yaml=/home/kbuilder/.cache/synthtool/googleapis/google/cloud/videointelligence/v1beta2/videointelligence_gapic.yaml']' returned non-zero exit status 1
Traceback (most recent call last):
File "/home/kbuilder/.pyenv/versions/3.6.1/lib/python3.6/runpy.py", line 193, in _run_module_as_main
"__main__", mod_spec)
File "/home/kbuilder/.pyenv/versions/3.6.1/lib/python3.6/runpy.py", line 85, in _run_code
exec(code, run_globals)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/synthtool/__main__.py", line 87, in <module>
main()
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/click/core.py", line 764, in __call__
return self.main(*args, **kwargs)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/click/core.py", line 717, in main
rv = self.invoke(ctx)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/click/core.py", line 956, in invoke
return ctx.invoke(self.callback, **ctx.params)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/click/core.py", line 555, in invoke
return callback(*args, **kwargs)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/synthtool/__main__.py", line 79, in main
spec.loader.exec_module(synth_module) # type: ignore
File "<frozen importlib._bootstrap_external>", line 678, in exec_module
File "<frozen importlib._bootstrap>", line 205, in _call_with_frames_removed
File "/tmpfs/src/git/autosynth/working_repo/videointelligence/synth.py", line 34, in <module>
include_protos=True,
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/synthtool/gcp/gapic_generator.py", line 44, in py_library
return self._generate_code(service, version, "python", **kwargs)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/synthtool/gcp/gapic_generator.py", line 125, in _generate_code
generator_args=generator_args,
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/synthtool/gcp/artman.py", line 141, in run
shell.run(cmd, cwd=root_dir)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/synthtool/shell.py", line 39, in run
raise exc
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/synthtool/shell.py", line 33, in run
encoding="utf-8",
File "/home/kbuilder/.pyenv/versions/3.6.1/lib/python3.6/subprocess.py", line 418, in run
output=stdout, stderr=stderr)
subprocess.CalledProcessError: Command '['docker', 'run', '--name', 'artman-docker', '--rm', '-i', '-e', 'HOST_USER_ID=1000', '-e', 'HOST_GROUP_ID=1000', '-e', 'RUNNING_IN_ARTMAN_DOCKER=True', '-v', '/home/kbuilder/.cache/synthtool/googleapis:/home/kbuilder/.cache/synthtool/googleapis', '-v', '/home/kbuilder/.cache/synthtool/googleapis/artman-genfiles:/home/kbuilder/.cache/synthtool/googleapis/artman-genfiles', '-w', PosixPath('/home/kbuilder/.cache/synthtool/googleapis'), 'googleapis/artman:latest', '/bin/bash', '-c', 'artman --local --config google/cloud/videointelligence/artman_videointelligence_v1beta2.yaml generate python_gapic']' returned non-zero exit status 32.
synthtool > Cleaned up 1 temporary directories.
synthtool > Wrote metadata to synth.metadata.
Synthesis failed
```
Google internal developers can see the full log [here](https://sponge/176c03a2-cc54-43e2-8a4f-d7ab06a38862).
|
process
|
synthesis failed for videointelligence hello autosynth couldn t regenerate videointelligence broken heart here s the output from running synth py cloning into working repo switched to branch autosynth videointelligence running synthtool synthtool executing tmpfs src git autosynth working repo videointelligence synth py synthtool ensuring dependencies synthtool pulling artman image latest pulling from googleapis artman digest status image is up to date for googleapis artman latest synthtool cloning googleapis synthtool running generator for google cloud videointelligence artman videointelligence yaml synthtool generated code into home kbuilder cache synthtool googleapis artman genfiles python video intelligence synthtool copy home kbuilder cache synthtool googleapis google cloud videointelligence video intelligence proto to home kbuilder cache synthtool googleapis artman genfiles python video intelligence google cloud videointelligence proto video intelligence proto synthtool placed proto files into home kbuilder cache synthtool googleapis artman genfiles python video intelligence google cloud videointelligence proto synthtool replaced google cloud video intelligence in google cloud videointelligence gapic video intelligence service client py synthtool running generator for google cloud videointelligence artman videointelligence yaml synthtool failed executing docker run name artman docker rm i e host user id e host group id e running in artman docker true v home kbuilder cache synthtool googleapis home kbuilder cache synthtool googleapis v home kbuilder cache synthtool googleapis artman genfiles home kbuilder cache synthtool googleapis artman genfiles w home kbuilder cache synthtool googleapis googleapis artman latest bin bash c artman local config google cloud videointelligence artman videointelligence yaml generate python gapic artman final args artman api name video intelligence artman api version artman artifact type gapic artman aspect all artman gapic code dir home kbuilder cache synthtool googleapis artman genfiles python video intelligence artman gapic yaml home kbuilder cache synthtool googleapis google cloud videointelligence videointelligence gapic yaml artman generator args null artman import proto path artman home kbuilder cache synthtool googleapis artman language python artman organization name google cloud artman output dir home kbuilder cache synthtool googleapis artman genfiles artman proto deps artman name google common protos artman proto package artman root dir home kbuilder cache synthtool googleapis artman service yaml home kbuilder cache synthtool googleapis google cloud videointelligence videointelligence yaml artman src proto path artman home kbuilder cache synthtool googleapis google cloud videointelligence artman toolkit path toolkit artman artman creating gapicclientpipeline artman output warning toplevel lint control presence service videointelligence googleapis com does not have control environment configured error home kbuilder cache synthtool googleapis google cloud videointelligence videointelligence yaml cannot resolve additional type message type google cloud videointelligence annotatevideoprogress specified in the config warning toplevel lint control presence service videointelligence googleapis com does not have control environment configured error home kbuilder cache synthtool googleapis google cloud videointelligence videointelligence yaml cannot resolve additional type message type google cloud videointelligence annotatevideoprogress specified in the config artman traceback most recent call last file artman artman cli main py line in main engine run file usr local lib dist packages taskflow engines action engine engine py line in run for state in self run iter file usr local lib dist packages taskflow engines action engine engine py line in run iter failure failure reraise if any it file usr local lib dist packages taskflow types failure py line in reraise if any failures reraise file usr local lib dist packages taskflow types failure py line in reraise six reraise self exc info file usr local lib dist packages six py line in reraise raise value file usr local lib dist packages taskflow engines action engine executor py line in execute task result task execute arguments file artman artman tasks gapic tasks py line in execute task utils gapic gen task toolkit path args file artman artman tasks task base py line in exec command raise e file artman artman tasks task base py line in exec command output subprocess check output args stderr subprocess stdout file usr lib subprocess py line in check output kwargs stdout file usr lib subprocess py line in run output stdout stderr stderr subprocess calledprocesserror command returned non zero exit status traceback most recent call last file home kbuilder pyenv versions lib runpy py line in run module as main main mod spec file home kbuilder pyenv versions lib runpy py line in run code exec code run globals file tmpfs src git autosynth env lib site packages synthtool main py line in main file tmpfs src git autosynth env lib site packages click core py line in call return self main args kwargs file tmpfs src git autosynth env lib site packages click core py line in main rv self invoke ctx file tmpfs src git autosynth env lib site packages click core py line in invoke return ctx invoke self callback ctx params file tmpfs src git autosynth env lib site packages click core py line in invoke return callback args kwargs file tmpfs src git autosynth env lib site packages synthtool main py line in main spec loader exec module synth module type ignore file line in exec module file line in call with frames removed file tmpfs src git autosynth working repo videointelligence synth py line in include protos true file tmpfs src git autosynth env lib site packages synthtool gcp gapic generator py line in py library return self generate code service version python kwargs file tmpfs src git autosynth env lib site packages synthtool gcp gapic generator py line in generate code generator args generator args file tmpfs src git autosynth env lib site packages synthtool gcp artman py line in run shell run cmd cwd root dir file tmpfs src git autosynth env lib site packages synthtool shell py line in run raise exc file tmpfs src git autosynth env lib site packages synthtool shell py line in run encoding utf file home kbuilder pyenv versions lib subprocess py line in run output stdout stderr stderr subprocess calledprocesserror command returned non zero exit status synthtool cleaned up temporary directories synthtool wrote metadata to synth metadata synthesis failed google internal developers can see the full log
| 1
|
49,776
| 13,462,586,904
|
IssuesEvent
|
2020-09-09 16:16:45
|
LevyForchh/symphony-java-api
|
https://api.github.com/repos/LevyForchh/symphony-java-api
|
opened
|
CVE-2020-24616 (High) detected in jackson-databind-2.8.4.jar
|
security vulnerability
|
## CVE-2020-24616 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>jackson-databind-2.8.4.jar</b></p></summary>
<p>General data-binding functionality for Jackson: works on core streaming API</p>
<p>Library home page: <a href="http://github.com/FasterXML/jackson">http://github.com/FasterXML/jackson</a></p>
<p>Path to dependency file: /tmp/ws-scm/symphony-java-api/authenticator/pom.xml</p>
<p>Path to vulnerable library: /home/wss-scanner/.m2/repository/com/fasterxml/jackson/core/jackson-databind/2.8.4/jackson-databind-2.8.4.jar,/home/wss-scanner/.m2/repository/com/fasterxml/jackson/core/jackson-databind/2.8.4/jackson-databind-2.8.4.jar,/home/wss-scanner/.m2/repository/com/fasterxml/jackson/core/jackson-databind/2.8.4/jackson-databind-2.8.4.jar</p>
<p>
Dependency Hierarchy:
- jackson-datatype-jsr310-2.8.4.jar (Root Library)
- :x: **jackson-databind-2.8.4.jar** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/LevyForchh/symphony-java-api/commit/0ff7c2c88343428195277225d1875a1c606c1c4b">0ff7c2c88343428195277225d1875a1c606c1c4b</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
FasterXML jackson-databind 2.x before 2.9.10.6 mishandles the interaction between serialization gadgets and typing, related to br.com.anteros.dbcp.AnterosDBCPDataSource (aka Anteros-DBCP).
<p>Publish Date: 2020-08-25
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-24616>CVE-2020-24616</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>8.1</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: High
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: High
- Integrity Impact: High
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2020-24616">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2020-24616</a></p>
<p>Release Date: 2020-08-25</p>
<p>Fix Resolution: 2.9.10.6</p>
</p>
</details>
<p></p>
<!-- <REMEDIATE>{"isOpenPROnVulnerability":false,"isPackageBased":true,"isDefaultBranch":true,"packages":[{"packageType":"Java","groupId":"com.fasterxml.jackson.core","packageName":"jackson-databind","packageVersion":"2.8.4","isTransitiveDependency":true,"dependencyTree":"com.fasterxml.jackson.datatype:jackson-datatype-jsr310:2.8.4;com.fasterxml.jackson.core:jackson-databind:2.8.4","isMinimumFixVersionAvailable":true,"minimumFixVersion":"2.9.10.6"}],"vulnerabilityIdentifier":"CVE-2020-24616","vulnerabilityDetails":"FasterXML jackson-databind 2.x before 2.9.10.6 mishandles the interaction between serialization gadgets and typing, related to br.com.anteros.dbcp.AnterosDBCPDataSource (aka Anteros-DBCP).","vulnerabilityUrl":"https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-24616","cvss3Severity":"high","cvss3Score":"8.1","cvss3Metrics":{"A":"High","AC":"High","PR":"None","S":"Unchanged","C":"High","UI":"None","AV":"Network","I":"High"},"extraData":{}}</REMEDIATE> -->
|
True
|
CVE-2020-24616 (High) detected in jackson-databind-2.8.4.jar - ## CVE-2020-24616 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>jackson-databind-2.8.4.jar</b></p></summary>
<p>General data-binding functionality for Jackson: works on core streaming API</p>
<p>Library home page: <a href="http://github.com/FasterXML/jackson">http://github.com/FasterXML/jackson</a></p>
<p>Path to dependency file: /tmp/ws-scm/symphony-java-api/authenticator/pom.xml</p>
<p>Path to vulnerable library: /home/wss-scanner/.m2/repository/com/fasterxml/jackson/core/jackson-databind/2.8.4/jackson-databind-2.8.4.jar,/home/wss-scanner/.m2/repository/com/fasterxml/jackson/core/jackson-databind/2.8.4/jackson-databind-2.8.4.jar,/home/wss-scanner/.m2/repository/com/fasterxml/jackson/core/jackson-databind/2.8.4/jackson-databind-2.8.4.jar</p>
<p>
Dependency Hierarchy:
- jackson-datatype-jsr310-2.8.4.jar (Root Library)
- :x: **jackson-databind-2.8.4.jar** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/LevyForchh/symphony-java-api/commit/0ff7c2c88343428195277225d1875a1c606c1c4b">0ff7c2c88343428195277225d1875a1c606c1c4b</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
FasterXML jackson-databind 2.x before 2.9.10.6 mishandles the interaction between serialization gadgets and typing, related to br.com.anteros.dbcp.AnterosDBCPDataSource (aka Anteros-DBCP).
<p>Publish Date: 2020-08-25
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-24616>CVE-2020-24616</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>8.1</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: High
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: High
- Integrity Impact: High
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2020-24616">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2020-24616</a></p>
<p>Release Date: 2020-08-25</p>
<p>Fix Resolution: 2.9.10.6</p>
</p>
</details>
<p></p>
<!-- <REMEDIATE>{"isOpenPROnVulnerability":false,"isPackageBased":true,"isDefaultBranch":true,"packages":[{"packageType":"Java","groupId":"com.fasterxml.jackson.core","packageName":"jackson-databind","packageVersion":"2.8.4","isTransitiveDependency":true,"dependencyTree":"com.fasterxml.jackson.datatype:jackson-datatype-jsr310:2.8.4;com.fasterxml.jackson.core:jackson-databind:2.8.4","isMinimumFixVersionAvailable":true,"minimumFixVersion":"2.9.10.6"}],"vulnerabilityIdentifier":"CVE-2020-24616","vulnerabilityDetails":"FasterXML jackson-databind 2.x before 2.9.10.6 mishandles the interaction between serialization gadgets and typing, related to br.com.anteros.dbcp.AnterosDBCPDataSource (aka Anteros-DBCP).","vulnerabilityUrl":"https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-24616","cvss3Severity":"high","cvss3Score":"8.1","cvss3Metrics":{"A":"High","AC":"High","PR":"None","S":"Unchanged","C":"High","UI":"None","AV":"Network","I":"High"},"extraData":{}}</REMEDIATE> -->
|
non_process
|
cve high detected in jackson databind jar cve high severity vulnerability vulnerable library jackson databind jar general data binding functionality for jackson works on core streaming api library home page a href path to dependency file tmp ws scm symphony java api authenticator pom xml path to vulnerable library home wss scanner repository com fasterxml jackson core jackson databind jackson databind jar home wss scanner repository com fasterxml jackson core jackson databind jackson databind jar home wss scanner repository com fasterxml jackson core jackson databind jackson databind jar dependency hierarchy jackson datatype jar root library x jackson databind jar vulnerable library found in head commit a href vulnerability details fasterxml jackson databind x before mishandles the interaction between serialization gadgets and typing related to br com anteros dbcp anterosdbcpdatasource aka anteros dbcp publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity high privileges required none user interaction none scope unchanged impact metrics confidentiality impact high integrity impact high availability impact high for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution isopenpronvulnerability false ispackagebased true isdefaultbranch true packages vulnerabilityidentifier cve vulnerabilitydetails fasterxml jackson databind x before mishandles the interaction between serialization gadgets and typing related to br com anteros dbcp anterosdbcpdatasource aka anteros dbcp vulnerabilityurl
| 0
|
7,082
| 10,229,898,274
|
IssuesEvent
|
2019-08-17 16:37:40
|
rtcharity/eahub.org
|
https://api.github.com/repos/rtcharity/eahub.org
|
closed
|
Disable smooch
|
Feature Request High Priority Process
|
We currently don't have the capacity to ensure that messages are being answered in time. It would be much better if users had the option to send us an email (without making an appearance that it's going to be answered within minutes)
|
1.0
|
Disable smooch - We currently don't have the capacity to ensure that messages are being answered in time. It would be much better if users had the option to send us an email (without making an appearance that it's going to be answered within minutes)
|
process
|
disable smooch we currently don t have the capacity to ensure that messages are being answered in time it would be much better if users had the option to send us an email without making an appearance that it s going to be answered within minutes
| 1
|
13,141
| 15,558,757,749
|
IssuesEvent
|
2021-03-16 10:40:29
|
nodejs/citgm
|
https://api.github.com/repos/nodejs/citgm
|
closed
|
Add hapi support
|
bug child_process weirdness
|
Moving discussion from https://github.com/rvagg/iojs-smoke-tests/issues/1
I'm +1 on adding Hapi given its heavy test burden and enterprise popularity. Does it need anything more than an entry on the README?
|
1.0
|
Add hapi support - Moving discussion from https://github.com/rvagg/iojs-smoke-tests/issues/1
I'm +1 on adding Hapi given its heavy test burden and enterprise popularity. Does it need anything more than an entry on the README?
|
process
|
add hapi support moving discussion from i m on adding hapi given its heavy test burden and enterprise popularity does it need anything more than an entry on the readme
| 1
|
56,765
| 13,922,824,643
|
IssuesEvent
|
2020-10-21 13:44:15
|
pwa-builder/PWABuilder
|
https://api.github.com/repos/pwa-builder/PWABuilder
|
closed
|
[Visual Requirements - PWA Builder - Home Page]: Luminosity ratio is "1.64:1" which is less than required i.e. 4.5:1 for "Enter a url" placeholder text.
|
A11yCT A11yMAS A11yMediumImpact HCL- PWABuilder HCL-E+D In-PR :rocket: MAS1.4.3 Severity3
|
**User Experience:**
If a minimum luminosity contrast ratio of 4:5:1 is not provided between text and its background, it would be difficult for people with moderately low vision to read the text. Good luminance contrast accommodates diverse color deficiencies and provides contrast that is independent of color perception.
**Test Environment:**
OS: Windows 10 build 19608.1006
Browser: Edge - Anaheim - Version 85.0.545.0 (Official build) dev (64-bit)
URL: https://preview.pwabuilder.com/
Tool: Color Contrast Analyzer
**Repro Steps**
1. Open URL: https://www.preview.pwabuilder.com/ in Edge Anaheim dev browser.
2. Pwabuilder page will open.
3. Navigate to the "Enter a url" edit box.
4. Check the color contrast ratio of the above placeholder using "Color contrast analyzer" tool.
5. Observe the issue.
**Actual Result:**
Luminosity ratio is "1.64:1" which is less than required i.e. 4.5:1 for "Enter a url" placeholder text.
**Expected Result:**
Luminosity ratio should be minimum "4.5:"1 for "Enter a url" placeholder text.
**Note:**
Issue is also repro for the grey text "**PWA Builder was founded by Microsoft as a community guided, open source project to help move PWA adoption forward. Our Privacy Statement**" present at the bottom of the screen. Refer attachment "MAS1.4.3_Homepage_grey text present at the bottom_cca less than reqd".
This above issue is also repro for this Url also:
1.https://preview.pwabuilder.com/?url
2.https://preview.pwabuilder.com/?url=https%3A%2F%2Fwww.msn.com%2Fen-in%2F
**MAS Reference:**
https://microsoft.sharepoint.com/:w:/r/teams/msenable/_layouts/15/WopiFrame.aspx?sourcedoc={a73546c7-765f-489c-b18f-afb659fe99e6}


|
1.0
|
[Visual Requirements - PWA Builder - Home Page]: Luminosity ratio is "1.64:1" which is less than required i.e. 4.5:1 for "Enter a url" placeholder text. - **User Experience:**
If a minimum luminosity contrast ratio of 4:5:1 is not provided between text and its background, it would be difficult for people with moderately low vision to read the text. Good luminance contrast accommodates diverse color deficiencies and provides contrast that is independent of color perception.
**Test Environment:**
OS: Windows 10 build 19608.1006
Browser: Edge - Anaheim - Version 85.0.545.0 (Official build) dev (64-bit)
URL: https://preview.pwabuilder.com/
Tool: Color Contrast Analyzer
**Repro Steps**
1. Open URL: https://www.preview.pwabuilder.com/ in Edge Anaheim dev browser.
2. Pwabuilder page will open.
3. Navigate to the "Enter a url" edit box.
4. Check the color contrast ratio of the above placeholder using "Color contrast analyzer" tool.
5. Observe the issue.
**Actual Result:**
Luminosity ratio is "1.64:1" which is less than required i.e. 4.5:1 for "Enter a url" placeholder text.
**Expected Result:**
Luminosity ratio should be minimum "4.5:"1 for "Enter a url" placeholder text.
**Note:**
Issue is also repro for the grey text "**PWA Builder was founded by Microsoft as a community guided, open source project to help move PWA adoption forward. Our Privacy Statement**" present at the bottom of the screen. Refer attachment "MAS1.4.3_Homepage_grey text present at the bottom_cca less than reqd".
This above issue is also repro for this Url also:
1.https://preview.pwabuilder.com/?url
2.https://preview.pwabuilder.com/?url=https%3A%2F%2Fwww.msn.com%2Fen-in%2F
**MAS Reference:**
https://microsoft.sharepoint.com/:w:/r/teams/msenable/_layouts/15/WopiFrame.aspx?sourcedoc={a73546c7-765f-489c-b18f-afb659fe99e6}


|
non_process
|
luminosity ratio is which is less than required i e for enter a url placeholder text user experience if a minimum luminosity contrast ratio of is not provided between text and its background it would be difficult for people with moderately low vision to read the text good luminance contrast accommodates diverse color deficiencies and provides contrast that is independent of color perception test environment os windows build browser edge anaheim version official build dev bit url tool color contrast analyzer repro steps open url in edge anaheim dev browser pwabuilder page will open navigate to the enter a url edit box check the color contrast ratio of the above placeholder using color contrast analyzer tool observe the issue actual result luminosity ratio is which is less than required i e for enter a url placeholder text expected result luminosity ratio should be minimum for enter a url placeholder text note issue is also repro for the grey text pwa builder was founded by microsoft as a community guided open source project to help move pwa adoption forward our privacy statement present at the bottom of the screen refer attachment homepage grey text present at the bottom cca less than reqd this above issue is also repro for this url also mas reference
| 0
|
35,700
| 5,003,292,856
|
IssuesEvent
|
2016-12-11 20:50:43
|
DynamoRIO/dynamorio
|
https://api.github.com/repos/DynamoRIO/dynamorio
|
closed
|
Mac SYS_sigaction takes a different struct type for the prev action
|
Bug-AppFail Component-Tests Hotlist-Release OpSys-OSX
|
The expanded linux.sigaction test from 53bc931 exposed an error in DR's handling of Mac sigaction: while the new action has the trampoline field, the old action does not -- yes, they are different struct types.
|
1.0
|
Mac SYS_sigaction takes a different struct type for the prev action - The expanded linux.sigaction test from 53bc931 exposed an error in DR's handling of Mac sigaction: while the new action has the trampoline field, the old action does not -- yes, they are different struct types.
|
non_process
|
mac sys sigaction takes a different struct type for the prev action the expanded linux sigaction test from exposed an error in dr s handling of mac sigaction while the new action has the trampoline field the old action does not yes they are different struct types
| 0
|
7,423
| 10,542,964,246
|
IssuesEvent
|
2019-10-02 14:09:51
|
uncrustify/uncrustify
|
https://api.github.com/repos/uncrustify/uncrustify
|
closed
|
It is not possible to exclude the CT_ASM_COLON for option sp_before_square in C asm
|
C and C++11 Preprocessor
|
Please refer to the option sp_before_square from uncrustify.cfg:
```
## Add or remove space before '[' (except '[]')
sp_before_square = remove # ignore/add/remove/force
```
Altough
```
#define SET_STACK(stack) \
do { \
__asm__ __volatile__ ( \
"mov S, %[oper]" \
: \
:[oper] "r" (stack) \
: "S" \
); \
} while(0)
```
is valid C, most code written is with space near the ':['
```
: [oper] "r" (stack)** \
```
So to make **sp_before_square** really useful we should be able to exclude this change
The easiest working diff for this is:
```
git diff ../src/space.cpp
diff --git a/src/space.cpp b/src/space.cpp
index ac967c90..1a1e7288 100644
--- a/src/space.cpp
+++ b/src/space.cpp
@@ -957,6 +957,10 @@ static iarf_e do_space(chunk_t *first, chunk_t *second, int &min_sp)
{
return(IARF_FORCE);
}
+ if (chunk_is_token(first, CT_ASM_COLON))
+ {
+ return(IARF_FORCE);
+ }
log_rule("sp_before_square");
return(options::sp_before_square());
```
But when this is not acceptable an option like
**sp_before_square_exclude_asm** could be added.
|
1.0
|
It is not possible to exclude the CT_ASM_COLON for option sp_before_square in C asm - Please refer to the option sp_before_square from uncrustify.cfg:
```
## Add or remove space before '[' (except '[]')
sp_before_square = remove # ignore/add/remove/force
```
Altough
```
#define SET_STACK(stack) \
do { \
__asm__ __volatile__ ( \
"mov S, %[oper]" \
: \
:[oper] "r" (stack) \
: "S" \
); \
} while(0)
```
is valid C, most code written is with space near the ':['
```
: [oper] "r" (stack)** \
```
So to make **sp_before_square** really useful we should be able to exclude this change
The easiest working diff for this is:
```
git diff ../src/space.cpp
diff --git a/src/space.cpp b/src/space.cpp
index ac967c90..1a1e7288 100644
--- a/src/space.cpp
+++ b/src/space.cpp
@@ -957,6 +957,10 @@ static iarf_e do_space(chunk_t *first, chunk_t *second, int &min_sp)
{
return(IARF_FORCE);
}
+ if (chunk_is_token(first, CT_ASM_COLON))
+ {
+ return(IARF_FORCE);
+ }
log_rule("sp_before_square");
return(options::sp_before_square());
```
But when this is not acceptable an option like
**sp_before_square_exclude_asm** could be added.
|
process
|
it is not possible to exclude the ct asm colon for option sp before square in c asm please refer to the option sp before square from uncrustify cfg add or remove space before sp before square remove ignore add remove force altough define set stack stack do asm volatile mov s r stack s while is valid c most code written is with space near the r stack so to make sp before square really useful we should be able to exclude this change the easiest working diff for this is git diff src space cpp diff git a src space cpp b src space cpp index a src space cpp b src space cpp static iarf e do space chunk t first chunk t second int min sp return iarf force if chunk is token first ct asm colon return iarf force log rule sp before square return options sp before square but when this is not acceptable an option like sp before square exclude asm could be added
| 1
|
213,843
| 16,540,815,177
|
IssuesEvent
|
2021-05-27 16:33:32
|
napari/napari
|
https://api.github.com/repos/napari/napari
|
opened
|
pre color test fail from numpy 1.21
|
bug tests
|
## 🐛 Bug
Our pre-tests are currently failing - it's a fairly innocuous fail, so no need to rush a release, but we should fix them. The test is passing an intentionally invalid `color = ('a', 1, 1, 1)` which is now raising a warning that didn't used to happen. I'm not quite sure how we want to fix, I can see many different ways - but given that no-one should really pass a color with a string in it as long as we do something reasonable I don't mind
```python-traceback
_________________________ test_invalid_colors[color6] __________________________
color = ('a', 1, 1, 1)
@pytest.mark.parametrize("color", invalid_colors)
def test_invalid_colors(color):
with pytest.raises((ValueError, AttributeError, KeyError)):
> transform_color(color)
napari/utils/colormaps/_tests/test_color_to_array.py:50:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
napari/utils/colormaps/standardize_color.py:65: in transform_color
return _color_switch[colortype](colors)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
colors = ('a', 1, 1, 1)
```
...
```python-traceback
> color_array = np.atleast_2d(np.asarray(colors))
E FutureWarning: Promotion of numbers and bools to strings is deprecated. In the future, code such as `np.concatenate((['string'], [0]))` will raise an error, while `np.asarray(['string', 0])` will return an array with `dtype=object`. To avoid the warning while retaining a string result use `dtype='U'` (or 'S'). To get an array of Python objects use `dtype=object`. (Warning added in NumPy 1.21)
```
|
1.0
|
pre color test fail from numpy 1.21 - ## 🐛 Bug
Our pre-tests are currently failing - it's a fairly innocuous fail, so no need to rush a release, but we should fix them. The test is passing an intentionally invalid `color = ('a', 1, 1, 1)` which is now raising a warning that didn't used to happen. I'm not quite sure how we want to fix, I can see many different ways - but given that no-one should really pass a color with a string in it as long as we do something reasonable I don't mind
```python-traceback
_________________________ test_invalid_colors[color6] __________________________
color = ('a', 1, 1, 1)
@pytest.mark.parametrize("color", invalid_colors)
def test_invalid_colors(color):
with pytest.raises((ValueError, AttributeError, KeyError)):
> transform_color(color)
napari/utils/colormaps/_tests/test_color_to_array.py:50:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
napari/utils/colormaps/standardize_color.py:65: in transform_color
return _color_switch[colortype](colors)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
colors = ('a', 1, 1, 1)
```
...
```python-traceback
> color_array = np.atleast_2d(np.asarray(colors))
E FutureWarning: Promotion of numbers and bools to strings is deprecated. In the future, code such as `np.concatenate((['string'], [0]))` will raise an error, while `np.asarray(['string', 0])` will return an array with `dtype=object`. To avoid the warning while retaining a string result use `dtype='U'` (or 'S'). To get an array of Python objects use `dtype=object`. (Warning added in NumPy 1.21)
```
|
non_process
|
pre color test fail from numpy 🐛 bug our pre tests are currently failing it s a fairly innocuous fail so no need to rush a release but we should fix them the test is passing an intentionally invalid color a which is now raising a warning that didn t used to happen i m not quite sure how we want to fix i can see many different ways but given that no one should really pass a color with a string in it as long as we do something reasonable i don t mind python traceback test invalid colors color a pytest mark parametrize color invalid colors def test invalid colors color with pytest raises valueerror attributeerror keyerror transform color color napari utils colormaps tests test color to array py napari utils colormaps standardize color py in transform color return color switch colors colors a python traceback color array np atleast np asarray colors e futurewarning promotion of numbers and bools to strings is deprecated in the future code such as np concatenate will raise an error while np asarray will return an array with dtype object to avoid the warning while retaining a string result use dtype u or s to get an array of python objects use dtype object warning added in numpy
| 0
|
48,960
| 10,311,191,420
|
IssuesEvent
|
2019-08-29 16:45:34
|
rbeezer/mathbook
|
https://api.github.com/repos/rbeezer/mathbook
|
opened
|
More economical knowl production
|
code cleanup z-xref
|
"xref" knowls can be built by searching the source for all cross-references and collecting these pointers. Perhaps duplicates can be removed in a manner similar to how the EPUB image list is created without duplicates. Then only necessary knowl content need be built.
Don't forget index entries and contributors.
The "-hidden" knowls are meant to impersonate a "born-hidden" knowl that is contained within a "xref" knowl. Example: a "xref" points to a "theorem", so there is knowl content which has been sanitized of unique id's, etc. But the "proof" is born hidden as part of the original theorem, so there needs to be a knowl inside the knowl, and its content must be sanitized also. This is where the "-hidden" knowl for the proof is employed. Check to see if these are only manufactured as part of building the born-hidden knowl. Note that these "-hidden" knowls have slightly different content than the "xref" knowls.
|
1.0
|
More economical knowl production - "xref" knowls can be built by searching the source for all cross-references and collecting these pointers. Perhaps duplicates can be removed in a manner similar to how the EPUB image list is created without duplicates. Then only necessary knowl content need be built.
Don't forget index entries and contributors.
The "-hidden" knowls are meant to impersonate a "born-hidden" knowl that is contained within a "xref" knowl. Example: a "xref" points to a "theorem", so there is knowl content which has been sanitized of unique id's, etc. But the "proof" is born hidden as part of the original theorem, so there needs to be a knowl inside the knowl, and its content must be sanitized also. This is where the "-hidden" knowl for the proof is employed. Check to see if these are only manufactured as part of building the born-hidden knowl. Note that these "-hidden" knowls have slightly different content than the "xref" knowls.
|
non_process
|
more economical knowl production xref knowls can be built by searching the source for all cross references and collecting these pointers perhaps duplicates can be removed in a manner similar to how the epub image list is created without duplicates then only necessary knowl content need be built don t forget index entries and contributors the hidden knowls are meant to impersonate a born hidden knowl that is contained within a xref knowl example a xref points to a theorem so there is knowl content which has been sanitized of unique id s etc but the proof is born hidden as part of the original theorem so there needs to be a knowl inside the knowl and its content must be sanitized also this is where the hidden knowl for the proof is employed check to see if these are only manufactured as part of building the born hidden knowl note that these hidden knowls have slightly different content than the xref knowls
| 0
|
59,046
| 14,365,925,811
|
IssuesEvent
|
2020-12-01 03:02:10
|
NixOS/nixpkgs
|
https://api.github.com/repos/NixOS/nixpkgs
|
closed
|
Vulnerability roundup 91: qemu-4.2.0: 1 advisory [7.9]
|
1.severity: security
|
[search](https://search.nix.gsc.io/?q=qemu&i=fosho&repos=NixOS-nixpkgs), [files](https://github.com/NixOS/nixpkgs/search?utf8=%E2%9C%93&q=qemu+in%3Apath&type=Code)
* [ ] [CVE-2020-15863](https://nvd.nist.gov/vuln/detail/CVE-2020-15863) CVSSv3=7.9 (nixos-20.03)
Scanned versions: nixos-20.03: 977000f149b. May contain false positives.
Cc @edolstra
|
True
|
Vulnerability roundup 91: qemu-4.2.0: 1 advisory [7.9] - [search](https://search.nix.gsc.io/?q=qemu&i=fosho&repos=NixOS-nixpkgs), [files](https://github.com/NixOS/nixpkgs/search?utf8=%E2%9C%93&q=qemu+in%3Apath&type=Code)
* [ ] [CVE-2020-15863](https://nvd.nist.gov/vuln/detail/CVE-2020-15863) CVSSv3=7.9 (nixos-20.03)
Scanned versions: nixos-20.03: 977000f149b. May contain false positives.
Cc @edolstra
|
non_process
|
vulnerability roundup qemu advisory nixos scanned versions nixos may contain false positives cc edolstra
| 0
|
13,474
| 15,982,307,071
|
IssuesEvent
|
2021-04-18 03:14:41
|
tdwg/dwc
|
https://api.github.com/repos/tdwg/dwc
|
opened
|
Change term - country
|
Class - Location Process - ready for public comment Term - change
|
## Change term
* Submitter: John Wieczorek (following issue raised by Ian Engelbrecht @ianengelbrecht)
* Justification (why is this change necessary?): Clarity
* Proponents (who needs this change): Everyone
Proposed new attributes of the term:
* Term name (in lowerCamelCase): country
* Organized in Class (e.g. Location, Taxon): Location
* Definition of the term: The name of the country or major administrative unit in which the Location occurs.
* Usage comments (recommendations regarding content, etc.): Recommended best practice is to use a controlled vocabulary such as the Getty Thesaurus of Geographic Names. Recommended best practice is to leave this field blank if the Location spans multiple entities at this administrative level or if the Location might be in one or another of multiple possible entities at this level. Multiplicity and uncertainty of the geographic entity can be captured either in the term higherGeography or in the term locality, or both.
* Examples: `Denmark`, `Colombia`, `España`
* Refines (identifier of the broader term this term refines, if applicable): None
* Replaces (identifier of the existing term that would be deprecated and replaced by this term, if applicable): http://rs.tdwg.org/dwc/terms/version/country-2017-10-06
* ABCD 2.06 (XPATH of the equivalent term in ABCD or EFG, if applicable): DataSets/DataSet/Units/Unit/Gathering/Country/Name
This change proposal arises from discussions in Issue #221 and https://github.com/tdwg/dwc-qa/issues/141.
I would like to recommend the same exact amendment to the usage notes for each of the geography terms continent, waterbody, islandGroup, island, stateProvince, country, municipality.
|
1.0
|
Change term - country - ## Change term
* Submitter: John Wieczorek (following issue raised by Ian Engelbrecht @ianengelbrecht)
* Justification (why is this change necessary?): Clarity
* Proponents (who needs this change): Everyone
Proposed new attributes of the term:
* Term name (in lowerCamelCase): country
* Organized in Class (e.g. Location, Taxon): Location
* Definition of the term: The name of the country or major administrative unit in which the Location occurs.
* Usage comments (recommendations regarding content, etc.): Recommended best practice is to use a controlled vocabulary such as the Getty Thesaurus of Geographic Names. Recommended best practice is to leave this field blank if the Location spans multiple entities at this administrative level or if the Location might be in one or another of multiple possible entities at this level. Multiplicity and uncertainty of the geographic entity can be captured either in the term higherGeography or in the term locality, or both.
* Examples: `Denmark`, `Colombia`, `España`
* Refines (identifier of the broader term this term refines, if applicable): None
* Replaces (identifier of the existing term that would be deprecated and replaced by this term, if applicable): http://rs.tdwg.org/dwc/terms/version/country-2017-10-06
* ABCD 2.06 (XPATH of the equivalent term in ABCD or EFG, if applicable): DataSets/DataSet/Units/Unit/Gathering/Country/Name
This change proposal arises from discussions in Issue #221 and https://github.com/tdwg/dwc-qa/issues/141.
I would like to recommend the same exact amendment to the usage notes for each of the geography terms continent, waterbody, islandGroup, island, stateProvince, country, municipality.
|
process
|
change term country change term submitter john wieczorek following issue raised by ian engelbrecht ianengelbrecht justification why is this change necessary clarity proponents who needs this change everyone proposed new attributes of the term term name in lowercamelcase country organized in class e g location taxon location definition of the term the name of the country or major administrative unit in which the location occurs usage comments recommendations regarding content etc recommended best practice is to use a controlled vocabulary such as the getty thesaurus of geographic names recommended best practice is to leave this field blank if the location spans multiple entities at this administrative level or if the location might be in one or another of multiple possible entities at this level multiplicity and uncertainty of the geographic entity can be captured either in the term highergeography or in the term locality or both examples denmark colombia españa refines identifier of the broader term this term refines if applicable none replaces identifier of the existing term that would be deprecated and replaced by this term if applicable abcd xpath of the equivalent term in abcd or efg if applicable datasets dataset units unit gathering country name this change proposal arises from discussions in issue and i would like to recommend the same exact amendment to the usage notes for each of the geography terms continent waterbody islandgroup island stateprovince country municipality
| 1
|
15,157
| 18,908,990,458
|
IssuesEvent
|
2021-11-16 12:11:16
|
alphagov/govuk-design-system
|
https://api.github.com/repos/alphagov/govuk-design-system
|
closed
|
Enough analytics to pass live assessment
|
epic analytics refining team processes
|
## What
Do enough analytics to pass the live assessment
## Why
There's a backlog of analytics we need to do on the team. We plan to narrow the scope to doing enough analytics to pass the live assessment before we tackle some of the other areas.
Part of this work will also be looking at some of the analytics work we already started with the implementation of GA, reviewing support data.
## Who needs to know about this
PA, PM, CD, UR
## Done when
- [x] https://github.com/alphagov/govuk-design-system/issues/1820
- [x] [Get GA live on the site](https://github.com/alphagov/govuk-design-system/issues/1610) - Epic
- [x] [Set a date for live assessment](https://github.com/alphagov/govuk-design-system/issues/1837)
- [x] [Set up search tracking on the design system website](https://github.com/alphagov/govuk-design-system/issues/1623)
- [x] [Refine the performance framework ](https://github.com/alphagov/govuk-design-system/issues/1839)
- [x] [Set up dashboard to monitor internal site search ](https://github.com/alphagov/govuk-design-system/issues/1838)
- [x] [Set up dashboard to monitor email performance](https://github.com/alphagov/govuk-design-system/issues/1840)
- [x] [Set up heatmap dashboard for the design system website homepage ](https://github.com/alphagov/govuk-design-system/issues/1841)
- [x] [Analyse performance of the website and identity some key journeys](https://github.com/alphagov/govuk-design-system/issues/1842)
- [x] [Create a slide deck for live assessment](https://github.com/alphagov/govuk-design-system/issues/1844)
- [x] Do the live assessment
|
1.0
|
Enough analytics to pass live assessment - ## What
Do enough analytics to pass the live assessment
## Why
There's a backlog of analytics we need to do on the team. We plan to narrow the scope to doing enough analytics to pass the live assessment before we tackle some of the other areas.
Part of this work will also be looking at some of the analytics work we already started with the implementation of GA, reviewing support data.
## Who needs to know about this
PA, PM, CD, UR
## Done when
- [x] https://github.com/alphagov/govuk-design-system/issues/1820
- [x] [Get GA live on the site](https://github.com/alphagov/govuk-design-system/issues/1610) - Epic
- [x] [Set a date for live assessment](https://github.com/alphagov/govuk-design-system/issues/1837)
- [x] [Set up search tracking on the design system website](https://github.com/alphagov/govuk-design-system/issues/1623)
- [x] [Refine the performance framework ](https://github.com/alphagov/govuk-design-system/issues/1839)
- [x] [Set up dashboard to monitor internal site search ](https://github.com/alphagov/govuk-design-system/issues/1838)
- [x] [Set up dashboard to monitor email performance](https://github.com/alphagov/govuk-design-system/issues/1840)
- [x] [Set up heatmap dashboard for the design system website homepage ](https://github.com/alphagov/govuk-design-system/issues/1841)
- [x] [Analyse performance of the website and identity some key journeys](https://github.com/alphagov/govuk-design-system/issues/1842)
- [x] [Create a slide deck for live assessment](https://github.com/alphagov/govuk-design-system/issues/1844)
- [x] Do the live assessment
|
process
|
enough analytics to pass live assessment what do enough analytics to pass the live assessment why there s a backlog of analytics we need to do on the team we plan to narrow the scope to doing enough analytics to pass the live assessment before we tackle some of the other areas part of this work will also be looking at some of the analytics work we already started with the implementation of ga reviewing support data who needs to know about this pa pm cd ur done when epic do the live assessment
| 1
|
13,324
| 15,786,849,534
|
IssuesEvent
|
2021-04-01 18:20:52
|
bitpal/bitpal_umbrella
|
https://api.github.com/repos/bitpal/bitpal_umbrella
|
opened
|
We should always operate on base units
|
Payment processor
|
We shouldn't deal with floats or ints directly, only the general `BaseUnit` and the exact `Decimal`, to avoid accidental misuse or floating point issues.
|
1.0
|
We should always operate on base units - We shouldn't deal with floats or ints directly, only the general `BaseUnit` and the exact `Decimal`, to avoid accidental misuse or floating point issues.
|
process
|
we should always operate on base units we shouldn t deal with floats or ints directly only the general baseunit and the exact decimal to avoid accidental misuse or floating point issues
| 1
|
17,343
| 23,166,872,356
|
IssuesEvent
|
2022-07-30 04:37:08
|
NationalSecurityAgency/ghidra
|
https://api.github.com/repos/NationalSecurityAgency/ghidra
|
closed
|
Ghidra Ignores Sign-Extension of Memory Operand Displacement Values
|
Feature: Processor/x86 Status: Internal
|
**Describe the bug**
Displacement values (as I understand them) when calculated as part of an effective memory address should be sign-extended to the size specified in the instruction encoding. While Ghidra correctly identifies the size of the displacement value, it does not sign-extend it. This produces inaccurate/misleading disassembly and incorrect instruction semantics in PCode.
Consider the following opcodes:
`64 48 8b 0c 25 c8 ff ff ff`
IDA Disassembly:
`mov rcx, fs:0FFFFFFFFFFFFFFC8h`
Binary Ninja Disassembly:
`mov rcx, qword [fs:0xffffffffffffffc8]`
Zydis Disassembly:
`mov rcx, qword ptr fs:[0xFFFFFFFFFFFFFFC8]`
Ghidra Disassembly:
`MOV RCX,qword ptr FS:[0xffffffc8]`
Additionally, the PCode semantics for this instruction results in the following:
```
$U4f00:8 = INT_ADD FS_OFFSET, 0xffffffc8:8
$Uc000:8 = LOAD ram($U4f00:8)
RCX = COPY $Uc000:8
```
In this situation, the value `0xffffffc8` is of size `8` (64-bits), so the result of the add operation (which is also 64-bits in size) will not result in the correct result. Consider the semantics produced by two other popular analysis tools:
BinaryAnalysisPlatform (BAP) Semantics:
```
RCX := mem[FS_BASE - 0x38, el]:u64
```
Binary Ninja Semantics:
```
rcx = [fsbase - 0x38].q
```
**To Reproduce**
Steps to reproduce the behavior:
1. Write the opcodes `64 48 8b 0c 25 c8 ff ff ff` into a binary file
2. Load the file in Ghidra and disassemble as x86-64
3. View Disassembly and PCode
**Expected behavior**
The results of the disassembly to sign-extend the displacement in a manner consistent with other tools.
**Environment (please complete the following information):**
- OS: Ubuntu 20.04
- Java Version: 11.0.7
- Ghidra Version: 10.1.4
- Ghidra Origin: Official GitHub release
|
1.0
|
Ghidra Ignores Sign-Extension of Memory Operand Displacement Values - **Describe the bug**
Displacement values (as I understand them) when calculated as part of an effective memory address should be sign-extended to the size specified in the instruction encoding. While Ghidra correctly identifies the size of the displacement value, it does not sign-extend it. This produces inaccurate/misleading disassembly and incorrect instruction semantics in PCode.
Consider the following opcodes:
`64 48 8b 0c 25 c8 ff ff ff`
IDA Disassembly:
`mov rcx, fs:0FFFFFFFFFFFFFFC8h`
Binary Ninja Disassembly:
`mov rcx, qword [fs:0xffffffffffffffc8]`
Zydis Disassembly:
`mov rcx, qword ptr fs:[0xFFFFFFFFFFFFFFC8]`
Ghidra Disassembly:
`MOV RCX,qword ptr FS:[0xffffffc8]`
Additionally, the PCode semantics for this instruction results in the following:
```
$U4f00:8 = INT_ADD FS_OFFSET, 0xffffffc8:8
$Uc000:8 = LOAD ram($U4f00:8)
RCX = COPY $Uc000:8
```
In this situation, the value `0xffffffc8` is of size `8` (64-bits), so the result of the add operation (which is also 64-bits in size) will not result in the correct result. Consider the semantics produced by two other popular analysis tools:
BinaryAnalysisPlatform (BAP) Semantics:
```
RCX := mem[FS_BASE - 0x38, el]:u64
```
Binary Ninja Semantics:
```
rcx = [fsbase - 0x38].q
```
**To Reproduce**
Steps to reproduce the behavior:
1. Write the opcodes `64 48 8b 0c 25 c8 ff ff ff` into a binary file
2. Load the file in Ghidra and disassemble as x86-64
3. View Disassembly and PCode
**Expected behavior**
The results of the disassembly to sign-extend the displacement in a manner consistent with other tools.
**Environment (please complete the following information):**
- OS: Ubuntu 20.04
- Java Version: 11.0.7
- Ghidra Version: 10.1.4
- Ghidra Origin: Official GitHub release
|
process
|
ghidra ignores sign extension of memory operand displacement values describe the bug displacement values as i understand them when calculated as part of an effective memory address should be sign extended to the size specified in the instruction encoding while ghidra correctly identifies the size of the displacement value it does not sign extend it this produces inaccurate misleading disassembly and incorrect instruction semantics in pcode consider the following opcodes ff ff ff ida disassembly mov rcx fs binary ninja disassembly mov rcx qword zydis disassembly mov rcx qword ptr fs ghidra disassembly mov rcx qword ptr fs additionally the pcode semantics for this instruction results in the following int add fs offset load ram rcx copy in this situation the value is of size bits so the result of the add operation which is also bits in size will not result in the correct result consider the semantics produced by two other popular analysis tools binaryanalysisplatform bap semantics rcx mem binary ninja semantics rcx q to reproduce steps to reproduce the behavior write the opcodes ff ff ff into a binary file load the file in ghidra and disassemble as view disassembly and pcode expected behavior the results of the disassembly to sign extend the displacement in a manner consistent with other tools environment please complete the following information os ubuntu java version ghidra version ghidra origin official github release
| 1
|
19,792
| 26,177,057,299
|
IssuesEvent
|
2023-01-02 11:04:58
|
Sunbird-cQube/community
|
https://api.github.com/repos/Sunbird-cQube/community
|
closed
|
move files generated through configuration( data processing config file) to AWS S3 or on-premise or AZURE blob storage instead of storing in cQube installed instance/machine.
|
enhancement Backlog Processing cQube 4.0 - To be analysed
|
**Current design** : Files are generated through data processing configuration are stored in the instance/machine where cQube is installed.
**Proposed design**: move all files generated by data processing configuration to AWS S3 or on-premise or AZURE blob storage instead of storing in cQube installed instance/machine.
|
1.0
|
move files generated through configuration( data processing config file) to AWS S3 or on-premise or AZURE blob storage instead of storing in cQube installed instance/machine. - **Current design** : Files are generated through data processing configuration are stored in the instance/machine where cQube is installed.
**Proposed design**: move all files generated by data processing configuration to AWS S3 or on-premise or AZURE blob storage instead of storing in cQube installed instance/machine.
|
process
|
move files generated through configuration data processing config file to aws or on premise or azure blob storage instead of storing in cqube installed instance machine current design files are generated through data processing configuration are stored in the instance machine where cqube is installed proposed design move all files generated by data processing configuration to aws or on premise or azure blob storage instead of storing in cqube installed instance machine
| 1
|
327,400
| 24,133,574,285
|
IssuesEvent
|
2022-09-21 09:25:21
|
falcosecurity/libs
|
https://api.github.com/repos/falcosecurity/libs
|
opened
|
[TRACKING] Libs Release
|
kind/documentation
|
As you may know for this libs release we will try a new form of release process using a release branch, you can find more details in this document :point_down:
https://github.com/falcosecurity/libs/blob/33ca24e29790092cf5176fee5fd02cfaa1f5591a/release.md
We have already terminated the [preparation phase](https://github.com/falcosecurity/libs/blob/33ca24e29790092cf5176fee5fd02cfaa1f5591a/release.md#preparation) and all the PRs have the right milestone, so we can directly proceed with the [code freeze phase](https://github.com/falcosecurity/libs/blob/33ca24e29790092cf5176fee5fd02cfaa1f5591a/release.md#preparation)!
During this period the rationale is that no new-feature PRs are allowed to be merged (apart from [exceptions](https://github.com/falcosecurity/libs/blob/33ca24e29790092cf5176fee5fd02cfaa1f5591a/release.md#exceptions)). We will use this time to test the libraries before opening the release branch and tagging the code. Any help in testing is always welcome! :)
As always we will use this issue to keep you updated about the release process, so if you need updates this is the right place to come :)
|
1.0
|
[TRACKING] Libs Release - As you may know for this libs release we will try a new form of release process using a release branch, you can find more details in this document :point_down:
https://github.com/falcosecurity/libs/blob/33ca24e29790092cf5176fee5fd02cfaa1f5591a/release.md
We have already terminated the [preparation phase](https://github.com/falcosecurity/libs/blob/33ca24e29790092cf5176fee5fd02cfaa1f5591a/release.md#preparation) and all the PRs have the right milestone, so we can directly proceed with the [code freeze phase](https://github.com/falcosecurity/libs/blob/33ca24e29790092cf5176fee5fd02cfaa1f5591a/release.md#preparation)!
During this period the rationale is that no new-feature PRs are allowed to be merged (apart from [exceptions](https://github.com/falcosecurity/libs/blob/33ca24e29790092cf5176fee5fd02cfaa1f5591a/release.md#exceptions)). We will use this time to test the libraries before opening the release branch and tagging the code. Any help in testing is always welcome! :)
As always we will use this issue to keep you updated about the release process, so if you need updates this is the right place to come :)
|
non_process
|
libs release as you may know for this libs release we will try a new form of release process using a release branch you can find more details in this document point down we have already terminated the and all the prs have the right milestone so we can directly proceed with the during this period the rationale is that no new feature prs are allowed to be merged apart from we will use this time to test the libraries before opening the release branch and tagging the code any help in testing is always welcome as always we will use this issue to keep you updated about the release process so if you need updates this is the right place to come
| 0
|
3,281
| 6,368,817,658
|
IssuesEvent
|
2017-08-01 10:09:53
|
zero-os/0-stor
|
https://api.github.com/repos/zero-os/0-stor
|
closed
|
new method to allow new consumer to (de)register as user to the data
|
process_wontfix
|
- to add consumer to consumer list
- also remove
use case
- allow multiple consumers to allocate same data
|
1.0
|
new method to allow new consumer to (de)register as user to the data - - to add consumer to consumer list
- also remove
use case
- allow multiple consumers to allocate same data
|
process
|
new method to allow new consumer to de register as user to the data to add consumer to consumer list also remove use case allow multiple consumers to allocate same data
| 1
|
17,437
| 23,261,037,938
|
IssuesEvent
|
2022-08-04 13:31:43
|
OpenDataScotland/the_od_bods
|
https://api.github.com/repos/OpenDataScotland/the_od_bods
|
opened
|
Add Stagecoach Open Data as a source
|
research data processing
|
On the back of some Twitter enquiries about bus open data I discovered Stagecoach publish their schedules and fares as open data: https://www.stagecoachbus.com/open-data
As these are just file downloads as a page we'll need to write a scraper for this.
## Considerations
- What is the license for these? It doesn't appear to be explicitly stated. Maybe worth getting in touch with Stagecoach on the email address on the page to ask them
- The file downloads themselves are just zip files that are split up by region that contain XML files
- Should we consider unzipping these files and serving the individual XML files?
- The file downloads cover regions outside Scotland (e.g. England and Wales). Should we include these?
|
1.0
|
Add Stagecoach Open Data as a source - On the back of some Twitter enquiries about bus open data I discovered Stagecoach publish their schedules and fares as open data: https://www.stagecoachbus.com/open-data
As these are just file downloads as a page we'll need to write a scraper for this.
## Considerations
- What is the license for these? It doesn't appear to be explicitly stated. Maybe worth getting in touch with Stagecoach on the email address on the page to ask them
- The file downloads themselves are just zip files that are split up by region that contain XML files
- Should we consider unzipping these files and serving the individual XML files?
- The file downloads cover regions outside Scotland (e.g. England and Wales). Should we include these?
|
process
|
add stagecoach open data as a source on the back of some twitter enquiries about bus open data i discovered stagecoach publish their schedules and fares as open data as these are just file downloads as a page we ll need to write a scraper for this considerations what is the license for these it doesn t appear to be explicitly stated maybe worth getting in touch with stagecoach on the email address on the page to ask them the file downloads themselves are just zip files that are split up by region that contain xml files should we consider unzipping these files and serving the individual xml files the file downloads cover regions outside scotland e g england and wales should we include these
| 1
|
19,363
| 25,493,317,104
|
IssuesEvent
|
2022-11-27 11:10:20
|
altillimity/SatDump
|
https://api.github.com/repos/altillimity/SatDump
|
closed
|
Himawaricast - stol argument out of range
|
bug Processing
|
**Description of the issue**
When trying to process himawaricast transport stream
**Hardware (SDR/PC/OS)**
TBS5927
**Version (Eg, 1.0.0, CI Build #171)**
**Logs after the cras (satdump.logs)**
[11/22/22 - 13:30:53] (I) Using input frames E:\DVB_stream\0.0E_4148.146_H_2586_(2021-05-03 02.58.45)_dump.ts
[11/22/22 - 13:30:53] (I) Decoding to C:\Users\john\Desktop\HimawariTest
[11/22/22 - 13:30:53] (I) New xRIT file : IMG_DK01B11_202105021750_004
[11/22/22 - 13:30:53] (D) This is image data. Size 2750x275
[11/22/22 - 13:30:53] (E) Fatal error running pipeline : stol argument out of range
[11/22/22 - 13:31:12] (I) Saving user config at settings.json
[11/22/22 - 13:31:12] (I) UI Exit
[11/22/22 - 13:31:12] (I) Exiting!
**Other info (Eg, Screenshots) / Files useful for debugging (CADU, etc)**

|
1.0
|
Himawaricast - stol argument out of range - **Description of the issue**
When trying to process himawaricast transport stream
**Hardware (SDR/PC/OS)**
TBS5927
**Version (Eg, 1.0.0, CI Build #171)**
**Logs after the cras (satdump.logs)**
[11/22/22 - 13:30:53] (I) Using input frames E:\DVB_stream\0.0E_4148.146_H_2586_(2021-05-03 02.58.45)_dump.ts
[11/22/22 - 13:30:53] (I) Decoding to C:\Users\john\Desktop\HimawariTest
[11/22/22 - 13:30:53] (I) New xRIT file : IMG_DK01B11_202105021750_004
[11/22/22 - 13:30:53] (D) This is image data. Size 2750x275
[11/22/22 - 13:30:53] (E) Fatal error running pipeline : stol argument out of range
[11/22/22 - 13:31:12] (I) Saving user config at settings.json
[11/22/22 - 13:31:12] (I) UI Exit
[11/22/22 - 13:31:12] (I) Exiting!
**Other info (Eg, Screenshots) / Files useful for debugging (CADU, etc)**

|
process
|
himawaricast stol argument out of range description of the issue when trying to process himawaricast transport stream hardware sdr pc os version eg ci build logs after the cras satdump logs i using input frames e dvb stream h dump ts i decoding to c users john desktop himawaritest i new xrit file img d this is image data size e fatal error running pipeline stol argument out of range i saving user config at settings json i ui exit i exiting other info eg screenshots files useful for debugging cadu etc
| 1
|
2,430
| 5,205,204,671
|
IssuesEvent
|
2017-01-24 17:19:37
|
AnalyticalGraphicsInc/cesium
|
https://api.github.com/repos/AnalyticalGraphicsInc/cesium
|
reopened
|
Support CORS in Cesium's development server
|
beginner dev process
|
At least when it is running locally.
After
```
app.use(compression());
```
add
```
app.use(function(req, res, next) {
res.header("Access-Control-Allow-Origin", "*");
res.header("Access-Control-Allow-Headers", "Origin, X-Requested-With, Content-Type, Accept");
next();
});
```
|
1.0
|
Support CORS in Cesium's development server - At least when it is running locally.
After
```
app.use(compression());
```
add
```
app.use(function(req, res, next) {
res.header("Access-Control-Allow-Origin", "*");
res.header("Access-Control-Allow-Headers", "Origin, X-Requested-With, Content-Type, Accept");
next();
});
```
|
process
|
support cors in cesium s development server at least when it is running locally after app use compression add app use function req res next res header access control allow origin res header access control allow headers origin x requested with content type accept next
| 1
|
11,902
| 14,697,828,814
|
IssuesEvent
|
2021-01-04 04:44:52
|
vionwinnie/tableTennis-product-review
|
https://api.github.com/repos/vionwinnie/tableTennis-product-review
|
closed
|
Clean comment data for identifying comparative statement
|
data preprocessing
|
- creating a wordcloud for identifying useful lexicon using existing comment
|
1.0
|
Clean comment data for identifying comparative statement - - creating a wordcloud for identifying useful lexicon using existing comment
|
process
|
clean comment data for identifying comparative statement creating a wordcloud for identifying useful lexicon using existing comment
| 1
|
10,373
| 13,190,848,189
|
IssuesEvent
|
2020-08-13 10:58:41
|
OUDcollective/twenty20times
|
https://api.github.com/repos/OUDcollective/twenty20times
|
closed
|
Linking a pull request to an issue - GitHub Help
|
workflow-process
|

---
**Source URL**:
[https://help.github.com/en/github/managing-your-work-on-github/linking-a-pull-request-to-an-issue](https://help.github.com/en/github/managing-your-work-on-github/linking-a-pull-request-to-an-issue)
<table><tr><td><strong>Browser</strong></td><td>Chrome 84.0.4147.56</td></tr><tr><td><strong>OS</strong></td><td>Windows 10 64-bit</td></tr><tr><td><strong>Screen Size</strong></td><td>2560x1080</td></tr><tr><td><strong>Viewport Size</strong></td><td>2560x888</td></tr><tr><td><strong>Pixel Ratio</strong></td><td>@1x</td></tr><tr><td><strong>Zoom Level</strong></td><td>100%</td></tr></table>
|
1.0
|
Linking a pull request to an issue - GitHub Help - 
---
**Source URL**:
[https://help.github.com/en/github/managing-your-work-on-github/linking-a-pull-request-to-an-issue](https://help.github.com/en/github/managing-your-work-on-github/linking-a-pull-request-to-an-issue)
<table><tr><td><strong>Browser</strong></td><td>Chrome 84.0.4147.56</td></tr><tr><td><strong>OS</strong></td><td>Windows 10 64-bit</td></tr><tr><td><strong>Screen Size</strong></td><td>2560x1080</td></tr><tr><td><strong>Viewport Size</strong></td><td>2560x888</td></tr><tr><td><strong>Pixel Ratio</strong></td><td>@1x</td></tr><tr><td><strong>Zoom Level</strong></td><td>100%</td></tr></table>
|
process
|
linking a pull request to an issue github help source url browser chrome os windows bit screen size viewport size pixel ratio zoom level
| 1
|
1,974
| 4,804,047,596
|
IssuesEvent
|
2016-11-02 12:15:55
|
bogas04/SikhJS
|
https://api.github.com/repos/bogas04/SikhJS
|
opened
|
Use webpack 2 and webpack-devserver
|
devprocess
|
Since you don't have a backend, using webpack-devserver is better than `ws` and `webpack -w`.
Webpack 2 with treeshaking will hopefully provided smaller bundle size.
* [ ] webpack 2
* [ ] webpack-devserver
* [ ] React hot reloading
|
1.0
|
Use webpack 2 and webpack-devserver - Since you don't have a backend, using webpack-devserver is better than `ws` and `webpack -w`.
Webpack 2 with treeshaking will hopefully provided smaller bundle size.
* [ ] webpack 2
* [ ] webpack-devserver
* [ ] React hot reloading
|
process
|
use webpack and webpack devserver since you don t have a backend using webpack devserver is better than ws and webpack w webpack with treeshaking will hopefully provided smaller bundle size webpack webpack devserver react hot reloading
| 1
|
17,330
| 23,146,366,412
|
IssuesEvent
|
2022-07-29 01:34:43
|
knative/serving
|
https://api.github.com/repos/knative/serving
|
closed
|
Mitigating Cold Start times: Prewarmed pods pool
|
kind/feature area/autoscale kind/process lifecycle/stale
|
/area autoscale
/kind process
## Describe the feature
The feature we wanted to request is to implement a prewarmed pool of pods, so the time used in requesting and launching the pod with the code could be reduced.
The idea would be to have some kind of pool of pods ready to receive a code injection, so the time of the cold start would be only the time that the code takes to be ready. (More or less, like in Fission, related to the 4.5.1 chapter, 2 section, part (a): PoolManager: https://atlarge-research.com/pdfs/Petar_Galic___Literature_Survey___Function_Management_Layer_Serverless.pdf)
The case for the miniscale=1, that always has a pod ready to receive requests, is a bit inefficient for the production scenario, since it would consume tons of resources in case we had 100+ different pods, which don't receive calls to often.
The objective would be to take the less amount of time to give a response without having to maintain an instance of each different pod.
(refs.
https://atlarge-research.com/pdfs/Petar_Galic___Literature_Survey___Function_Management_Layer_Serverless.pdf
https://arxiv.org/pdf/1903.12221.pdf)
|
1.0
|
Mitigating Cold Start times: Prewarmed pods pool - /area autoscale
/kind process
## Describe the feature
The feature we wanted to request is to implement a prewarmed pool of pods, so the time used in requesting and launching the pod with the code could be reduced.
The idea would be to have some kind of pool of pods ready to receive a code injection, so the time of the cold start would be only the time that the code takes to be ready. (More or less, like in Fission, related to the 4.5.1 chapter, 2 section, part (a): PoolManager: https://atlarge-research.com/pdfs/Petar_Galic___Literature_Survey___Function_Management_Layer_Serverless.pdf)
The case for the miniscale=1, that always has a pod ready to receive requests, is a bit inefficient for the production scenario, since it would consume tons of resources in case we had 100+ different pods, which don't receive calls to often.
The objective would be to take the less amount of time to give a response without having to maintain an instance of each different pod.
(refs.
https://atlarge-research.com/pdfs/Petar_Galic___Literature_Survey___Function_Management_Layer_Serverless.pdf
https://arxiv.org/pdf/1903.12221.pdf)
|
process
|
mitigating cold start times prewarmed pods pool area autoscale kind process describe the feature the feature we wanted to request is to implement a prewarmed pool of pods so the time used in requesting and launching the pod with the code could be reduced the idea would be to have some kind of pool of pods ready to receive a code injection so the time of the cold start would be only the time that the code takes to be ready more or less like in fission related to the chapter section part a poolmanager the case for the miniscale that always has a pod ready to receive requests is a bit inefficient for the production scenario since it would consume tons of resources in case we had different pods which don t receive calls to often the objective would be to take the less amount of time to give a response without having to maintain an instance of each different pod refs
| 1
|
5,960
| 3,703,084,060
|
IssuesEvent
|
2016-02-29 19:07:36
|
deis/deis
|
https://api.github.com/repos/deis/deis
|
closed
|
deis-builder 1.10.0 is spitting 'Failed handshake: EOF' once per ~5 seconds
|
builder easy-fix
|
Log as seen in papertrail:
```
Sep 08 15:07:11 52.22.0.199 logger: 2015-09-08T22:07:11UTC deis-builder[1]: [error] Failed handshake: EOF (&{{0xc82046a2a0}})
Sep 08 15:07:11 52.22.0.199 logger: 2015-09-08T22:07:11UTC deis-builder[1]: [info] Accepted connection.
Sep 08 15:07:11 52.22.0.199 logger: 2015-09-08T22:07:11UTC deis-builder[1]: [info] Checking closer.
Sep 08 15:07:17 52.22.0.199 logger: 2015-09-08T22:07:17UTC deis-builder[1]: [error] Failed handshake: EOF (&{{0xc82017f7a0}})
Sep 08 15:07:21 52.22.0.199 logger: 2015-09-08T22:07:21UTC deis-builder[1]: [info] Accepted connection.
Sep 08 15:07:21 52.22.0.199 logger: 2015-09-08T22:07:21UTC deis-builder[1]: [info] Checking closer.
Sep 08 15:07:24 52.22.0.199 logger: 2015-09-08T22:07:24UTC deis-builder[1]: [error] Failed handshake: EOF (&{{0xc820436770}})
Sep 08 15:07:25 52.22.0.199 logger: 2015-09-08T22:07:25UTC deis-builder[1]: [info] Accepted connection.
Sep 08 15:07:25 52.22.0.199 logger: 2015-09-08T22:07:25UTC deis-builder[1]: [info] Checking closer.
Sep 08 15:07:29 52.22.0.199 logger: 2015-09-08T22:07:29UTC deis-builder[1]: [info] Checking closer.
Sep 08 15:07:29 52.22.0.199 logger: 2015-09-08T22:07:29UTC deis-builder[1]: [info] Accepted connection.
Sep 08 15:07:29 52.22.0.199 logger: 2015-09-08T22:07:29UTC deis-builder[1]: [error] Failed handshake: EOF (&{{0xc820048fc0}})
Sep 08 15:07:32 52.22.0.199 logger: 2015-09-08T22:07:32UTC deis-builder[1]: [error] Failed handshake: EOF (&{{0xc8202682a0}})
Sep 08 15:07:33 52.22.0.199 logger: 2015-09-08T22:07:33UTC deis-builder[1]: [info] Checking closer.
Sep 08 15:07:33 52.22.0.199 logger: 2015-09-08T22:07:33UTC deis-builder[1]: [info] Accepted connection.
Sep 08 15:07:40 52.22.0.199 logger: 2015-09-08T22:07:40UTC deis-builder[1]: [error] Failed handshake: EOF (&{{0xc8201bc7e0}})
```
|
1.0
|
deis-builder 1.10.0 is spitting 'Failed handshake: EOF' once per ~5 seconds - Log as seen in papertrail:
```
Sep 08 15:07:11 52.22.0.199 logger: 2015-09-08T22:07:11UTC deis-builder[1]: [error] Failed handshake: EOF (&{{0xc82046a2a0}})
Sep 08 15:07:11 52.22.0.199 logger: 2015-09-08T22:07:11UTC deis-builder[1]: [info] Accepted connection.
Sep 08 15:07:11 52.22.0.199 logger: 2015-09-08T22:07:11UTC deis-builder[1]: [info] Checking closer.
Sep 08 15:07:17 52.22.0.199 logger: 2015-09-08T22:07:17UTC deis-builder[1]: [error] Failed handshake: EOF (&{{0xc82017f7a0}})
Sep 08 15:07:21 52.22.0.199 logger: 2015-09-08T22:07:21UTC deis-builder[1]: [info] Accepted connection.
Sep 08 15:07:21 52.22.0.199 logger: 2015-09-08T22:07:21UTC deis-builder[1]: [info] Checking closer.
Sep 08 15:07:24 52.22.0.199 logger: 2015-09-08T22:07:24UTC deis-builder[1]: [error] Failed handshake: EOF (&{{0xc820436770}})
Sep 08 15:07:25 52.22.0.199 logger: 2015-09-08T22:07:25UTC deis-builder[1]: [info] Accepted connection.
Sep 08 15:07:25 52.22.0.199 logger: 2015-09-08T22:07:25UTC deis-builder[1]: [info] Checking closer.
Sep 08 15:07:29 52.22.0.199 logger: 2015-09-08T22:07:29UTC deis-builder[1]: [info] Checking closer.
Sep 08 15:07:29 52.22.0.199 logger: 2015-09-08T22:07:29UTC deis-builder[1]: [info] Accepted connection.
Sep 08 15:07:29 52.22.0.199 logger: 2015-09-08T22:07:29UTC deis-builder[1]: [error] Failed handshake: EOF (&{{0xc820048fc0}})
Sep 08 15:07:32 52.22.0.199 logger: 2015-09-08T22:07:32UTC deis-builder[1]: [error] Failed handshake: EOF (&{{0xc8202682a0}})
Sep 08 15:07:33 52.22.0.199 logger: 2015-09-08T22:07:33UTC deis-builder[1]: [info] Checking closer.
Sep 08 15:07:33 52.22.0.199 logger: 2015-09-08T22:07:33UTC deis-builder[1]: [info] Accepted connection.
Sep 08 15:07:40 52.22.0.199 logger: 2015-09-08T22:07:40UTC deis-builder[1]: [error] Failed handshake: EOF (&{{0xc8201bc7e0}})
```
|
non_process
|
deis builder is spitting failed handshake eof once per seconds log as seen in papertrail sep logger deis builder failed handshake eof sep logger deis builder accepted connection sep logger deis builder checking closer sep logger deis builder failed handshake eof sep logger deis builder accepted connection sep logger deis builder checking closer sep logger deis builder failed handshake eof sep logger deis builder accepted connection sep logger deis builder checking closer sep logger deis builder checking closer sep logger deis builder accepted connection sep logger deis builder failed handshake eof sep logger deis builder failed handshake eof sep logger deis builder checking closer sep logger deis builder accepted connection sep logger deis builder failed handshake eof
| 0
|
768,095
| 26,953,259,076
|
IssuesEvent
|
2023-02-08 13:15:47
|
meower-media-co/Meower-Server
|
https://api.github.com/repos/meower-media-co/Meower-Server
|
closed
|
[cl4] Duplicate home posts
|
bug Cloudlink4 High Priority
|
**Describe the bug**
With a fresh install of a server with only one post in the homepage, the post is duplicated when accessing /v1/home.
**To Reproduce**
Steps to reproduce the behavior:
1. Authenticate with API: POST /v1/auth/password
2. POST /v1/home with a new post
3. GET /v1/home
**Expected behavior**
A single instance of a post ID should appear.
**Logs**
```
[2023-02-07 15:56:42 -0500] [14548] [DEBUG] Process ack: Sanic-Server-0-0 [14548]
[2023-02-07 15:56:42 -0500] [14548] [INFO] Starting worker [14548]
[2023-02-07 15:56:44 -0500] - (sanic.access)[INFO][127.0.0.1:64726]: GET http://127.0.0.1:3001/v1/home 200 3586
```
```json
{
"posts": [{
"id": "7028774404230217728",
"author": {
"id": "7026930989146308608",
"username": "MikeDEV",
"flags": 0,
"icon": {
"type": 0,
"data": 2
}
},
"content": "This is a test!",
"filtered_content": null,
"public_flags": 0,
"stats": {
"likes": 0,
"meows": 0,
"comments": 0
},
"time": 1675808406,
"delete_after": null
}, {
"id": "7028774404230217728",
"author": {
"id": "7026930989146308608",
"username": "MikeDEV",
"flags": 0,
"icon": {
"type": 0,
"data": 2
}
},
"content": "This is a test!",
"filtered_content": null,
"public_flags": 0,
"stats": {
"likes": 0,
"meows": 0,
"comments": 0
},
"time": 1675808406,
"delete_after": null
}, {
"id": "7028774404230217728",
"author": {
"id": "7026930989146308608",
"username": "MikeDEV",
"flags": 0,
"icon": {
"type": 0,
"data": 2
}
},
"content": "This is a test!",
"filtered_content": null,
"public_flags": 0,
"stats": {
"likes": 0,
"meows": 0,
"comments": 0
},
"time": 1675808406,
"delete_after": null
}, {
"id": "7028774404230217728",
"author": {
"id": "7026930989146308608",
"username": "MikeDEV",
"flags": 0,
"icon": {
"type": 0,
"data": 2
}
},
"content": "This is a test!",
"filtered_content": null,
"public_flags": 0,
"stats": {
"likes": 0,
"meows": 0,
"comments": 0
},
"time": 1675808406,
"delete_after": null
}, {
"id": "7028774404230217728",
"author": {
"id": "7026930989146308608",
"username": "MikeDEV",
"flags": 0,
"icon": {
"type": 0,
"data": 2
}
},
"content": "This is a test!",
"filtered_content": null,
"public_flags": 0,
"stats": {
"likes": 0,
"meows": 0,
"comments": 0
},
"time": 1675808406,
"delete_after": null
}, {
"id": "7028774404230217728",
"author": {
"id": "7026930989146308608",
"username": "MikeDEV",
"flags": 0,
"icon": {
"type": 0,
"data": 2
}
},
"content": "This is a test!",
"filtered_content": null,
"public_flags": 0,
"stats": {
"likes": 0,
"meows": 0,
"comments": 0
},
"time": 1675808406,
"delete_after": null
}, {
"id": "7028774404230217728",
"author": {
"id": "7026930989146308608",
"username": "MikeDEV",
"flags": 0,
"icon": {
"type": 0,
"data": 2
}
},
"content": "This is a test!",
"filtered_content": null,
"public_flags": 0,
"stats": {
"likes": 0,
"meows": 0,
"comments": 0
},
"time": 1675808406,
"delete_after": null
}, {
"id": "7028774404230217728",
"author": {
"id": "7026930989146308608",
"username": "MikeDEV",
"flags": 0,
"icon": {
"type": 0,
"data": 2
}
},
"content": "This is a test!",
"filtered_content": null,
"public_flags": 0,
"stats": {
"likes": 0,
"meows": 0,
"comments": 0
},
"time": 1675808406,
"delete_after": null
}, {
"id": "7028774404230217728",
"author": {
"id": "7026930989146308608",
"username": "MikeDEV",
"flags": 0,
"icon": {
"type": 0,
"data": 2
}
},
"content": "This is a test!",
"filtered_content": null,
"public_flags": 0,
"stats": {
"likes": 0,
"meows": 0,
"comments": 0
},
"time": 1675808406,
"delete_after": null
}, {
"id": "7028774404230217728",
"author": {
"id": "7026930989146308608",
"username": "MikeDEV",
"flags": 0,
"icon": {
"type": 0,
"data": 2
}
},
"content": "This is a test!",
"filtered_content": null,
"public_flags": 0,
"stats": {
"likes": 0,
"meows": 0,
"comments": 0
},
"time": 1675808406,
"delete_after": null
}, {
"id": "7028774404230217728",
"author": {
"id": "7026930989146308608",
"username": "MikeDEV",
"flags": 0,
"icon": {
"type": 0,
"data": 2
}
},
"content": "This is a test!",
"filtered_content": null,
"public_flags": 0,
"stats": {
"likes": 0,
"meows": 0,
"comments": 0
},
"time": 1675808406,
"delete_after": null
}, {
"id": "7028774404230217728",
"author": {
"id": "7026930989146308608",
"username": "MikeDEV",
"flags": 0,
"icon": {
"type": 0,
"data": 2
}
},
"content": "This is a test!",
"filtered_content": null,
"public_flags": 0,
"stats": {
"likes": 0,
"meows": 0,
"comments": 0
},
"time": 1675808406,
"delete_after": null
}, {
"id": "7028774404230217728",
"author": {
"id": "7026930989146308608",
"username": "MikeDEV",
"flags": 0,
"icon": {
"type": 0,
"data": 2
}
},
"content": "This is a test!",
"filtered_content": null,
"public_flags": 0,
"stats": {
"likes": 0,
"meows": 0,
"comments": 0
},
"time": 1675808406,
"delete_after": null
}]
}
```
|
1.0
|
[cl4] Duplicate home posts - **Describe the bug**
With a fresh install of a server with only one post in the homepage, the post is duplicated when accessing /v1/home.
**To Reproduce**
Steps to reproduce the behavior:
1. Authenticate with API: POST /v1/auth/password
2. POST /v1/home with a new post
3. GET /v1/home
**Expected behavior**
A single instance of a post ID should appear.
**Logs**
```
[2023-02-07 15:56:42 -0500] [14548] [DEBUG] Process ack: Sanic-Server-0-0 [14548]
[2023-02-07 15:56:42 -0500] [14548] [INFO] Starting worker [14548]
[2023-02-07 15:56:44 -0500] - (sanic.access)[INFO][127.0.0.1:64726]: GET http://127.0.0.1:3001/v1/home 200 3586
```
```json
{
"posts": [{
"id": "7028774404230217728",
"author": {
"id": "7026930989146308608",
"username": "MikeDEV",
"flags": 0,
"icon": {
"type": 0,
"data": 2
}
},
"content": "This is a test!",
"filtered_content": null,
"public_flags": 0,
"stats": {
"likes": 0,
"meows": 0,
"comments": 0
},
"time": 1675808406,
"delete_after": null
}, {
"id": "7028774404230217728",
"author": {
"id": "7026930989146308608",
"username": "MikeDEV",
"flags": 0,
"icon": {
"type": 0,
"data": 2
}
},
"content": "This is a test!",
"filtered_content": null,
"public_flags": 0,
"stats": {
"likes": 0,
"meows": 0,
"comments": 0
},
"time": 1675808406,
"delete_after": null
}, {
"id": "7028774404230217728",
"author": {
"id": "7026930989146308608",
"username": "MikeDEV",
"flags": 0,
"icon": {
"type": 0,
"data": 2
}
},
"content": "This is a test!",
"filtered_content": null,
"public_flags": 0,
"stats": {
"likes": 0,
"meows": 0,
"comments": 0
},
"time": 1675808406,
"delete_after": null
}, {
"id": "7028774404230217728",
"author": {
"id": "7026930989146308608",
"username": "MikeDEV",
"flags": 0,
"icon": {
"type": 0,
"data": 2
}
},
"content": "This is a test!",
"filtered_content": null,
"public_flags": 0,
"stats": {
"likes": 0,
"meows": 0,
"comments": 0
},
"time": 1675808406,
"delete_after": null
}, {
"id": "7028774404230217728",
"author": {
"id": "7026930989146308608",
"username": "MikeDEV",
"flags": 0,
"icon": {
"type": 0,
"data": 2
}
},
"content": "This is a test!",
"filtered_content": null,
"public_flags": 0,
"stats": {
"likes": 0,
"meows": 0,
"comments": 0
},
"time": 1675808406,
"delete_after": null
}, {
"id": "7028774404230217728",
"author": {
"id": "7026930989146308608",
"username": "MikeDEV",
"flags": 0,
"icon": {
"type": 0,
"data": 2
}
},
"content": "This is a test!",
"filtered_content": null,
"public_flags": 0,
"stats": {
"likes": 0,
"meows": 0,
"comments": 0
},
"time": 1675808406,
"delete_after": null
}, {
"id": "7028774404230217728",
"author": {
"id": "7026930989146308608",
"username": "MikeDEV",
"flags": 0,
"icon": {
"type": 0,
"data": 2
}
},
"content": "This is a test!",
"filtered_content": null,
"public_flags": 0,
"stats": {
"likes": 0,
"meows": 0,
"comments": 0
},
"time": 1675808406,
"delete_after": null
}, {
"id": "7028774404230217728",
"author": {
"id": "7026930989146308608",
"username": "MikeDEV",
"flags": 0,
"icon": {
"type": 0,
"data": 2
}
},
"content": "This is a test!",
"filtered_content": null,
"public_flags": 0,
"stats": {
"likes": 0,
"meows": 0,
"comments": 0
},
"time": 1675808406,
"delete_after": null
}, {
"id": "7028774404230217728",
"author": {
"id": "7026930989146308608",
"username": "MikeDEV",
"flags": 0,
"icon": {
"type": 0,
"data": 2
}
},
"content": "This is a test!",
"filtered_content": null,
"public_flags": 0,
"stats": {
"likes": 0,
"meows": 0,
"comments": 0
},
"time": 1675808406,
"delete_after": null
}, {
"id": "7028774404230217728",
"author": {
"id": "7026930989146308608",
"username": "MikeDEV",
"flags": 0,
"icon": {
"type": 0,
"data": 2
}
},
"content": "This is a test!",
"filtered_content": null,
"public_flags": 0,
"stats": {
"likes": 0,
"meows": 0,
"comments": 0
},
"time": 1675808406,
"delete_after": null
}, {
"id": "7028774404230217728",
"author": {
"id": "7026930989146308608",
"username": "MikeDEV",
"flags": 0,
"icon": {
"type": 0,
"data": 2
}
},
"content": "This is a test!",
"filtered_content": null,
"public_flags": 0,
"stats": {
"likes": 0,
"meows": 0,
"comments": 0
},
"time": 1675808406,
"delete_after": null
}, {
"id": "7028774404230217728",
"author": {
"id": "7026930989146308608",
"username": "MikeDEV",
"flags": 0,
"icon": {
"type": 0,
"data": 2
}
},
"content": "This is a test!",
"filtered_content": null,
"public_flags": 0,
"stats": {
"likes": 0,
"meows": 0,
"comments": 0
},
"time": 1675808406,
"delete_after": null
}, {
"id": "7028774404230217728",
"author": {
"id": "7026930989146308608",
"username": "MikeDEV",
"flags": 0,
"icon": {
"type": 0,
"data": 2
}
},
"content": "This is a test!",
"filtered_content": null,
"public_flags": 0,
"stats": {
"likes": 0,
"meows": 0,
"comments": 0
},
"time": 1675808406,
"delete_after": null
}]
}
```
|
non_process
|
duplicate home posts describe the bug with a fresh install of a server with only one post in the homepage the post is duplicated when accessing home to reproduce steps to reproduce the behavior authenticate with api post auth password post home with a new post get home expected behavior a single instance of a post id should appear logs process ack sanic server starting worker sanic access get json posts id author id username mikedev flags icon type data content this is a test filtered content null public flags stats likes meows comments time delete after null id author id username mikedev flags icon type data content this is a test filtered content null public flags stats likes meows comments time delete after null id author id username mikedev flags icon type data content this is a test filtered content null public flags stats likes meows comments time delete after null id author id username mikedev flags icon type data content this is a test filtered content null public flags stats likes meows comments time delete after null id author id username mikedev flags icon type data content this is a test filtered content null public flags stats likes meows comments time delete after null id author id username mikedev flags icon type data content this is a test filtered content null public flags stats likes meows comments time delete after null id author id username mikedev flags icon type data content this is a test filtered content null public flags stats likes meows comments time delete after null id author id username mikedev flags icon type data content this is a test filtered content null public flags stats likes meows comments time delete after null id author id username mikedev flags icon type data content this is a test filtered content null public flags stats likes meows comments time delete after null id author id username mikedev flags icon type data content this is a test filtered content null public flags stats likes meows comments time delete after null id author id username mikedev flags icon type data content this is a test filtered content null public flags stats likes meows comments time delete after null id author id username mikedev flags icon type data content this is a test filtered content null public flags stats likes meows comments time delete after null id author id username mikedev flags icon type data content this is a test filtered content null public flags stats likes meows comments time delete after null
| 0
|
274,730
| 20,865,140,397
|
IssuesEvent
|
2022-03-22 06:00:57
|
microsoft/FluidFramework
|
https://api.github.com/repos/microsoft/FluidFramework
|
closed
|
Add Links for `TinyliciousClientProps` and `AzureClientProps` in Telemetry Doc
|
documentation telemetry status: stale
|
## Work Item
Add the corresponding package links to `TinyliciousClientProps` and `AzureClientProps`.
Links will be available once #7472 is settled
Reference: https://github.com/microsoft/FluidFramework/pull/7424#discussion_r708445247
|
1.0
|
Add Links for `TinyliciousClientProps` and `AzureClientProps` in Telemetry Doc - ## Work Item
Add the corresponding package links to `TinyliciousClientProps` and `AzureClientProps`.
Links will be available once #7472 is settled
Reference: https://github.com/microsoft/FluidFramework/pull/7424#discussion_r708445247
|
non_process
|
add links for tinyliciousclientprops and azureclientprops in telemetry doc work item add the corresponding package links to tinyliciousclientprops and azureclientprops links will be available once is settled reference
| 0
|
20,818
| 27,578,675,784
|
IssuesEvent
|
2023-03-08 14:47:54
|
ukri-excalibur/excalibur-tests
|
https://api.github.com/repos/ukri-excalibur/excalibur-tests
|
opened
|
Read perflog data into pandas dataframe
|
UCL postprocessing
|
Inter-connected tasks needed:
- [ ] Get the [handlers_perflog.format](https://reframe-hpc.readthedocs.io/en/stable/config_reference.html#config.logging.handlers_perflog.format) in https://github.com/ukri-excalibur/excalibur-tests/blob/87dde005a6b581448a9003c8dcbac47107019957/reframe_config.py#L393 to
- [ ] follow the pattern of `name=value` for every field
- [ ] add all the needed variables: `check_variables` dictionary of environment variables, tags, either `check_info` or `display_name` - see https://github.com/ukri-excalibur/excalibur-tests/issues/70#issuecomment-1453558145 and be aware of ReFrame version.
- [ ] add info encoded in the perflog file path (system, partition) as fields inside the perflog as well
- [ ] write perflog parser to be as generic as possible, assuming `name=value` format of fields. Use [existing code](https://github.com/ukri-excalibur/excalibur-tests/blob/d745085064a0ad1bd351fdf1ff8229ab13b4cbe2/modules/perf_logs.py) where possible.
- [ ] input perflog files can be coming from various tests of one app, various systems, or even more than one app, depending on the usecase. This will modify the way the input path will be searched for files.
- [ ] Fishing out the parameters values from the `info` or `display_name` field will need its own parsing.
|
1.0
|
Read perflog data into pandas dataframe - Inter-connected tasks needed:
- [ ] Get the [handlers_perflog.format](https://reframe-hpc.readthedocs.io/en/stable/config_reference.html#config.logging.handlers_perflog.format) in https://github.com/ukri-excalibur/excalibur-tests/blob/87dde005a6b581448a9003c8dcbac47107019957/reframe_config.py#L393 to
- [ ] follow the pattern of `name=value` for every field
- [ ] add all the needed variables: `check_variables` dictionary of environment variables, tags, either `check_info` or `display_name` - see https://github.com/ukri-excalibur/excalibur-tests/issues/70#issuecomment-1453558145 and be aware of ReFrame version.
- [ ] add info encoded in the perflog file path (system, partition) as fields inside the perflog as well
- [ ] write perflog parser to be as generic as possible, assuming `name=value` format of fields. Use [existing code](https://github.com/ukri-excalibur/excalibur-tests/blob/d745085064a0ad1bd351fdf1ff8229ab13b4cbe2/modules/perf_logs.py) where possible.
- [ ] input perflog files can be coming from various tests of one app, various systems, or even more than one app, depending on the usecase. This will modify the way the input path will be searched for files.
- [ ] Fishing out the parameters values from the `info` or `display_name` field will need its own parsing.
|
process
|
read perflog data into pandas dataframe inter connected tasks needed get the in to follow the pattern of name value for every field add all the needed variables check variables dictionary of environment variables tags either check info or display name see and be aware of reframe version add info encoded in the perflog file path system partition as fields inside the perflog as well write perflog parser to be as generic as possible assuming name value format of fields use where possible input perflog files can be coming from various tests of one app various systems or even more than one app depending on the usecase this will modify the way the input path will be searched for files fishing out the parameters values from the info or display name field will need its own parsing
| 1
|
985
| 3,441,763,673
|
IssuesEvent
|
2015-12-14 19:47:37
|
dotnet/corefx
|
https://api.github.com/repos/dotnet/corefx
|
closed
|
Process.Start not working for HTTP links
|
System.Diagnostics.Process
|
On windows (with .NET Core) calling:
System.Diagnostics.Process.Start(SomeWebLinkHere);
Results in:
An exception of type 'System.ComponentModel.Win32Exception' occurred in System.Diagnostics.Process.dll but was not handled in user code.
Additional information: The system cannot find the file specified.
Works fine on the .NET Framework, is this intended?
|
1.0
|
Process.Start not working for HTTP links - On windows (with .NET Core) calling:
System.Diagnostics.Process.Start(SomeWebLinkHere);
Results in:
An exception of type 'System.ComponentModel.Win32Exception' occurred in System.Diagnostics.Process.dll but was not handled in user code.
Additional information: The system cannot find the file specified.
Works fine on the .NET Framework, is this intended?
|
process
|
process start not working for http links on windows with net core calling system diagnostics process start someweblinkhere results in an exception of type system componentmodel occurred in system diagnostics process dll but was not handled in user code additional information the system cannot find the file specified works fine on the net framework is this intended
| 1
|
4,702
| 7,543,410,053
|
IssuesEvent
|
2018-04-17 15:26:45
|
UnbFeelings/unb-feelings-docs
|
https://api.github.com/repos/UnbFeelings/unb-feelings-docs
|
opened
|
[Não conformidade] Não realização da coleta de métricas
|
Desenvolvimento Processo Qualidade invalid
|
@UnbFeelings/devel
Perante critérios definidos para as [Auditorias](https://github.com/UnbFeelings/unb-feelings-GQA/wiki/Crit%C3%A9rios-de-Avalia%C3%A7%C3%A3o-e-T%C3%A9cnicas-de-Auditoria) fora auditada a [Medição e Análise](https://github.com/UnbFeelings/unb-feelings-GQA/wiki/Auditoria-da-Medi%C3%A7%C3%A3o-e-An%C3%A1lise-do-Ciclo-1). Como pode ser visto na [descrição das Não Conformidades](https://github.com/UnbFeelings/unb-feelings-GQA/wiki/Auditoria-da-Medi%C3%A7%C3%A3o-e-An%C3%A1lise-do-Ciclo-1#4-n%C3%A3o-conformidades) foi encontrada a não conformidade aqui descrita.
### Descrição
Nenhuma métrica foi coletada e o time de desenvolvimento não está alinhado em relação ao plano de medição.
#### Recomendações
É recomendável que o time de processo se reúna com o time de desenvolvimento para discutir a importância da coleta de métricas e como fazê-la.
Com base na Política de Não Conformidades utilizando a matriz GUT, obteve-se uma pontuação de 48 pontos, o que se encaixa em problema mediano, assim o prazo para resolução da Não conformidade é de 3 dias
#### Detalhes
**Auditor**: Fabíola Fleury
**Técnica de Audição**: Entrevista Aberta e Fechada
**Tipo:** Desenvolvimento e Processo
**Prazo:** 20/04/2018
|
1.0
|
[Não conformidade] Não realização da coleta de métricas - @UnbFeelings/devel
Perante critérios definidos para as [Auditorias](https://github.com/UnbFeelings/unb-feelings-GQA/wiki/Crit%C3%A9rios-de-Avalia%C3%A7%C3%A3o-e-T%C3%A9cnicas-de-Auditoria) fora auditada a [Medição e Análise](https://github.com/UnbFeelings/unb-feelings-GQA/wiki/Auditoria-da-Medi%C3%A7%C3%A3o-e-An%C3%A1lise-do-Ciclo-1). Como pode ser visto na [descrição das Não Conformidades](https://github.com/UnbFeelings/unb-feelings-GQA/wiki/Auditoria-da-Medi%C3%A7%C3%A3o-e-An%C3%A1lise-do-Ciclo-1#4-n%C3%A3o-conformidades) foi encontrada a não conformidade aqui descrita.
### Descrição
Nenhuma métrica foi coletada e o time de desenvolvimento não está alinhado em relação ao plano de medição.
#### Recomendações
É recomendável que o time de processo se reúna com o time de desenvolvimento para discutir a importância da coleta de métricas e como fazê-la.
Com base na Política de Não Conformidades utilizando a matriz GUT, obteve-se uma pontuação de 48 pontos, o que se encaixa em problema mediano, assim o prazo para resolução da Não conformidade é de 3 dias
#### Detalhes
**Auditor**: Fabíola Fleury
**Técnica de Audição**: Entrevista Aberta e Fechada
**Tipo:** Desenvolvimento e Processo
**Prazo:** 20/04/2018
|
process
|
não realização da coleta de métricas unbfeelings devel perante critérios definidos para as fora auditada a como pode ser visto na foi encontrada a não conformidade aqui descrita descrição nenhuma métrica foi coletada e o time de desenvolvimento não está alinhado em relação ao plano de medição recomendações é recomendável que o time de processo se reúna com o time de desenvolvimento para discutir a importância da coleta de métricas e como fazê la com base na política de não conformidades utilizando a matriz gut obteve se uma pontuação de pontos o que se encaixa em problema mediano assim o prazo para resolução da não conformidade é de dias detalhes auditor fabíola fleury técnica de audição entrevista aberta e fechada tipo desenvolvimento e processo prazo
| 1
|
12,204
| 14,742,632,054
|
IssuesEvent
|
2021-01-07 12:38:01
|
kdjstudios/SABillingGitlab
|
https://api.github.com/repos/kdjstudios/SABillingGitlab
|
closed
|
Missing Accounts
|
anc-process anp-urgent ant-bug
|
In GitLab by @pchaudhary on May 13, 2019, 07:24
**Submitted by:** "Gaylan Garrett" <Gaylan.Garrett@Nexa.com>
**HD:** http://www.servicedesk.answernet.com/profiles/ticket/2019-05-13-81538/conversation
**Server:** External
**Client/Site:** KNR
**Account:** KNR-3501 - Superior Plumbing, Heating & Air, In
KNR-3502 - Actikare, Inc
KNR-3503 - Steadfast Properties, LLC (HV)
KNR-3505 - Rent The Help
**Issue:**
It seems quite a few accounts are totally missing from SA billing. Examples I have found so far this morning are:
KNR-3501 - Superior Plumbing, Heating & Air, In
KNR-3502 - Actikare, Inc
KNR-3503 - Steadfast Properties, LLC (HV)
KNR-3505 - Rent The Help
Their account numbers are now 10 digits but I could not find them by name either. There may be more, I am just getting started this morning but wanted to get this out right away.
|
1.0
|
Missing Accounts - In GitLab by @pchaudhary on May 13, 2019, 07:24
**Submitted by:** "Gaylan Garrett" <Gaylan.Garrett@Nexa.com>
**HD:** http://www.servicedesk.answernet.com/profiles/ticket/2019-05-13-81538/conversation
**Server:** External
**Client/Site:** KNR
**Account:** KNR-3501 - Superior Plumbing, Heating & Air, In
KNR-3502 - Actikare, Inc
KNR-3503 - Steadfast Properties, LLC (HV)
KNR-3505 - Rent The Help
**Issue:**
It seems quite a few accounts are totally missing from SA billing. Examples I have found so far this morning are:
KNR-3501 - Superior Plumbing, Heating & Air, In
KNR-3502 - Actikare, Inc
KNR-3503 - Steadfast Properties, LLC (HV)
KNR-3505 - Rent The Help
Their account numbers are now 10 digits but I could not find them by name either. There may be more, I am just getting started this morning but wanted to get this out right away.
|
process
|
missing accounts in gitlab by pchaudhary on may submitted by gaylan garrett hd server external client site knr account knr superior plumbing heating air in knr actikare inc knr steadfast properties llc hv knr rent the help issue it seems quite a few accounts are totally missing from sa billing examples i have found so far this morning are knr superior plumbing heating air in knr actikare inc knr steadfast properties llc hv knr rent the help their account numbers are now digits but i could not find them by name either there may be more i am just getting started this morning but wanted to get this out right away
| 1
|
2,766
| 8,306,973,822
|
IssuesEvent
|
2018-09-23 01:57:56
|
MovingBlocks/Terasology
|
https://api.github.com/repos/MovingBlocks/Terasology
|
opened
|
Recover and finish parked serialization / type handling overhaul
|
Api Architecture Epic
|
See #3489 for background information. @eviltak is behind the parked code related to this and may be interested in continuing with it, but is low on time as of this writing. If anybody else is hugely interested in serialization and type handling feel free to look at this but be warned that it is a large piece of work and a complex topic :-)
Goal for this issue is to work with branch https://github.com/MovingBlocks/Terasology/tree/newSerialization to finish stabilizing it so it can be re-merged
As part of that work several modules need syntax changes to compile again, as was the case when originally merging #3449. The same modules and likely more beyond that will need changes after #3456 is considered as well, both for compile fixes (maybe just the same set of modules as the first PR) and runtime issues encountered in testing with the second PR (unsure if all those would go away purely with additional engine changes)
Affected modules by round one and their initial syntax fix commits (reverted as part of preparing this issue):
* https://github.com/Terasology/MasterOfOreon/commit/17a008952995a354e93f3cc2c71ed14ae8b7adec
* https://github.com/Terasology/DynamicCities/commit/bce035463cf156fc7208257f7e4adcb8bbabbf5e
* https://github.com/Terasology/Dialogs/commit/3b5804ebf557060c24a53a79760a3d3e80cf1064
* https://github.com/Terasology/Tasks/commit/b1576fb1011f8fd7886ff7a96007ce85d2895e52
* https://github.com/Terasology/TutorialDynamicCities/commit/e5766b8374f53e0d0b9a600f53aa7176f94876d1
* https://github.com/Terasology/LightAndShadow/commit/fa33f332569990f1c6cab306fe52635b402b06a5
To be clear those commits just let those modules compile after the *first* round of serialization changes. It may be pointless to re-apply those changes exactly, as the underlying changes from round two may require entirely different fixes.
See the PRs linked in #3489 for more research and discovered issues.
As a potential follow-up to this the new Record & Replay system is awaiting some of these changes to better improve its own usage of serialized events and such. @iaronaraujo would be the primary contributor involved in that effort.
|
1.0
|
Recover and finish parked serialization / type handling overhaul - See #3489 for background information. @eviltak is behind the parked code related to this and may be interested in continuing with it, but is low on time as of this writing. If anybody else is hugely interested in serialization and type handling feel free to look at this but be warned that it is a large piece of work and a complex topic :-)
Goal for this issue is to work with branch https://github.com/MovingBlocks/Terasology/tree/newSerialization to finish stabilizing it so it can be re-merged
As part of that work several modules need syntax changes to compile again, as was the case when originally merging #3449. The same modules and likely more beyond that will need changes after #3456 is considered as well, both for compile fixes (maybe just the same set of modules as the first PR) and runtime issues encountered in testing with the second PR (unsure if all those would go away purely with additional engine changes)
Affected modules by round one and their initial syntax fix commits (reverted as part of preparing this issue):
* https://github.com/Terasology/MasterOfOreon/commit/17a008952995a354e93f3cc2c71ed14ae8b7adec
* https://github.com/Terasology/DynamicCities/commit/bce035463cf156fc7208257f7e4adcb8bbabbf5e
* https://github.com/Terasology/Dialogs/commit/3b5804ebf557060c24a53a79760a3d3e80cf1064
* https://github.com/Terasology/Tasks/commit/b1576fb1011f8fd7886ff7a96007ce85d2895e52
* https://github.com/Terasology/TutorialDynamicCities/commit/e5766b8374f53e0d0b9a600f53aa7176f94876d1
* https://github.com/Terasology/LightAndShadow/commit/fa33f332569990f1c6cab306fe52635b402b06a5
To be clear those commits just let those modules compile after the *first* round of serialization changes. It may be pointless to re-apply those changes exactly, as the underlying changes from round two may require entirely different fixes.
See the PRs linked in #3489 for more research and discovered issues.
As a potential follow-up to this the new Record & Replay system is awaiting some of these changes to better improve its own usage of serialized events and such. @iaronaraujo would be the primary contributor involved in that effort.
|
non_process
|
recover and finish parked serialization type handling overhaul see for background information eviltak is behind the parked code related to this and may be interested in continuing with it but is low on time as of this writing if anybody else is hugely interested in serialization and type handling feel free to look at this but be warned that it is a large piece of work and a complex topic goal for this issue is to work with branch to finish stabilizing it so it can be re merged as part of that work several modules need syntax changes to compile again as was the case when originally merging the same modules and likely more beyond that will need changes after is considered as well both for compile fixes maybe just the same set of modules as the first pr and runtime issues encountered in testing with the second pr unsure if all those would go away purely with additional engine changes affected modules by round one and their initial syntax fix commits reverted as part of preparing this issue to be clear those commits just let those modules compile after the first round of serialization changes it may be pointless to re apply those changes exactly as the underlying changes from round two may require entirely different fixes see the prs linked in for more research and discovered issues as a potential follow up to this the new record replay system is awaiting some of these changes to better improve its own usage of serialized events and such iaronaraujo would be the primary contributor involved in that effort
| 0
|
3,198
| 6,262,107,121
|
IssuesEvent
|
2017-07-15 07:00:44
|
nodejs/node
|
https://api.github.com/repos/nodejs/node
|
closed
|
Output data lost from spawned process if process ends before all data read
|
child_process question
|
<!--
Thank you for reporting an issue.
This issue tracker is for bugs and issues found within Node.js core.
If you require more general support please file an issue on our help
repo. https://github.com/nodejs/help
Please fill in as much of the template below as you're able.
Version: output of `node -v`
Platform: output of `uname -a` (UNIX), or version and 32 or 64-bit (Windows)
Subsystem: if known, please specify affected core module name
If possible, please provide code that demonstrates the problem, keeping it as
simple and free of external dependencies as you are able.
-->
* **Version**: v6.9.1
* **Platform**: Linux bld-1604 4.3.0 BrandZ virtual linux x86_64 x86_64 x86_64 GNU/Linux (Ubuntu 16.04)
* **Subsystem**: child_process
<!-- Enter your issue details below this comment. -->
We have a gulp build step that converts a protobuf file to JSON using a tool 'pbjs'. We are seeing a truncated output file. This problem only appears on Linux, Windows builds aren't affected.
The conversion is called thus:
```js
gulp.task('create-proto-json', ['prepare-proto-sources'], function () {
var pbjs_cmd = path.resolve('node_modules/protobufjs/bin/pbjs');
return gulp.src(targetFolder + 'proto-xxx.proto')
.pipe(exec('node "' + pbjs_cmd + '" ' + path.normalize(targetFolder + 'proto-xxx.proto') + ' --target=json', { pipeStdout: true, maxBuffer: 1000000 }))
.pipe(rename({extname: '.json'}))
.pipe(gulp.dest(targetFolder));
});
```
The conversion generates over 800K of output, but only 219K are written to the output file. Running strace on the command give some hint:
```
[pid 2112] 0.002595 write(1, "{\n \"package\": \"com.xxx\",\n "..., 816819 <unfinished ...>
[pid 2095] 0.000396 <... epoll_wait resumed> [{EPOLLIN, {u32=14, u64=14}}], 1024, -1) = 1
[pid 2112] 0.000043 <... write resumed> ) = 219264
[pid 2095] 0.000090 read(14, <unfinished ...>
[pid 2112] 0.000123 write(2, " \n", 2 <unfinished ...>
```
Changing exec to spawn and logging the stdout reads showed 3 64K reads and one 22K read before the stream closed:
```js
var json = spawn(pbjs_cmd, [path.normalize(targetFolder + 'proto-xxx.proto'), '--target=json']);
json.stdout.on('data', function(data) { file.write(data); console.log(data.length);});
json.stdout.on('end', function(data) { file.end(); console.log('End'); });
json.stderr.on('data', function(data) { console.log(data.toString()); });
json.on('exit', function(code) { console.log('Exit: ' + code ); });
```
```
Processing: app/generated/proto/proto-xxx.proto ...
65536
65536
65536
22656
pbjs OK Converted 1 source files to json (816819 bytes, 125 ms)
End
Exit: 0
```
If I simply use `child_process.spawnSync` and use the pbjs file output option rather than reading from stdin, the correct output is produced.
|
1.0
|
Output data lost from spawned process if process ends before all data read - <!--
Thank you for reporting an issue.
This issue tracker is for bugs and issues found within Node.js core.
If you require more general support please file an issue on our help
repo. https://github.com/nodejs/help
Please fill in as much of the template below as you're able.
Version: output of `node -v`
Platform: output of `uname -a` (UNIX), or version and 32 or 64-bit (Windows)
Subsystem: if known, please specify affected core module name
If possible, please provide code that demonstrates the problem, keeping it as
simple and free of external dependencies as you are able.
-->
* **Version**: v6.9.1
* **Platform**: Linux bld-1604 4.3.0 BrandZ virtual linux x86_64 x86_64 x86_64 GNU/Linux (Ubuntu 16.04)
* **Subsystem**: child_process
<!-- Enter your issue details below this comment. -->
We have a gulp build step that converts a protobuf file to JSON using a tool 'pbjs'. We are seeing a truncated output file. This problem only appears on Linux, Windows builds aren't affected.
The conversion is called thus:
```js
gulp.task('create-proto-json', ['prepare-proto-sources'], function () {
var pbjs_cmd = path.resolve('node_modules/protobufjs/bin/pbjs');
return gulp.src(targetFolder + 'proto-xxx.proto')
.pipe(exec('node "' + pbjs_cmd + '" ' + path.normalize(targetFolder + 'proto-xxx.proto') + ' --target=json', { pipeStdout: true, maxBuffer: 1000000 }))
.pipe(rename({extname: '.json'}))
.pipe(gulp.dest(targetFolder));
});
```
The conversion generates over 800K of output, but only 219K are written to the output file. Running strace on the command give some hint:
```
[pid 2112] 0.002595 write(1, "{\n \"package\": \"com.xxx\",\n "..., 816819 <unfinished ...>
[pid 2095] 0.000396 <... epoll_wait resumed> [{EPOLLIN, {u32=14, u64=14}}], 1024, -1) = 1
[pid 2112] 0.000043 <... write resumed> ) = 219264
[pid 2095] 0.000090 read(14, <unfinished ...>
[pid 2112] 0.000123 write(2, " \n", 2 <unfinished ...>
```
Changing exec to spawn and logging the stdout reads showed 3 64K reads and one 22K read before the stream closed:
```js
var json = spawn(pbjs_cmd, [path.normalize(targetFolder + 'proto-xxx.proto'), '--target=json']);
json.stdout.on('data', function(data) { file.write(data); console.log(data.length);});
json.stdout.on('end', function(data) { file.end(); console.log('End'); });
json.stderr.on('data', function(data) { console.log(data.toString()); });
json.on('exit', function(code) { console.log('Exit: ' + code ); });
```
```
Processing: app/generated/proto/proto-xxx.proto ...
65536
65536
65536
22656
pbjs OK Converted 1 source files to json (816819 bytes, 125 ms)
End
Exit: 0
```
If I simply use `child_process.spawnSync` and use the pbjs file output option rather than reading from stdin, the correct output is produced.
|
process
|
output data lost from spawned process if process ends before all data read thank you for reporting an issue this issue tracker is for bugs and issues found within node js core if you require more general support please file an issue on our help repo please fill in as much of the template below as you re able version output of node v platform output of uname a unix or version and or bit windows subsystem if known please specify affected core module name if possible please provide code that demonstrates the problem keeping it as simple and free of external dependencies as you are able version platform linux bld brandz virtual linux gnu linux ubuntu subsystem child process we have a gulp build step that converts a protobuf file to json using a tool pbjs we are seeing a truncated output file this problem only appears on linux windows builds aren t affected the conversion is called thus js gulp task create proto json function var pbjs cmd path resolve node modules protobufjs bin pbjs return gulp src targetfolder proto xxx proto pipe exec node pbjs cmd path normalize targetfolder proto xxx proto target json pipestdout true maxbuffer pipe rename extname json pipe gulp dest targetfolder the conversion generates over of output but only are written to the output file running strace on the command give some hint write n package com xxx n read write n changing exec to spawn and logging the stdout reads showed reads and one read before the stream closed js var json spawn pbjs cmd json stdout on data function data file write data console log data length json stdout on end function data file end console log end json stderr on data function data console log data tostring json on exit function code console log exit code processing app generated proto proto xxx proto pbjs ok converted source files to json bytes ms end exit if i simply use child process spawnsync and use the pbjs file output option rather than reading from stdin the correct output is produced
| 1
|
3,422
| 6,524,848,260
|
IssuesEvent
|
2017-08-29 14:04:43
|
DynareTeam/dynare
|
https://api.github.com/repos/DynareTeam/dynare
|
closed
|
do not automatically create *_set_auxiliary_variables.m on preprocessor run
|
enhancement preprocessor
|
As `*_set_auxiliary_variables.m` is sometimes empty, it does not make sense to create it on every run. Only create it if something will be written to it.
This change requires either a flag in `M_` to be tested every time the function is called in the code or to test `exist('./*set_auxiliary_variables.m') == 2` and take the appropriate action in the code.
|
1.0
|
do not automatically create *_set_auxiliary_variables.m on preprocessor run - As `*_set_auxiliary_variables.m` is sometimes empty, it does not make sense to create it on every run. Only create it if something will be written to it.
This change requires either a flag in `M_` to be tested every time the function is called in the code or to test `exist('./*set_auxiliary_variables.m') == 2` and take the appropriate action in the code.
|
process
|
do not automatically create set auxiliary variables m on preprocessor run as set auxiliary variables m is sometimes empty it does not make sense to create it on every run only create it if something will be written to it this change requires either a flag in m to be tested every time the function is called in the code or to test exist set auxiliary variables m and take the appropriate action in the code
| 1
|
703
| 3,198,655,173
|
IssuesEvent
|
2015-10-01 13:27:23
|
hammerlab/pileup.js
|
https://api.github.com/repos/hammerlab/pileup.js
|
closed
|
Update public demo
|
process
|
It's missing the new location & scale tracks and it has a few CSS issues:
http://www.hammerlab.org/pileup/
|
1.0
|
Update public demo - It's missing the new location & scale tracks and it has a few CSS issues:
http://www.hammerlab.org/pileup/
|
process
|
update public demo it s missing the new location scale tracks and it has a few css issues
| 1
|
71,274
| 13,637,698,339
|
IssuesEvent
|
2020-09-25 08:15:28
|
internelp/appgao-comment
|
https://api.github.com/repos/internelp/appgao-comment
|
opened
|
VS Code - 微软出品跨平台的免费代码编辑器 | 应用侠软件下载
|
/Programming/Visual-Studio-Code.html Gitalk
|
https://www.appgao.com/Programming/Visual-Studio-Code.html
VS Code 全称 Visual Studio Code,是微软推出的一款免费的、开源的、高性能的、跨平台的、轻量级的代码编辑器,号称微软最好的开源软件作品。VS Code 有一个扩展和主题市...
|
1.0
|
VS Code - 微软出品跨平台的免费代码编辑器 | 应用侠软件下载 - https://www.appgao.com/Programming/Visual-Studio-Code.html
VS Code 全称 Visual Studio Code,是微软推出的一款免费的、开源的、高性能的、跨平台的、轻量级的代码编辑器,号称微软最好的开源软件作品。VS Code 有一个扩展和主题市...
|
non_process
|
vs code 微软出品跨平台的免费代码编辑器 应用侠软件下载 vs code 全称 visual studio code,是微软推出的一款免费的、开源的、高性能的、跨平台的、轻量级的代码编辑器,号称微软最好的开源软件作品。vs code 有一个扩展和主题市
| 0
|
17,217
| 22,826,387,639
|
IssuesEvent
|
2022-07-12 08:57:40
|
camunda/zeebe-process-test
|
https://api.github.com/repos/camunda/zeebe-process-test
|
closed
|
Extend record logger with start instructions
|
kind/feature team/process-automation
|
**Description**
Log information about start instructions in record logging
|
1.0
|
Extend record logger with start instructions - **Description**
Log information about start instructions in record logging
|
process
|
extend record logger with start instructions description log information about start instructions in record logging
| 1
|
10,574
| 13,385,909,055
|
IssuesEvent
|
2020-09-02 14:04:11
|
DevExpress/testcafe-hammerhead
|
https://api.github.com/repos/DevExpress/testcafe-hammerhead
|
closed
|
An uncaught exception is thrown when using fetch, Request and a data: URL
|
AREA: client SYSTEM: client side processing TYPE: bug
|
<!--
If you have all reproduction steps with a complete sample app, please share as many details as possible in the sections below.
Make sure that you tried using the latest Hammerhead version (https://github.com/DevExpress/testcafe-hammerhead/releases), where this behavior might have been already addressed.
Before submitting an issue, please check existing issues in this repository (https://github.com/DevExpress/testcafe-hammerhead/issues) in case a similar issue exists or was already addressed. This may save your time (and ours).
-->
### What is your Scenario?
Test a page that uses Web Workers and `fetch` with a `data:` URL.
### What is the Current behavior?
An uncaught error is thrown:
```
worker-hammerhead.js:1 Uncaught (in promise) TypeError: Cannot read property 'destUrl' of null
at Function.t._sameOriginCheck (worker-hammerhead.js:1)
at r.fetch (worker-hammerhead.js:1)
```
### What is the Expected behavior?
Ther should be no error (as it was in TestCafe 1.9.1).
### Sample code
##### test.html
```html
<html>
<head>
<title></title>
</head>
<body id="hehe">
<input id="hehe" type="email"/>
<script>
new Worker('w.js');
</script>
</body>
</html>
```
##### w.js
```js
(async () => {
console.log(await fetch(new Request('data:text/javascript;base64,KGFzeW5jICgpID0+IHsKICAgIGNvbnNvbGUubG9nKGF3YWl0IGZldGNoKCdodHRwczovL2FwaS5naXRodWIuY29tL3VzZXJzL29jdG9jYXQnKSkKfSkoKQ==')))
})()
```
### Steps to Reproduce:
<!-- Describe what we should do to reproduce the behavior you encountered. -->
1. Go to: ...
2. Execute this command: ...
3. See the error: ...
### Your Environment details:
* node.js version: 12.0.0
* browser name and version: Chrome 84
* platform and version: Windows 10
* other: <!-- any notes you consider important -->
|
1.0
|
An uncaught exception is thrown when using fetch, Request and a data: URL - <!--
If you have all reproduction steps with a complete sample app, please share as many details as possible in the sections below.
Make sure that you tried using the latest Hammerhead version (https://github.com/DevExpress/testcafe-hammerhead/releases), where this behavior might have been already addressed.
Before submitting an issue, please check existing issues in this repository (https://github.com/DevExpress/testcafe-hammerhead/issues) in case a similar issue exists or was already addressed. This may save your time (and ours).
-->
### What is your Scenario?
Test a page that uses Web Workers and `fetch` with a `data:` URL.
### What is the Current behavior?
An uncaught error is thrown:
```
worker-hammerhead.js:1 Uncaught (in promise) TypeError: Cannot read property 'destUrl' of null
at Function.t._sameOriginCheck (worker-hammerhead.js:1)
at r.fetch (worker-hammerhead.js:1)
```
### What is the Expected behavior?
Ther should be no error (as it was in TestCafe 1.9.1).
### Sample code
##### test.html
```html
<html>
<head>
<title></title>
</head>
<body id="hehe">
<input id="hehe" type="email"/>
<script>
new Worker('w.js');
</script>
</body>
</html>
```
##### w.js
```js
(async () => {
console.log(await fetch(new Request('data:text/javascript;base64,KGFzeW5jICgpID0+IHsKICAgIGNvbnNvbGUubG9nKGF3YWl0IGZldGNoKCdodHRwczovL2FwaS5naXRodWIuY29tL3VzZXJzL29jdG9jYXQnKSkKfSkoKQ==')))
})()
```
### Steps to Reproduce:
<!-- Describe what we should do to reproduce the behavior you encountered. -->
1. Go to: ...
2. Execute this command: ...
3. See the error: ...
### Your Environment details:
* node.js version: 12.0.0
* browser name and version: Chrome 84
* platform and version: Windows 10
* other: <!-- any notes you consider important -->
|
process
|
an uncaught exception is thrown when using fetch request and a data url if you have all reproduction steps with a complete sample app please share as many details as possible in the sections below make sure that you tried using the latest hammerhead version where this behavior might have been already addressed before submitting an issue please check existing issues in this repository in case a similar issue exists or was already addressed this may save your time and ours what is your scenario test a page that uses web workers and fetch with a data url what is the current behavior an uncaught error is thrown worker hammerhead js uncaught in promise typeerror cannot read property desturl of null at function t sameorigincheck worker hammerhead js at r fetch worker hammerhead js what is the expected behavior ther should be no error as it was in testcafe sample code test html html new worker w js w js js async console log await fetch new request data text javascript steps to reproduce go to execute this command see the error your environment details node js version browser name and version chrome platform and version windows other
| 1
|
183,613
| 14,947,216,814
|
IssuesEvent
|
2021-01-26 08:18:17
|
YumiZhan/OSL
|
https://api.github.com/repos/YumiZhan/OSL
|
reopened
|
说明书和一些增强功能(休停一段时间,记一下下期进度)
|
documentation
|
- [x] 在readme.md写上代码说明书
- [x] 支持从超出原矩阵的行lct_i和列lct_j插入(insert)
- [x] 重载的复制构造函数:生成原矩阵的子阵或以其为基础扩充的矩阵
|
1.0
|
说明书和一些增强功能(休停一段时间,记一下下期进度) - - [x] 在readme.md写上代码说明书
- [x] 支持从超出原矩阵的行lct_i和列lct_j插入(insert)
- [x] 重载的复制构造函数:生成原矩阵的子阵或以其为基础扩充的矩阵
|
non_process
|
说明书和一些增强功能(休停一段时间,记一下下期进度) 在readme md写上代码说明书 支持从超出原矩阵的行lct i和列lct j插入 insert 重载的复制构造函数:生成原矩阵的子阵或以其为基础扩充的矩阵
| 0
|
345
| 2,793,273,428
|
IssuesEvent
|
2015-05-11 09:49:40
|
ecodistrict/IDSSDashboard
|
https://api.github.com/repos/ecodistrict/IDSSDashboard
|
closed
|
Show the initial benchmarks
|
enhancement form feedback 09102014 process step: assess alternatives
|
Show the initial benchmarks because they are not always 100% exact, sometimes it is a reference value marked by a stakeholder that is not fitting with the point of view of another stakeholder.
|
1.0
|
Show the initial benchmarks - Show the initial benchmarks because they are not always 100% exact, sometimes it is a reference value marked by a stakeholder that is not fitting with the point of view of another stakeholder.
|
process
|
show the initial benchmarks show the initial benchmarks because they are not always exact sometimes it is a reference value marked by a stakeholder that is not fitting with the point of view of another stakeholder
| 1
|
149,312
| 5,716,296,324
|
IssuesEvent
|
2017-04-19 14:52:42
|
OperationCode/operationcode_frontend
|
https://api.github.com/repos/OperationCode/operationcode_frontend
|
closed
|
Create social media react component
|
Priority: Medium Status: Available Type: Feature
|
Create a react component to display our social media icons.
This component should:
- [ ] Be configured by a hash
- [ ] Work on mobile and destop
- [ ] Use the icons in the screenshot below

|
1.0
|
Create social media react component - Create a react component to display our social media icons.
This component should:
- [ ] Be configured by a hash
- [ ] Work on mobile and destop
- [ ] Use the icons in the screenshot below

|
non_process
|
create social media react component create a react component to display our social media icons this component should be configured by a hash work on mobile and destop use the icons in the screenshot below
| 0
|
24,758
| 17,694,295,504
|
IssuesEvent
|
2021-08-24 13:45:32
|
gnosis/safe-ios
|
https://api.github.com/repos/gnosis/safe-ios
|
closed
|
Release 3.0.0
|
infrastructure
|
- [x] Create a release task in GitHub using the “New Release” template.
- [x] Create and push the release branch
```
git checkout main -b release/3.0.0
git push -u origin release/3.0.0
```
- [x] Marketing version is updated (3.0.0)
```
agvtool new-marketing-version 3.0.0
```
- [x] Notify QA
- [x] QA approved release candidate build
- [x] Product Owner approved submission
**AFTER PRODUCT OWNER APPROVAL**
- [x] Update screenshots in the App Store
- [x] Submit to the App Store Review with developer approval for distribution
- [x] Notify the team that release was submitted using the template below:
```
@here Hi everyone! We have submitted new iOS app v3.0.0 for review to the App Store.
```
- [x] Create a new release in GitHub with release notes. This will create a tag. The tag should be in a format v3.0.0
#### Download DSYMs manually
- [x] dSYMs are downloaded from AppStoreConnect and uploaded to Firebase Crashlytics.
```
# For the Multisig app (App Store version):
> ./bin/upload-symbols \
-gsp Multisig/Cross-layer/Analytics/Firebase/GoogleService-Info.Production.Mainnet.plist \
-p ios /path/to/dSYMs
```
#### Or download DSYMs with the script
- Install fastlane with `gem install fastlane --verbose`
- Set up the `fastlane` directory with configuraiton (ask team member to help). Do not commit the directory to the repository.
- Change the build version and build number in the `fastlane/upload_dsyms.sh` file
- Run the script `sh fastlane/upload_dsyms.sh`
#### Finally
- [x] Release the app when it is approved by the App Store Review team (do not release on Thu/Fri). Notify the team using the following template:
```
@here Hi everyone! We have released the iOS app v3.0.0 to the App Store and it will soon be available for download.
```
- [x] Merge the release branch to master branch via new pull-request
|
1.0
|
Release 3.0.0 - - [x] Create a release task in GitHub using the “New Release” template.
- [x] Create and push the release branch
```
git checkout main -b release/3.0.0
git push -u origin release/3.0.0
```
- [x] Marketing version is updated (3.0.0)
```
agvtool new-marketing-version 3.0.0
```
- [x] Notify QA
- [x] QA approved release candidate build
- [x] Product Owner approved submission
**AFTER PRODUCT OWNER APPROVAL**
- [x] Update screenshots in the App Store
- [x] Submit to the App Store Review with developer approval for distribution
- [x] Notify the team that release was submitted using the template below:
```
@here Hi everyone! We have submitted new iOS app v3.0.0 for review to the App Store.
```
- [x] Create a new release in GitHub with release notes. This will create a tag. The tag should be in a format v3.0.0
#### Download DSYMs manually
- [x] dSYMs are downloaded from AppStoreConnect and uploaded to Firebase Crashlytics.
```
# For the Multisig app (App Store version):
> ./bin/upload-symbols \
-gsp Multisig/Cross-layer/Analytics/Firebase/GoogleService-Info.Production.Mainnet.plist \
-p ios /path/to/dSYMs
```
#### Or download DSYMs with the script
- Install fastlane with `gem install fastlane --verbose`
- Set up the `fastlane` directory with configuraiton (ask team member to help). Do not commit the directory to the repository.
- Change the build version and build number in the `fastlane/upload_dsyms.sh` file
- Run the script `sh fastlane/upload_dsyms.sh`
#### Finally
- [x] Release the app when it is approved by the App Store Review team (do not release on Thu/Fri). Notify the team using the following template:
```
@here Hi everyone! We have released the iOS app v3.0.0 to the App Store and it will soon be available for download.
```
- [x] Merge the release branch to master branch via new pull-request
|
non_process
|
release create a release task in github using the “new release” template create and push the release branch git checkout main b release git push u origin release marketing version is updated agvtool new marketing version notify qa qa approved release candidate build product owner approved submission after product owner approval update screenshots in the app store submit to the app store review with developer approval for distribution notify the team that release was submitted using the template below here hi everyone we have submitted new ios app for review to the app store create a new release in github with release notes this will create a tag the tag should be in a format download dsyms manually dsyms are downloaded from appstoreconnect and uploaded to firebase crashlytics for the multisig app app store version bin upload symbols gsp multisig cross layer analytics firebase googleservice info production mainnet plist p ios path to dsyms or download dsyms with the script install fastlane with gem install fastlane verbose set up the fastlane directory with configuraiton ask team member to help do not commit the directory to the repository change the build version and build number in the fastlane upload dsyms sh file run the script sh fastlane upload dsyms sh finally release the app when it is approved by the app store review team do not release on thu fri notify the team using the following template here hi everyone we have released the ios app to the app store and it will soon be available for download merge the release branch to master branch via new pull request
| 0
|
74,355
| 3,438,640,771
|
IssuesEvent
|
2015-12-14 02:26:05
|
Treverr/inSparkle
|
https://api.github.com/repos/Treverr/inSparkle
|
opened
|
Vacation Time Requests
|
medium priority
|
Area of the app wanted for employees to request vacation time be added to their next paycheck.
|
1.0
|
Vacation Time Requests - Area of the app wanted for employees to request vacation time be added to their next paycheck.
|
non_process
|
vacation time requests area of the app wanted for employees to request vacation time be added to their next paycheck
| 0
|
17,552
| 23,367,058,043
|
IssuesEvent
|
2022-08-10 16:13:52
|
googleapis/python-storage
|
https://api.github.com/repos/googleapis/python-storage
|
closed
|
refactor flaky system tests
|
api: storage type: process priority: p2
|
From Kokoro failures [[1](https://source.cloud.google.com/results/invocations/7f28eb9b-b5cc-4d29-9e86-e081f44f140e/targets/cloud-devrel%2Fclient-libraries%2Fpython%2Fgoogleapis%2Fpython-storage%2Fpresubmit%2Fsystem-3.8/log), [2](https://source.cloud.google.com/results/invocations/54db59e4-152c-4250-baa4-c0047555b197/targets/cloud-devrel%2Fclient-libraries%2Fpython%2Fgoogleapis%2Fpython-storage%2Fpresubmit%2Fsystem-3.8/log), [3](https://source.cloud.google.com/results/invocations/6fffd615-73d1-4d6e-bedb-3f0fab6ce0ce/targets/cloud-devrel%2Fclient-libraries%2Fpython%2Fgoogleapis%2Fpython-storage%2Fcontinuous%2Fcontinuous/log), [4](https://source.cloud.google.com/results/invocations/60713d87-e622-4a3c-b5fc-f140f22394b2/targets/cloud-devrel%2Fclient-libraries%2Fpython%2Fgoogleapis%2Fpython-storage%2Fcontinuous%2Fcontinuous/log)]
- test_bucket_w_default_event_based_hold
- test_bucket_w_retention_period
- test_blob_compose_w_source_generation_match
ref: tracking in [internal doc](https://docs.google.com/document/d/1wVTnXzbJ21qCPFruC24U5LNxXbJPuSxmiy2LQcFERoo/edit?usp=sharing)
|
1.0
|
refactor flaky system tests - From Kokoro failures [[1](https://source.cloud.google.com/results/invocations/7f28eb9b-b5cc-4d29-9e86-e081f44f140e/targets/cloud-devrel%2Fclient-libraries%2Fpython%2Fgoogleapis%2Fpython-storage%2Fpresubmit%2Fsystem-3.8/log), [2](https://source.cloud.google.com/results/invocations/54db59e4-152c-4250-baa4-c0047555b197/targets/cloud-devrel%2Fclient-libraries%2Fpython%2Fgoogleapis%2Fpython-storage%2Fpresubmit%2Fsystem-3.8/log), [3](https://source.cloud.google.com/results/invocations/6fffd615-73d1-4d6e-bedb-3f0fab6ce0ce/targets/cloud-devrel%2Fclient-libraries%2Fpython%2Fgoogleapis%2Fpython-storage%2Fcontinuous%2Fcontinuous/log), [4](https://source.cloud.google.com/results/invocations/60713d87-e622-4a3c-b5fc-f140f22394b2/targets/cloud-devrel%2Fclient-libraries%2Fpython%2Fgoogleapis%2Fpython-storage%2Fcontinuous%2Fcontinuous/log)]
- test_bucket_w_default_event_based_hold
- test_bucket_w_retention_period
- test_blob_compose_w_source_generation_match
ref: tracking in [internal doc](https://docs.google.com/document/d/1wVTnXzbJ21qCPFruC24U5LNxXbJPuSxmiy2LQcFERoo/edit?usp=sharing)
|
process
|
refactor flaky system tests from kokoro failures test bucket w default event based hold test bucket w retention period test blob compose w source generation match ref tracking in
| 1
|
17,756
| 23,671,207,022
|
IssuesEvent
|
2022-08-27 11:34:57
|
googleapis/python-audit-log
|
https://api.github.com/repos/googleapis/python-audit-log
|
closed
|
Dependency Dashboard
|
type: process
|
This issue lists Renovate updates and detected dependencies. Read the [Dependency Dashboard](https://docs.renovatebot.com/key-concepts/dashboard/) docs to learn more.
This repository currently has no open or pending branches.
## Detected dependencies
<details><summary>dockerfile</summary>
<blockquote>
<details><summary>.kokoro/docker/docs/Dockerfile</summary>
- `ubuntu 22.04`
</details>
</blockquote>
</details>
<details><summary>pip_setup</summary>
<blockquote>
<details><summary>setup.py</summary>
- `protobuf >= 3.6.0, <5.0.0dev`
- `googleapis-common-protos >= 1.56.2, < 2.0dev`
</details>
</blockquote>
</details>
---
- [ ] <!-- manual job -->Check this box to trigger a request for Renovate to run again on this repository
|
1.0
|
Dependency Dashboard - This issue lists Renovate updates and detected dependencies. Read the [Dependency Dashboard](https://docs.renovatebot.com/key-concepts/dashboard/) docs to learn more.
This repository currently has no open or pending branches.
## Detected dependencies
<details><summary>dockerfile</summary>
<blockquote>
<details><summary>.kokoro/docker/docs/Dockerfile</summary>
- `ubuntu 22.04`
</details>
</blockquote>
</details>
<details><summary>pip_setup</summary>
<blockquote>
<details><summary>setup.py</summary>
- `protobuf >= 3.6.0, <5.0.0dev`
- `googleapis-common-protos >= 1.56.2, < 2.0dev`
</details>
</blockquote>
</details>
---
- [ ] <!-- manual job -->Check this box to trigger a request for Renovate to run again on this repository
|
process
|
dependency dashboard this issue lists renovate updates and detected dependencies read the docs to learn more this repository currently has no open or pending branches detected dependencies dockerfile kokoro docker docs dockerfile ubuntu pip setup setup py protobuf googleapis common protos check this box to trigger a request for renovate to run again on this repository
| 1
|
88,415
| 25,399,087,827
|
IssuesEvent
|
2022-11-22 10:43:49
|
homuler/MediaPipeUnityPlugin
|
https://api.github.com/repos/homuler/MediaPipeUnityPlugin
|
closed
|
Build fails on Windows
|
type:build/install
|
### Plugin Version or Commit ID
v.0.6.2
### Unity Version
2021.3.8f1
### Your Host OS
Windows 10 Home
### Target Platform
Windows Standalone
### [Windows Only] Visual Studio C++ and Windows SDK Version
Visual Studio C++:14.29.30133
Windows SDK Version:10.0.19041.0
### [Linux Only] GCC/G++ and GLIBC Version
_No response_
### [Android Only] Android Build Tools and NDK Version
_No response_
### [iOS Only] XCode Version
_No response_
### Command Sequences
set BAZEL_VS=C:\Program Files (x86)\Microsoft Visual Studio\2019\Community
set BAZEL_VC=C:\Program Files (x86)\Microsoft Visual Studio\2019\Community\VC
set BAZEL_VC_FULL_VERSION=14.29.30133
set BAZEL_WINSDK_FULL_VERSION=10.0.19041.0
virtualenv env
env\Scripts\activate
set PYTHON_BIN_PATH=C:\test\MediaPipeUnityPlugin-0.6.2\env\Scripts\python.exe
python build.py build --desktop cpu --include_opencv_libs -v
### Log
DEBUG: Rule 'rules_cc' indicated that a canonical reproducible form can be obtained by modifying arguments sha256 = "98525a5a98b72d17ae37c01f6f70b2c5d31a8d6582e65c44410da4c92b17a669"
DEBUG: Repository rules_cc instantiated at:
C:/test/mediapipeunityplugin-0.6.2/WORKSPACE:78:13: in <toplevel>
Repository rule http_archive defined at:
C:/_bzl/v7eeodun/external/bazel_tools/tools/build_defs/repo/http.bzl:355:31: in <toplevel>
ERROR: Traceback (most recent call last):
File "C:/_bzl/v7eeodun/external/build_bazel_rules_apple/apple/internal/ios_rules.bzl", line 529, column 58, in <toplevel>
ios_application = rule_factory.create_apple_bundling_rule(
File "C:/_bzl/v7eeodun/external/build_bazel_rules_apple/apple/internal/rule_factory.bzl", line 950, column 55, in _create_apple_bundling_rule
rule_attrs.append(_common_binary_linking_attrs(rule_descriptor))
File "C:/_bzl/v7eeodun/external/build_bazel_rules_apple/apple/internal/rule_factory.bzl", line 228, column 21, in _common_binary_linking_attrs
apple_common.objc_proto_aspect,
Error: 'apple_common' value has no field or method 'objc_proto_aspect'
ERROR: Analysis of target '//mediapipe_api:mediapipe_desktop' failed; build aborted: error loading package '@com_google_mediapipe//mediapipe/gpu': at C:/_bzl/v7eeodun/external/build_bazel_rules_apple/apple/ios.bzl:37:5: initialization of module 'apple/internal/ios_rules.bzl' failed
INFO: Elapsed time: 17.348s
INFO: 0 processes.
FAILED: Build did NOT complete successfully (2 packages loaded, 418 targets configured)
currently loading: @com_google_mediapipe//mediapipe/gpu
Traceback (most recent call last):
File "C:\test\MediaPipeUnityPlugin-0.6.2\build.py", line 392, in <module>
Argument().command().run()
File "C:\test\MediaPipeUnityPlugin-0.6.2\build.py", line 135, in run
self._run_command(self._build_desktop_commands())
File "C:\test\MediaPipeUnityPlugin-0.6.2\build.py", line 51, in _run_command
return subprocess.run(command_list, check=True)
File "C:\Users\40943\AppData\Local\Programs\Python\Python310\lib\subprocess.py", line 524, in run
raise CalledProcessError(retcode, process.args,
subprocess.CalledProcessError: Command '['bazel', '--output_user_root', 'C:/_bzl', 'build', '-c', 'opt', '--action_env', 'PYTHON_BIN_PATH=C://test//MediaPipeUnityPlugin-0.6.2//env//Scripts//python.exe', '--define', 'MEDIAPIPE_DISABLE_GPU=1', '--@opencv//:switch=local', '//mediapipe_api:mediapipe_desktop']' returned non-zero exit status 1.

### Additional Context
_No response_
|
1.0
|
Build fails on Windows - ### Plugin Version or Commit ID
v.0.6.2
### Unity Version
2021.3.8f1
### Your Host OS
Windows 10 Home
### Target Platform
Windows Standalone
### [Windows Only] Visual Studio C++ and Windows SDK Version
Visual Studio C++:14.29.30133
Windows SDK Version:10.0.19041.0
### [Linux Only] GCC/G++ and GLIBC Version
_No response_
### [Android Only] Android Build Tools and NDK Version
_No response_
### [iOS Only] XCode Version
_No response_
### Command Sequences
set BAZEL_VS=C:\Program Files (x86)\Microsoft Visual Studio\2019\Community
set BAZEL_VC=C:\Program Files (x86)\Microsoft Visual Studio\2019\Community\VC
set BAZEL_VC_FULL_VERSION=14.29.30133
set BAZEL_WINSDK_FULL_VERSION=10.0.19041.0
virtualenv env
env\Scripts\activate
set PYTHON_BIN_PATH=C:\test\MediaPipeUnityPlugin-0.6.2\env\Scripts\python.exe
python build.py build --desktop cpu --include_opencv_libs -v
### Log
DEBUG: Rule 'rules_cc' indicated that a canonical reproducible form can be obtained by modifying arguments sha256 = "98525a5a98b72d17ae37c01f6f70b2c5d31a8d6582e65c44410da4c92b17a669"
DEBUG: Repository rules_cc instantiated at:
C:/test/mediapipeunityplugin-0.6.2/WORKSPACE:78:13: in <toplevel>
Repository rule http_archive defined at:
C:/_bzl/v7eeodun/external/bazel_tools/tools/build_defs/repo/http.bzl:355:31: in <toplevel>
ERROR: Traceback (most recent call last):
File "C:/_bzl/v7eeodun/external/build_bazel_rules_apple/apple/internal/ios_rules.bzl", line 529, column 58, in <toplevel>
ios_application = rule_factory.create_apple_bundling_rule(
File "C:/_bzl/v7eeodun/external/build_bazel_rules_apple/apple/internal/rule_factory.bzl", line 950, column 55, in _create_apple_bundling_rule
rule_attrs.append(_common_binary_linking_attrs(rule_descriptor))
File "C:/_bzl/v7eeodun/external/build_bazel_rules_apple/apple/internal/rule_factory.bzl", line 228, column 21, in _common_binary_linking_attrs
apple_common.objc_proto_aspect,
Error: 'apple_common' value has no field or method 'objc_proto_aspect'
ERROR: Analysis of target '//mediapipe_api:mediapipe_desktop' failed; build aborted: error loading package '@com_google_mediapipe//mediapipe/gpu': at C:/_bzl/v7eeodun/external/build_bazel_rules_apple/apple/ios.bzl:37:5: initialization of module 'apple/internal/ios_rules.bzl' failed
INFO: Elapsed time: 17.348s
INFO: 0 processes.
FAILED: Build did NOT complete successfully (2 packages loaded, 418 targets configured)
currently loading: @com_google_mediapipe//mediapipe/gpu
Traceback (most recent call last):
File "C:\test\MediaPipeUnityPlugin-0.6.2\build.py", line 392, in <module>
Argument().command().run()
File "C:\test\MediaPipeUnityPlugin-0.6.2\build.py", line 135, in run
self._run_command(self._build_desktop_commands())
File "C:\test\MediaPipeUnityPlugin-0.6.2\build.py", line 51, in _run_command
return subprocess.run(command_list, check=True)
File "C:\Users\40943\AppData\Local\Programs\Python\Python310\lib\subprocess.py", line 524, in run
raise CalledProcessError(retcode, process.args,
subprocess.CalledProcessError: Command '['bazel', '--output_user_root', 'C:/_bzl', 'build', '-c', 'opt', '--action_env', 'PYTHON_BIN_PATH=C://test//MediaPipeUnityPlugin-0.6.2//env//Scripts//python.exe', '--define', 'MEDIAPIPE_DISABLE_GPU=1', '--@opencv//:switch=local', '//mediapipe_api:mediapipe_desktop']' returned non-zero exit status 1.

### Additional Context
_No response_
|
non_process
|
build fails on windows plugin version or commit id v unity version your host os windows home target platform windows standalone visual studio c and windows sdk version visual studio c windows sdk version gcc g and glibc version no response android build tools and ndk version no response xcode version no response command sequences set bazel vs c program files microsoft visual studio community set bazel vc c program files microsoft visual studio community vc set bazel vc full version set bazel winsdk full version virtualenv env env scripts activate set python bin path c test mediapipeunityplugin env scripts python exe python build py build desktop cpu include opencv libs v log debug rule rules cc indicated that a canonical reproducible form can be obtained by modifying arguments debug repository rules cc instantiated at c test mediapipeunityplugin workspace in repository rule http archive defined at c bzl external bazel tools tools build defs repo http bzl in error traceback most recent call last file c bzl external build bazel rules apple apple internal ios rules bzl line column in ios application rule factory create apple bundling rule file c bzl external build bazel rules apple apple internal rule factory bzl line column in create apple bundling rule rule attrs append common binary linking attrs rule descriptor file c bzl external build bazel rules apple apple internal rule factory bzl line column in common binary linking attrs apple common objc proto aspect error apple common value has no field or method objc proto aspect error analysis of target mediapipe api mediapipe desktop failed build aborted error loading package com google mediapipe mediapipe gpu at c bzl external build bazel rules apple apple ios bzl initialization of module apple internal ios rules bzl failed info elapsed time info processes failed build did not complete successfully packages loaded targets configured currently loading com google mediapipe mediapipe gpu traceback most recent call last file c test mediapipeunityplugin build py line in argument command run file c test mediapipeunityplugin build py line in run self run command self build desktop commands file c test mediapipeunityplugin build py line in run command return subprocess run command list check true file c users appdata local programs python lib subprocess py line in run raise calledprocesserror retcode process args subprocess calledprocesserror command returned non zero exit status additional context no response
| 0
|
2,769
| 5,704,883,878
|
IssuesEvent
|
2017-04-18 06:43:11
|
RITficialIntelligence/kaggle-cancer
|
https://api.github.com/repos/RITficialIntelligence/kaggle-cancer
|
closed
|
Implement NV classification
|
in progress preprocessing
|
Implement the model we decide to use for night vision classification
|
1.0
|
Implement NV classification - Implement the model we decide to use for night vision classification
|
process
|
implement nv classification implement the model we decide to use for night vision classification
| 1
|
147,983
| 19,526,242,259
|
IssuesEvent
|
2021-12-30 08:23:03
|
panasalap/linux-4.1.15
|
https://api.github.com/repos/panasalap/linux-4.1.15
|
opened
|
CVE-2018-10938 (Medium) detected in linux-stable-rtv4.1.33
|
security vulnerability
|
## CVE-2018-10938 - Medium Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>linux-stable-rtv4.1.33</b></p></summary>
<p>
<p>Julia Cartwright's fork of linux-stable-rt.git</p>
<p>Library home page: <a href=https://git.kernel.org/pub/scm/linux/kernel/git/julia/linux-stable-rt.git>https://git.kernel.org/pub/scm/linux/kernel/git/julia/linux-stable-rt.git</a></p>
<p>Found in HEAD commit: <a href="https://github.com/panasalap/linux-4.1.15/commit/9c15ec31637ff4ee4a4c14fb9b3264a31f75aa69">9c15ec31637ff4ee4a4c14fb9b3264a31f75aa69</a></p>
<p>Found in base branch: <b>master</b></p></p>
</details>
</p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Source Files (2)</summary>
<p></p>
<p>
<img src='https://s3.amazonaws.com/wss-public/bitbucketImages/xRedImage.png' width=19 height=20> <b>/net/ipv4/cipso_ipv4.c</b>
<img src='https://s3.amazonaws.com/wss-public/bitbucketImages/xRedImage.png' width=19 height=20> <b>/net/ipv4/cipso_ipv4.c</b>
</p>
</details>
<p></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
A flaw was found in the Linux kernel present since v4.0-rc1 and through v4.13-rc4. A crafted network packet sent remotely by an attacker may force the kernel to enter an infinite loop in the cipso_v4_optptr() function in net/ipv4/cipso_ipv4.c leading to a denial-of-service. A certain non-default configuration of LSM (Linux Security Module) and NetLabel should be set up on a system before an attacker could leverage this flaw.
<p>Publish Date: 2018-08-27
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2018-10938>CVE-2018-10938</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>5.9</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: High
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: None
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="http://www.securitytracker.com/id/1041569">http://www.securitytracker.com/id/1041569</a></p>
<p>Fix Resolution: The vendor has issued a source code fix, available at:
https://git.kernel.org/pub/scm/linux/kernel/git/torvalds/linux.git/commit/?id=40413955ee265a5e42f710940ec78f5450d49149</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
|
True
|
CVE-2018-10938 (Medium) detected in linux-stable-rtv4.1.33 - ## CVE-2018-10938 - Medium Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>linux-stable-rtv4.1.33</b></p></summary>
<p>
<p>Julia Cartwright's fork of linux-stable-rt.git</p>
<p>Library home page: <a href=https://git.kernel.org/pub/scm/linux/kernel/git/julia/linux-stable-rt.git>https://git.kernel.org/pub/scm/linux/kernel/git/julia/linux-stable-rt.git</a></p>
<p>Found in HEAD commit: <a href="https://github.com/panasalap/linux-4.1.15/commit/9c15ec31637ff4ee4a4c14fb9b3264a31f75aa69">9c15ec31637ff4ee4a4c14fb9b3264a31f75aa69</a></p>
<p>Found in base branch: <b>master</b></p></p>
</details>
</p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Source Files (2)</summary>
<p></p>
<p>
<img src='https://s3.amazonaws.com/wss-public/bitbucketImages/xRedImage.png' width=19 height=20> <b>/net/ipv4/cipso_ipv4.c</b>
<img src='https://s3.amazonaws.com/wss-public/bitbucketImages/xRedImage.png' width=19 height=20> <b>/net/ipv4/cipso_ipv4.c</b>
</p>
</details>
<p></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
A flaw was found in the Linux kernel present since v4.0-rc1 and through v4.13-rc4. A crafted network packet sent remotely by an attacker may force the kernel to enter an infinite loop in the cipso_v4_optptr() function in net/ipv4/cipso_ipv4.c leading to a denial-of-service. A certain non-default configuration of LSM (Linux Security Module) and NetLabel should be set up on a system before an attacker could leverage this flaw.
<p>Publish Date: 2018-08-27
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2018-10938>CVE-2018-10938</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>5.9</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: High
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: None
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="http://www.securitytracker.com/id/1041569">http://www.securitytracker.com/id/1041569</a></p>
<p>Fix Resolution: The vendor has issued a source code fix, available at:
https://git.kernel.org/pub/scm/linux/kernel/git/torvalds/linux.git/commit/?id=40413955ee265a5e42f710940ec78f5450d49149</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
|
non_process
|
cve medium detected in linux stable cve medium severity vulnerability vulnerable library linux stable julia cartwright s fork of linux stable rt git library home page a href found in head commit a href found in base branch master vulnerable source files net cipso c net cipso c vulnerability details a flaw was found in the linux kernel present since and through a crafted network packet sent remotely by an attacker may force the kernel to enter an infinite loop in the cipso optptr function in net cipso c leading to a denial of service a certain non default configuration of lsm linux security module and netlabel should be set up on a system before an attacker could leverage this flaw publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity high privileges required none user interaction none scope unchanged impact metrics confidentiality impact none integrity impact none availability impact high for more information on scores click a href suggested fix type upgrade version origin a href fix resolution the vendor has issued a source code fix available at step up your open source security game with whitesource
| 0
|
201,707
| 15,219,092,239
|
IssuesEvent
|
2021-02-17 18:43:21
|
brave/brave-browser
|
https://api.github.com/repos/brave/brave-browser
|
closed
|
`Customize sync` settings are not respected
|
OS/Desktop QA/Test-Plan-Specified QA/Yes bug feature/sync needs-discussion
|
<!-- Have you searched for similar issues? Before submitting this issue, please check the open issues and add a note before logging a new issue.
PLEASE USE THE TEMPLATE BELOW TO PROVIDE INFORMATION ABOUT THE ISSUE.
INSUFFICIENT INFO WILL GET THE ISSUE CLOSED. IT WILL ONLY BE REOPENED AFTER SUFFICIENT INFO IS PROVIDED-->
## Description
<!--Provide a brief description of the issue-->
`Customize sync` settings are not respected
## Steps to Reproduce
<!--Please add a series of steps to reproduce the issue-->
1. Clean install
2. Create a standalone sync using code words
3. Click `Manage your synced devices` and esnured standalone sync chain is created with one device
4. Ensured `Customize sync` is selected by default under `Sync Settings`
5. Select `Sync everything` from `Sync Settings`
6. Reload the page brave://settings/braveSync/setup
7. Select `Customize sync` and observe all the sync data settings are selected
## Actual result:
<!--Please add screenshots if needed-->
`Customize sync` settings are not respected

## Expected result:
After selecting `Customize sync` settings, the settings should be reverted back to default customize settings
## Reproduces how often:
<!--[Easily reproduced/Intermittent issue/No steps to reproduce]-->
Easy
## Brave version (brave://version info)
<!--For installed build, please copy Brave, Revision and OS from brave://version and paste here. If building from source please mention it along with brave://version details-->
Brave | 1.22.25 Chromium: 89.0.4389.48 (Official Build) nightly (64-bit)
-- | --
Revision | 0fe3c4589a6cf5ce719d167834dfa9cd8978937a-refs/branch-heads/4389@{#873}
OS | Windows 10 OS Version 2004 (Build 19041.804)
## Version/Channel Information:
<!--Does this issue happen on any other channels? Or is it specific to a certain channel?-->
- Can you reproduce this issue with the current release? Yes
- Can you reproduce this issue with the beta channel? Yes
- Can you reproduce this issue with the nightly channel? Yes
## Other Additional Information:
- Does the issue resolve itself when disabling Brave Shields? NA
- Does the issue resolve itself when disabling Brave Rewards? NA
- Is the issue reproducible on the latest version of Chrome? NA
## Miscellaneous Information:
<!--Any additional information, related issues, extra QA steps, configuration or data that might be necessary to reproduce the issue-->
cc: @brave/legacy_qa @AlexeyBarabash
|
1.0
|
`Customize sync` settings are not respected - <!-- Have you searched for similar issues? Before submitting this issue, please check the open issues and add a note before logging a new issue.
PLEASE USE THE TEMPLATE BELOW TO PROVIDE INFORMATION ABOUT THE ISSUE.
INSUFFICIENT INFO WILL GET THE ISSUE CLOSED. IT WILL ONLY BE REOPENED AFTER SUFFICIENT INFO IS PROVIDED-->
## Description
<!--Provide a brief description of the issue-->
`Customize sync` settings are not respected
## Steps to Reproduce
<!--Please add a series of steps to reproduce the issue-->
1. Clean install
2. Create a standalone sync using code words
3. Click `Manage your synced devices` and esnured standalone sync chain is created with one device
4. Ensured `Customize sync` is selected by default under `Sync Settings`
5. Select `Sync everything` from `Sync Settings`
6. Reload the page brave://settings/braveSync/setup
7. Select `Customize sync` and observe all the sync data settings are selected
## Actual result:
<!--Please add screenshots if needed-->
`Customize sync` settings are not respected

## Expected result:
After selecting `Customize sync` settings, the settings should be reverted back to default customize settings
## Reproduces how often:
<!--[Easily reproduced/Intermittent issue/No steps to reproduce]-->
Easy
## Brave version (brave://version info)
<!--For installed build, please copy Brave, Revision and OS from brave://version and paste here. If building from source please mention it along with brave://version details-->
Brave | 1.22.25 Chromium: 89.0.4389.48 (Official Build) nightly (64-bit)
-- | --
Revision | 0fe3c4589a6cf5ce719d167834dfa9cd8978937a-refs/branch-heads/4389@{#873}
OS | Windows 10 OS Version 2004 (Build 19041.804)
## Version/Channel Information:
<!--Does this issue happen on any other channels? Or is it specific to a certain channel?-->
- Can you reproduce this issue with the current release? Yes
- Can you reproduce this issue with the beta channel? Yes
- Can you reproduce this issue with the nightly channel? Yes
## Other Additional Information:
- Does the issue resolve itself when disabling Brave Shields? NA
- Does the issue resolve itself when disabling Brave Rewards? NA
- Is the issue reproducible on the latest version of Chrome? NA
## Miscellaneous Information:
<!--Any additional information, related issues, extra QA steps, configuration or data that might be necessary to reproduce the issue-->
cc: @brave/legacy_qa @AlexeyBarabash
|
non_process
|
customize sync settings are not respected have you searched for similar issues before submitting this issue please check the open issues and add a note before logging a new issue please use the template below to provide information about the issue insufficient info will get the issue closed it will only be reopened after sufficient info is provided description customize sync settings are not respected steps to reproduce clean install create a standalone sync using code words click manage your synced devices and esnured standalone sync chain is created with one device ensured customize sync is selected by default under sync settings select sync everything from sync settings reload the page brave settings bravesync setup select customize sync and observe all the sync data settings are selected actual result customize sync settings are not respected expected result after selecting customize sync settings the settings should be reverted back to default customize settings reproduces how often easy brave version brave version info brave chromium official build nightly bit revision refs branch heads os windows os version build version channel information can you reproduce this issue with the current release yes can you reproduce this issue with the beta channel yes can you reproduce this issue with the nightly channel yes other additional information does the issue resolve itself when disabling brave shields na does the issue resolve itself when disabling brave rewards na is the issue reproducible on the latest version of chrome na miscellaneous information cc brave legacy qa alexeybarabash
| 0
|
Subsets and Splits
No community queries yet
The top public SQL queries from the community will appear here once available.