Dataset schema (column, dtype, min/max or class count):

Unnamed: 0     int64          0 to 832k
id             float64        2.49B to 32.1B
type           stringclasses  1 value
created_at     stringlengths  19 to 19
repo           stringlengths  7 to 112
repo_url       stringlengths  36 to 141
action         stringclasses  3 values
title          stringlengths  1 to 744
labels         stringlengths  4 to 574
body           stringlengths  9 to 211k
index          stringclasses  10 values
text_combine   stringlengths  96 to 211k
label          stringclasses  2 values
text           stringlengths  96 to 188k
binary_label   int64          0 to 1
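The schema above pairs a two-class string column (`label`) with an integer target (`binary_label`). Judging from the samples below, `process` maps to 1 and `non_process` to 0. The sketch below mirrors that mapping on two abbreviated in-memory rows; it is an illustration only, and nothing is read from the real file (whose name is unknown).

```python
# Toy rows copied (abbreviated) from the first two samples in this dump.
rows = [
    {"label": "process", "binary_label": 1},
    {"label": "non_process", "binary_label": 0},
]

def to_binary(label):
    """Map the two-class string label to the integer target, as the samples suggest."""
    return 1 if label == "process" else 0

# Check that the assumed mapping is consistent with both rows.
for row in rows:
    assert to_binary(row["label"]) == row["binary_label"]
```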
row: 2,973
id: 5,963,408,315
type: IssuesEvent
created_at: 2017-05-30 04:50:28
repo: nodejs/node
repo_url: https://api.github.com/repos/nodejs/node
action: closed
title: process: rename and deprecate process.reallyExit(), optionally expose result.
labels: process
<!-- Thanks for wanting to report an issue you've found in Node.js. Please fill in the template below by replacing the html comments with an appropriate answer. If unsure about something, just do as best as you're able. version: usually output of `node -v` platform: either `uname -a` output, or if Windows, version and 32 or 64-bit. subsystem: optional -- if known please specify affected core module name. It will be much easier for us to fix the issue if a test case that reproduces the problem is provided. Ideally this test case should not have any external dependencies. We understand that it is not always possible to reduce your code to a small test case, but we would appreciate to have as much data as possible. Thank you! --> - **Version**: all - **Platform**: all - **Subsystem**: `process` In the wake of #6456 I want to suggest to rename `process.reallyExit()` as it's name is poorly chosen. It was introduced in https://github.com/nodejs/node/commit/f6a7fe26574defaa807a13248102ebe0f23270af and imo doesn't fit the intention of the commit - the repo was young back then. An adequate name would be `os.exit()` since the exit call is implemented by the OSes differently. Optionally we could think about exposing this via documentation, since it also eases the API inconsistencies related - it would basically say, if you **really** want to exit, no matter what, call this. For deprecation purposes we could alias this one in `lib/process.js`. For API examples see other modern languages. It's also doubtful whether the use of `exit(3)` is correct in that context and `_Exit(3)` (see [here](http://man7.org/linux/man-pages/man2/_exit.2.html)) would have been more appropriate. Discussion would be more academic though, imo.
index: 1.0
process: rename and deprecate process.reallyExit(), optionally expose result. - <!-- Thanks for wanting to report an issue you've found in Node.js. Please fill in the template below by replacing the html comments with an appropriate answer. If unsure about something, just do as best as you're able. version: usually output of `node -v` platform: either `uname -a` output, or if Windows, version and 32 or 64-bit. subsystem: optional -- if known please specify affected core module name. It will be much easier for us to fix the issue if a test case that reproduces the problem is provided. Ideally this test case should not have any external dependencies. We understand that it is not always possible to reduce your code to a small test case, but we would appreciate to have as much data as possible. Thank you! --> - **Version**: all - **Platform**: all - **Subsystem**: `process` In the wake of #6456 I want to suggest to rename `process.reallyExit()` as it's name is poorly chosen. It was introduced in https://github.com/nodejs/node/commit/f6a7fe26574defaa807a13248102ebe0f23270af and imo doesn't fit the intention of the commit - the repo was young back then. An adequate name would be `os.exit()` since the exit call is implemented by the OSes differently. Optionally we could think about exposing this via documentation, since it also eases the API inconsistencies related - it would basically say, if you **really** want to exit, no matter what, call this. For deprecation purposes we could alias this one in `lib/process.js`. For API examples see other modern languages. It's also doubtful whether the use of `exit(3)` is correct in that context and `_Exit(3)` (see [here](http://man7.org/linux/man-pages/man2/_exit.2.html)) would have been more appropriate. Discussion would be more academic though, imo.
label: process
process rename and deprecate process reallyexit optionally expose result thanks for wanting to report an issue you ve found in node js please fill in the template below by replacing the html comments with an appropriate answer if unsure about something just do as best as you re able version usually output of node v platform either uname a output or if windows version and or bit subsystem optional if known please specify affected core module name it will be much easier for us to fix the issue if a test case that reproduces the problem is provided ideally this test case should not have any external dependencies we understand that it is not always possible to reduce your code to a small test case but we would appreciate to have as much data as possible thank you version all platform all subsystem process in the wake of i want to suggest to rename process reallyexit as it s name is poorly chosen it was introduced in and imo doesn t fit the intention of the commit the repo was young back then an adequate name would be os exit since the exit call is implemented by the oses differently optionally we could think about exposing this via documentation since it also eases the api inconsistencies related it would basically say if you really want to exit no matter what call this for deprecation purposes we could alias this one in lib process js for api examples see other modern languages it s also doubtful whether the use of exit is correct in that context and exit see would have been more appropriate discussion would be more academic though imo
binary_label: 1
row: 18,382
id: 12,833,883,127
type: IssuesEvent
created_at: 2020-07-07 10:03:11
repo: virtualsatellite/VirtualSatellite4-FDIR
repo_url: https://api.github.com/repos/virtualsatellite/VirtualSatellite4-FDIR
action: reopened
title: Increase border margin for auto layout
labels: comfort/usability
The root is put too much to the top left of the canvas when hitting auto layout. For single node trees, its put so much to the top left, that the context menu "disappears".
index: True
Increase border margin for auto layout - The root is put too much to the top left of the canvas when hitting auto layout. For single node trees, its put so much to the top left, that the context menu "disappears".
label: non_process
increase border margin for auto layout the root is put too much to the top left of the canvas when hitting auto layout for single node trees its put so much to the top left that the context menu disappears
binary_label: 0
row: 14,427
id: 17,480,661,604
type: IssuesEvent
created_at: 2021-08-09 01:13:42
repo: googleapis/python-spanner
repo_url: https://api.github.com/repos/googleapis/python-spanner
action: closed
title: samples.samples.backup_sample_test: many tests failed
labels: api: spanner type: process samples flakybot: issue flakybot: flaky
Many tests failed at the same time in this package. * I will close this issue when there are no more failures in this package _and_ there is at least one pass. * No new issues will be filed for this package until this issue is closed. * If there are already issues for individual test cases, I will close them when the corresponding test passes. You can close them earlier, if you prefer, and I won't reopen them while this issue is still open. Here are the tests that failed: * test_create_backup (#320) * test_create_backup_with_encryption_key (#408) * test_restore_database (#321) * test_restore_database_with_encryption_key (#410) * test_list_backup_operations (#266) * test_list_backups (#322) * test_update_backup (#268) * test_delete_backup (#269) * test_cancel_backup (#270) * test_create_database_with_retention_period (#271) ----- commit: 2487800e31842a44dcc37937c325e130c8c926b0 buildURL: [Build Status](https://source.cloud.google.com/results/invocations/e2951c84-6fe7-446c-87e7-3b97256bee0b), [Sponge](http://sponge2/e2951c84-6fe7-446c-87e7-3b97256bee0b) status: failed
index: 1.0
samples.samples.backup_sample_test: many tests failed - Many tests failed at the same time in this package. * I will close this issue when there are no more failures in this package _and_ there is at least one pass. * No new issues will be filed for this package until this issue is closed. * If there are already issues for individual test cases, I will close them when the corresponding test passes. You can close them earlier, if you prefer, and I won't reopen them while this issue is still open. Here are the tests that failed: * test_create_backup (#320) * test_create_backup_with_encryption_key (#408) * test_restore_database (#321) * test_restore_database_with_encryption_key (#410) * test_list_backup_operations (#266) * test_list_backups (#322) * test_update_backup (#268) * test_delete_backup (#269) * test_cancel_backup (#270) * test_create_database_with_retention_period (#271) ----- commit: 2487800e31842a44dcc37937c325e130c8c926b0 buildURL: [Build Status](https://source.cloud.google.com/results/invocations/e2951c84-6fe7-446c-87e7-3b97256bee0b), [Sponge](http://sponge2/e2951c84-6fe7-446c-87e7-3b97256bee0b) status: failed
label: process
samples samples backup sample test many tests failed many tests failed at the same time in this package i will close this issue when there are no more failures in this package and there is at least one pass no new issues will be filed for this package until this issue is closed if there are already issues for individual test cases i will close them when the corresponding test passes you can close them earlier if you prefer and i won t reopen them while this issue is still open here are the tests that failed test create backup test create backup with encryption key test restore database test restore database with encryption key test list backup operations test list backups test update backup test delete backup test cancel backup test create database with retention period commit buildurl status failed
binary_label: 1
row: 4,023
id: 6,955,699,901
type: IssuesEvent
created_at: 2017-12-07 08:51:30
repo: nodejs/node
repo_url: https://api.github.com/repos/nodejs/node
action: reopened
title: Accidentally writing garbage to the IPC channel blows up parent in debug build
labels: child_process libuv
<!-- Thank you for reporting an issue. This issue tracker is for bugs and issues found within Node.js core. If you require more general support please file an issue on our help repo. https://github.com/nodejs/help Please fill in as much of the template below as you're able. Version: output of `node -v` Platform: output of `uname -a` (UNIX), or version and 32 or 64-bit (Windows) Subsystem: if known, please specify affected core module name If possible, please provide code that demonstrates the problem, keeping it as simple and free of external dependencies as you are able. --> * **Version**: master * **Platform**: Windows * **Subsystem**: child_process <!-- Enter your issue details below this comment. --> index.js ```js 'use strict' const spawn = require('child_process').spawn const path = require('path') let dir = path.join(__dirname, 'child.js') let server = spawn(process.argv0, [`"${dir}"`], { stdio: ['pipe', 'ipc', 'pipe'], shell: true, }) ``` child.js ```js // choose one to your liking // option 1 console.log('here bomb'); // option 2 console.log('here bomb hahahahahaha'); // option 3 console.log('\u0000\u0000\u0000\u0000here bomb hahahahahaha'); ``` Outputs for 1,2,3 respectively: ``` Assertion failed: avail >= sizeof(ipc_frame.header), file src\win\pipe.c, line 1590 Assertion failed: ipc_frame.header.flags <= (UV_IPC_TCP_SERVER | UV_IPC_RAW_DATA | UV_IPC_TCP_CONNECTION), file src\win\pipe.c, line 1604 Assertion failed: handle->pipe.conn.remaining_ipc_rawdata_bytes >= bytes, file src\win\pipe.c, line 1655 ``` At first I thought it's a bug in libuv, but now I'm not sure. From all the asserts, it seems it assumes the child will only write valid data. In that case, I think Node.js should either disallow using `console.log` when `stdout` is used for IPC, or disallow using `stdout` for IPC.
index: 1.0
Accidentally writing garbage to the IPC channel blows up parent in debug build - <!-- Thank you for reporting an issue. This issue tracker is for bugs and issues found within Node.js core. If you require more general support please file an issue on our help repo. https://github.com/nodejs/help Please fill in as much of the template below as you're able. Version: output of `node -v` Platform: output of `uname -a` (UNIX), or version and 32 or 64-bit (Windows) Subsystem: if known, please specify affected core module name If possible, please provide code that demonstrates the problem, keeping it as simple and free of external dependencies as you are able. --> * **Version**: master * **Platform**: Windows * **Subsystem**: child_process <!-- Enter your issue details below this comment. --> index.js ```js 'use strict' const spawn = require('child_process').spawn const path = require('path') let dir = path.join(__dirname, 'child.js') let server = spawn(process.argv0, [`"${dir}"`], { stdio: ['pipe', 'ipc', 'pipe'], shell: true, }) ``` child.js ```js // choose one to your liking // option 1 console.log('here bomb'); // option 2 console.log('here bomb hahahahahaha'); // option 3 console.log('\u0000\u0000\u0000\u0000here bomb hahahahahaha'); ``` Outputs for 1,2,3 respectively: ``` Assertion failed: avail >= sizeof(ipc_frame.header), file src\win\pipe.c, line 1590 Assertion failed: ipc_frame.header.flags <= (UV_IPC_TCP_SERVER | UV_IPC_RAW_DATA | UV_IPC_TCP_CONNECTION), file src\win\pipe.c, line 1604 Assertion failed: handle->pipe.conn.remaining_ipc_rawdata_bytes >= bytes, file src\win\pipe.c, line 1655 ``` At first I thought it's a bug in libuv, but now I'm not sure. From all the asserts, it seems it assumes the child will only write valid data. In that case, I think Node.js should either disallow using `console.log` when `stdout` is used for IPC, or disallow using `stdout` for IPC.
label: process
accidentally writing garbage to the ipc channel blows up parent in debug build thank you for reporting an issue this issue tracker is for bugs and issues found within node js core if you require more general support please file an issue on our help repo please fill in as much of the template below as you re able version output of node v platform output of uname a unix or version and or bit windows subsystem if known please specify affected core module name if possible please provide code that demonstrates the problem keeping it as simple and free of external dependencies as you are able version master platform windows subsystem child process index js js use strict const spawn require child process spawn const path require path let dir path join dirname child js let server spawn process stdio shell true child js js choose one to your liking option console log here bomb option console log here bomb hahahahahaha option console log bomb hahahahahaha outputs for respectively assertion failed avail sizeof ipc frame header file src win pipe c line assertion failed ipc frame header flags uv ipc tcp server uv ipc raw data uv ipc tcp connection file src win pipe c line assertion failed handle pipe conn remaining ipc rawdata bytes bytes file src win pipe c line at first i thought it s a bug in libuv but now i m not sure from all the asserts it seems it assumes the child will only write valid data in that case i think node js should either disallow using console log when stdout is used for ipc or disallow using stdout for ipc
binary_label: 1
row: 17,565
id: 23,377,579,713
type: IssuesEvent
created_at: 2022-08-11 05:58:57
repo: Battle-s/battle-school-backend
repo_url: https://api.github.com/repos/Battle-s/battle-school-backend
action: opened
title: [FEAT] Implement the bulletin board feature
labels: feature :computer: processing :hourglass_flowing_sand:
## Description > Implement the bulletin board feature ## Checklist > List the conditions required to close this issue as checkboxes. - [ ] controller - [ ] domain - [ ] repository - [ ] service - [ ] test ## References > Add any reference material needed to resolve this issue, if available. ## Related discussion > If the issue was discussed, briefly summarize the discussion here.
index: 1.0
[FEAT] Implement the bulletin board feature - ## Description > Implement the bulletin board feature ## Checklist > List the conditions required to close this issue as checkboxes. - [ ] controller - [ ] domain - [ ] repository - [ ] service - [ ] test ## References > Add any reference material needed to resolve this issue, if available. ## Related discussion > If the issue was discussed, briefly summarize the discussion here.
label: process
implement the bulletin board feature description implement the bulletin board feature checklist list the conditions required to close this issue as checkboxes controller domain repository service test references add any reference material needed to resolve this issue if available related discussion if the issue was discussed briefly summarize the discussion here
binary_label: 1
row: 7,512
id: 10,591,941,526
type: IssuesEvent
created_at: 2019-10-09 12:04:31
repo: didi/mpx
repo_url: https://api.github.com/repos/didi/mpx
action: closed
title: picker component with mode="multiSelector": dynamically updating the render data does not trigger a view update
labels: processing
With the picker set to mode="multiSelector" (for example a linked province/city/district component), changing the province in the first column also changes the render data of the second and third columns (city, district), but the view does not update. "@mpxjs/api-proxy": "^2.2.27", "@mpxjs/core": "^2.2.27", "@mpxjs/webpack-plugin": "^2.2.29". The problem occurs on both the WeChat and Toutiao platforms.
index: 1.0
picker component with mode="multiSelector": dynamically updating the render data does not trigger a view update - With the picker set to mode="multiSelector" (for example a linked province/city/district component), changing the province in the first column also changes the render data of the second and third columns (city, district), but the view does not update. "@mpxjs/api-proxy": "^2.2.27", "@mpxjs/core": "^2.2.27", "@mpxjs/webpack-plugin": "^2.2.29". The problem occurs on both the WeChat and Toutiao platforms.
label: process
picker component with mode multiselector dynamically updating the render data does not trigger a view update with the picker set to mode multiselector for example a linked province city district component changing the province in the first column also changes the render data of the second and third columns city district but the view does not update mpxjs api proxy mpxjs core mpxjs webpack plugin the problem occurs on both the wechat and toutiao platforms
binary_label: 1
row: 17,854
id: 23,799,986,441
type: IssuesEvent
created_at: 2022-09-03 05:52:16
repo: usetada/status-page
repo_url: https://api.github.com/repos/usetada/status-page
action: closed
title: 🛑 Processor API is down
labels: status processor-api
In [`09fc212`](https://github.com/usetada/status-page/commit/09fc212bec23714681c19d225f9c1c285d8d7805 ), Processor API ($PROCESSOR_API_URL) was **down**: - HTTP code: 502 - Response time: 3703 ms
index: 1.0
🛑 Processor API is down - In [`09fc212`](https://github.com/usetada/status-page/commit/09fc212bec23714681c19d225f9c1c285d8d7805 ), Processor API ($PROCESSOR_API_URL) was **down**: - HTTP code: 502 - Response time: 3703 ms
label: process
🛑 processor api is down in processor api processor api url was down http code response time ms
binary_label: 1
row: 47,782
id: 7,346,744,330
type: IssuesEvent
created_at: 2018-03-07 21:47:29
repo: datawire/telepresence
repo_url: https://api.github.com/repos/datawire/telepresence
action: opened
title: Report problems when using --expose with an existing deployment
labels: bug documentation
A user [creates a Telepresence deployment manually](https://www.telepresence.io/reference/connecting#running-telepresence-manually) and launches `telepresence --deployment ...`. If any `--expose` argument in this invocation is incompatible with the existing deployment, Telepresence should report an error. At the moment, incompatibilities are ignored and no errors are reported. - [ ] When you use an existing deployment, Telepresence does not create services. As a result, if you try to `--expose` ports that are not already fronted by existing services, there will be no way to reach them. Telepresence should report this. - [ ] If you try to `--expose` a privileged port (port < 1024) and the existing deployment does not run as root, the underlying port forwarding fails silently. Telepresence should report this. - [ ] Telepresence runs as a non-root user by default to support running on OpenShift and generally work better with privilege management via Kubernetes Security Contexts. See #131 for more information. To run the Telepresence image as root, the pod spec must contain the following. The documentation should mention this. ``` yaml "securityContext": { "runAsUser": 0 } ```
index: 1.0
Report problems when using --expose with an existing deployment - A user [creates a Telepresence deployment manually](https://www.telepresence.io/reference/connecting#running-telepresence-manually) and launches `telepresence --deployment ...`. If any `--expose` argument in this invocation is incompatible with the existing deployment, Telepresence should report an error. At the moment, incompatibilities are ignored and no errors are reported. - [ ] When you use an existing deployment, Telepresence does not create services. As a result, if you try to `--expose` ports that are not already fronted by existing services, there will be no way to reach them. Telepresence should report this. - [ ] If you try to `--expose` a privileged port (port < 1024) and the existing deployment does not run as root, the underlying port forwarding fails silently. Telepresence should report this. - [ ] Telepresence runs as a non-root user by default to support running on OpenShift and generally work better with privilege management via Kubernetes Security Contexts. See #131 for more information. To run the Telepresence image as root, the pod spec must contain the following. The documentation should mention this. ``` yaml "securityContext": { "runAsUser": 0 } ```
label: non_process
report problems when using expose with an existing deployment a user and launches telepresence deployment if any expose argument in this invocation is incompatible with the existing deployment telepresence should report an error at the moment incompatibilities are ignored and no errors are reported when you use an existing deployment telepresence does not create services as a result if you try to expose ports that are not already fronted by existing services there will be no way to reach them telepresence should report this if you try to expose a privileged port port and the existing deployment does not run as root the underlying port forwarding fails silently telepresence should report this telepresence runs as a non root user by default to support running on openshift and generally work better with privilege management via kubernetes security contexts see for more information to run the telepresence image as root the pod spec must contain the following the documentation should mention this yaml securitycontext runasuser
binary_label: 0
row: 117,432
id: 4,715,343,631
type: IssuesEvent
created_at: 2016-10-15 12:53:39
repo: benvenutti/hasm
repo_url: https://api.github.com/repos/benvenutti/hasm
action: closed
title: Add version flag to command line
labels: priority: low status: completed type: enhancement
Create a version flag for hasm in the program options. The flag could be --version (or a short -v, as is classical in command line programs).
index: 1.0
Add version flag to command line - Create a version flag for hasm in the program options. The flag could be --version (or a short -v, as is classical in command line programs).
label: non_process
add version flag to command line create a version flag for hasm in the program options the flag could be version or a short v as is classical in command line programs
binary_label: 0
row: 202,025
id: 15,818,996,377
type: IssuesEvent
created_at: 2021-04-05 16:50:50
repo: katspaugh/wavesurfer.js
repo_url: https://api.github.com/repos/katspaugh/wavesurfer.js
action: closed
title: What is going on with documentation/versions?
labels: documentation website
Hi, It's really tough to decipher which documentation goes with which version of wavesurfer. The main docs site mentions something about version 2, but all the docs on github direct to version 1. The docs for version 2 seem autogenerated and not really up to par for using the library. And if I look at the changelog, I see now version 4? Which version do the samples use? Any help is appreciated, and if I can get an idea of what needs to change, I am happy to issue a PR or two.
index: 1.0
What is going on with documentation/versions? - Hi, It's really tough to decipher which documentation goes with which version of wavesurfer. The main docs site mentions something about version 2, but all the docs on github direct to version 1. The docs for version 2 seem autogenerated and not really up to par for using the library. And if I look at the changelog, I see now version 4? Which version do the samples use? Any help is appreciated, and if I can get an idea of what needs to change, I am happy to issue a PR or two.
label: non_process
what is going on with documentation versions hi it s really tough to decipher which documentation goes with which version of wavesurfer the main docs site mentions something about version but all the docs on github direct to version the docs for version seem autogenerated and not really up to par for using the library and if i look at the changelog i see now version which version do the samples use any help is appreciated and if i can get an idea of what needs to change i am happy to issue a pr or two
binary_label: 0
row: 20,477
id: 27,135,200,746
type: IssuesEvent
created_at: 2023-02-16 12:44:23
repo: bazelbuild/bazel
repo_url: https://api.github.com/repos/bazelbuild/bazel
action: reopened
title: Merge SolibSymlinkAction with SymlinkAction
labels: P3 type: process team-Remote-Exec stale
`SolibSymlinkAction` should be turned into a factory that creates a `SymlinkAction` as their `execute` method is effectively identical. The benefit of this is that we'll have one fewer native action implementation.
index: 1.0
Merge SolibSymlinkAction with SymlinkAction - `SolibSymlinkAction` should be turned into a factory that creates a `SymlinkAction` as their `execute` method is effectively identical. The benefit of this is that we'll have one fewer native action implementation.
label: process
merge solibsymlinkaction with symlinkaction solibsymlinkaction should be turned into a factory that creates a symlinkaction as their execute method is effectively identical the benefit of this is that we ll have one fewer native action implementation
binary_label: 1
row: 102,534
id: 32,039,367,514
type: IssuesEvent
created_at: 2023-09-22 17:56:53
repo: ubccr/software-layer
repo_url: https://api.github.com/repos/ubccr/software-layer
action: closed
title: Request for installation of pybedtools
labels: build-request
We have a lot of things that use this wrapper, found here: https://pypi.org/project/pybedtools/ but at present no one computer-savvy enough to do the new module installs. thanks.
index: 1.0
Request for installation of pybedtools - We have a lot of things that use this wrapper, found here: https://pypi.org/project/pybedtools/ but at present no one computer-savvy enough to do the new module installs. thanks.
label: non_process
request for installation of pybedtools we have a lot of things that use this wrapper found here but at present no one computer savvy enough to do the new module installs thanks
binary_label: 0
row: 271,590
id: 20,681,452,262
type: IssuesEvent
created_at: 2022-03-10 14:16:42
repo: openBackhaul/ApplicationPattern
repo_url: https://api.github.com/repos/openBackhaul/ApplicationPattern
action: closed
title: Add: Branch merging: think about/establish guideline / checklists
labels: documentation enhancement
**Is your feature request related to a problem? Please describe.** Working with the GitFlow workflow could introduce merge conflicts due to the various branches being created and merged. **Describe the solution you'd like** We might need to establish guidelines or checklists for merging the branches to ensure we can properly identify and resolve any conflicts. **Additional context** Example: How do we make sure we don't get problems due to conflicts if we e.g. create new feature branches from different versions of the develop branch (e.g. if the new feature uses some underlying functions which are have changed in the develop branch in between)? There needs to be a clear merging process. ![image](https://user-images.githubusercontent.com/57349523/157228633-f2c5353e-781e-444b-ad11-3a91726e3b52.png) **Solution and next next step(s)** First check if we are likely to run into such conflicts, if yes check if we already have ideas for how to handle those or what recommendations exist for that.
index: 1.0
Add: Branch merging: think about/establish guideline / checklists - **Is your feature request related to a problem? Please describe.** Working with the GitFlow workflow could introduce merge conflicts due to the various branches being created and merged. **Describe the solution you'd like** We might need to establish guidelines or checklists for merging the branches to ensure we can properly identify and resolve any conflicts. **Additional context** Example: How do we make sure we don't get problems due to conflicts if we e.g. create new feature branches from different versions of the develop branch (e.g. if the new feature uses some underlying functions which are have changed in the develop branch in between)? There needs to be a clear merging process. ![image](https://user-images.githubusercontent.com/57349523/157228633-f2c5353e-781e-444b-ad11-3a91726e3b52.png) **Solution and next next step(s)** First check if we are likely to run into such conflicts, if yes check if we already have ideas for how to handle those or what recommendations exist for that.
label: non_process
add branch merging think about establish guideline checklists is your feature request related to a problem please describe working with the gitflow workflow could introduce merge conflicts due to the various branches being created and merged describe the solution you d like we might need to establish guidelines or checklists for merging the branches to ensure we can properly identify and resolve any conflicts additional context example how do we make sure we don t get problems due to conflicts if we e g create new feature branches from different versions of the develop branch e g if the new feature uses some underlying functions which are have changed in the develop branch in between there needs to be a clear merging process solution and next next step s first check if we are likely to run into such conflicts if yes check if we already have ideas for how to handle those or what recommendations exist for that
binary_label: 0
row: 18,603
id: 24,576,827,075
type: IssuesEvent
created_at: 2022-10-13 12:58:51
repo: GoogleCloudPlatform/fda-mystudies
repo_url: https://api.github.com/repos/GoogleCloudPlatform/fda-mystudies
action: closed
title: [Auth server] unable to login using temporary password in the following scenario
labels: Bug P0 iOS Process: Fixed Process: Tested dev Auth server
Steps: 1. Launch the mobile app 2. Go to Sign in Screen 3. Enter the valid user name and invalid password 5 times, then the account will be locked 4. Without killing the app, try to enroll by using a temporary password 5. Sign in not available screen is displaying 6. Kill the app and again try to sign in using a valid email and a valid temporary password 7. Observe AR: Following error message is getting displayed ER: Participant should be able to login into the app successfully ![image](https://user-images.githubusercontent.com/71445210/187444335-b458e6d3-8c69-4cb6-8d72-9db6ff24ff40.png)
index: 2.0
[Auth server] unable to login using temporary password in the following scenario - Steps: 1. Launch the mobile app 2. Go to Sign in Screen 3. Enter the valid user name and invalid password 5 times, then the account will be locked 4. Without killing the app, try to enroll by using a temporary password 5. Sign in not available screen is displaying 6. Kill the app and again try to sign in using a valid email and a valid temporary password 7. Observe AR: Following error message is getting displayed ER: Participant should be able to login into the app successfully ![image](https://user-images.githubusercontent.com/71445210/187444335-b458e6d3-8c69-4cb6-8d72-9db6ff24ff40.png)
label: process
unable to login using temporary password in the following scenario steps launch the mobile app go to sign in screen enter the valid user name and invalid password times then the account will be locked without killing the app try to enroll by using a temporary password sign in not available screen is displaying kill the app and again try to sign in using a valid email and a valid temporary password observe ar following error message is getting displayed er participant should be able to login into the app successfully
binary_label: 1
row: 128,891
id: 18,070,258,077
type: IssuesEvent
created_at: 2021-09-21 01:29:55
repo: Srinivasanms16/EmployeeInformation
repo_url: https://api.github.com/repos/Srinivasanms16/EmployeeInformation
action: opened
title: CVE-2021-3807 (Medium) detected in multiple libraries
labels: security vulnerability
## CVE-2021-3807 - Medium Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Libraries - <b>ansi-regex-3.0.0.tgz</b>, <b>ansi-regex-2.1.1.tgz</b>, <b>ansi-regex-5.0.0.tgz</b>, <b>ansi-regex-4.1.0.tgz</b></p></summary> <p> <details><summary><b>ansi-regex-3.0.0.tgz</b></p></summary> <p>Regular expression for matching ANSI escape codes</p> <p>Library home page: <a href="https://registry.npmjs.org/ansi-regex/-/ansi-regex-3.0.0.tgz">https://registry.npmjs.org/ansi-regex/-/ansi-regex-3.0.0.tgz</a></p> <p>Path to dependency file: EmployeeInformation/package.json</p> <p>Path to vulnerable library: EmployeeInformation/node_modules/string-width/node_modules/ansi-regex/package.json,EmployeeInformation/node_modules/npm/node_modules/string-width/node_modules/ansi-regex/package.json,EmployeeInformation/node_modules/cliui/node_modules/ansi-regex/package.json</p> <p> Dependency Hierarchy: - compiler-cli-9.0.7.tgz (Root Library) - yargs-13.1.0.tgz - cliui-4.1.0.tgz - strip-ansi-4.0.0.tgz - :x: **ansi-regex-3.0.0.tgz** (Vulnerable Library) </details> <details><summary><b>ansi-regex-2.1.1.tgz</b></p></summary> <p>Regular expression for matching ANSI escape codes</p> <p>Library home page: <a href="https://registry.npmjs.org/ansi-regex/-/ansi-regex-2.1.1.tgz">https://registry.npmjs.org/ansi-regex/-/ansi-regex-2.1.1.tgz</a></p> <p>Path to dependency file: EmployeeInformation/package.json</p> <p>Path to vulnerable library: EmployeeInformation/node_modules/npm/node_modules/ansi-regex/package.json,EmployeeInformation/node_modules/ansi-regex/package.json</p> <p> Dependency Hierarchy: - build-angular-0.900.7.tgz (Root Library) - webpack-dev-server-3.9.0.tgz - strip-ansi-3.0.1.tgz - :x: **ansi-regex-2.1.1.tgz** (Vulnerable Library) </details> <details><summary><b>ansi-regex-5.0.0.tgz</b></p></summary> <p>Regular expression for matching ANSI escape codes</p> <p>Library home page: <a 
href="https://registry.npmjs.org/ansi-regex/-/ansi-regex-5.0.0.tgz">https://registry.npmjs.org/ansi-regex/-/ansi-regex-5.0.0.tgz</a></p> <p>Path to dependency file: EmployeeInformation/package.json</p> <p>Path to vulnerable library: EmployeeInformation/node_modules/ora/node_modules/ansi-regex/package.json,EmployeeInformation/node_modules/inquirer/node_modules/ansi-regex/package.json</p> <p> Dependency Hierarchy: - cli-9.1.12.tgz (Root Library) - inquirer-7.1.0.tgz - strip-ansi-6.0.0.tgz - :x: **ansi-regex-5.0.0.tgz** (Vulnerable Library) </details> <details><summary><b>ansi-regex-4.1.0.tgz</b></p></summary> <p>Regular expression for matching ANSI escape codes</p> <p>Library home page: <a href="https://registry.npmjs.org/ansi-regex/-/ansi-regex-4.1.0.tgz">https://registry.npmjs.org/ansi-regex/-/ansi-regex-4.1.0.tgz</a></p> <p>Path to dependency file: EmployeeInformation/package.json</p> <p>Path to vulnerable library: EmployeeInformation/node_modules/npm/node_modules/cliui/node_modules/ansi-regex/package.json,EmployeeInformation/node_modules/@angular/compiler-cli/node_modules/ansi-regex/package.json,EmployeeInformation/node_modules/npm/node_modules/wrap-ansi/node_modules/ansi-regex/package.json,EmployeeInformation/node_modules/npm/node_modules/yargs/node_modules/ansi-regex/package.json</p> <p> Dependency Hierarchy: - compiler-cli-9.0.7.tgz (Root Library) - yargs-13.1.0.tgz - string-width-3.1.0.tgz - strip-ansi-5.2.0.tgz - :x: **ansi-regex-4.1.0.tgz** (Vulnerable Library) </details> <p>Found in base branch: <b>master</b></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary> <p> ansi-regex is vulnerable to Inefficient Regular Expression Complexity <p>Publish Date: 2021-09-17 <p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2021-3807>CVE-2021-3807</a></p> </p> </details> <p></p> <details><summary><img 
src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>5.5</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: N/A - Attack Complexity: N/A - Privileges Required: N/A - User Interaction: N/A - Scope: N/A - Impact Metrics: - Confidentiality Impact: N/A - Integrity Impact: N/A - Availability Impact: N/A </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://huntr.dev/bounties/5b3cf33b-ede0-4398-9974-800876dfd994/">https://huntr.dev/bounties/5b3cf33b-ede0-4398-9974-800876dfd994/</a></p> <p>Release Date: 2021-09-17</p> <p>Fix Resolution: ansi-regex - 5.0.1,6.0.1</p> </p> </details> <p></p> *** Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
True
CVE-2021-3807 (Medium) detected in multiple libraries - ## CVE-2021-3807 - Medium Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Libraries - <b>ansi-regex-3.0.0.tgz</b>, <b>ansi-regex-2.1.1.tgz</b>, <b>ansi-regex-5.0.0.tgz</b>, <b>ansi-regex-4.1.0.tgz</b></p></summary> <p> <details><summary><b>ansi-regex-3.0.0.tgz</b></p></summary> <p>Regular expression for matching ANSI escape codes</p> <p>Library home page: <a href="https://registry.npmjs.org/ansi-regex/-/ansi-regex-3.0.0.tgz">https://registry.npmjs.org/ansi-regex/-/ansi-regex-3.0.0.tgz</a></p> <p>Path to dependency file: EmployeeInformation/package.json</p> <p>Path to vulnerable library: EmployeeInformation/node_modules/string-width/node_modules/ansi-regex/package.json,EmployeeInformation/node_modules/npm/node_modules/string-width/node_modules/ansi-regex/package.json,EmployeeInformation/node_modules/cliui/node_modules/ansi-regex/package.json</p> <p> Dependency Hierarchy: - compiler-cli-9.0.7.tgz (Root Library) - yargs-13.1.0.tgz - cliui-4.1.0.tgz - strip-ansi-4.0.0.tgz - :x: **ansi-regex-3.0.0.tgz** (Vulnerable Library) </details> <details><summary><b>ansi-regex-2.1.1.tgz</b></p></summary> <p>Regular expression for matching ANSI escape codes</p> <p>Library home page: <a href="https://registry.npmjs.org/ansi-regex/-/ansi-regex-2.1.1.tgz">https://registry.npmjs.org/ansi-regex/-/ansi-regex-2.1.1.tgz</a></p> <p>Path to dependency file: EmployeeInformation/package.json</p> <p>Path to vulnerable library: EmployeeInformation/node_modules/npm/node_modules/ansi-regex/package.json,EmployeeInformation/node_modules/ansi-regex/package.json</p> <p> Dependency Hierarchy: - build-angular-0.900.7.tgz (Root Library) - webpack-dev-server-3.9.0.tgz - strip-ansi-3.0.1.tgz - :x: **ansi-regex-2.1.1.tgz** (Vulnerable Library) </details> <details><summary><b>ansi-regex-5.0.0.tgz</b></p></summary> <p>Regular expression for 
matching ANSI escape codes</p> <p>Library home page: <a href="https://registry.npmjs.org/ansi-regex/-/ansi-regex-5.0.0.tgz">https://registry.npmjs.org/ansi-regex/-/ansi-regex-5.0.0.tgz</a></p> <p>Path to dependency file: EmployeeInformation/package.json</p> <p>Path to vulnerable library: EmployeeInformation/node_modules/ora/node_modules/ansi-regex/package.json,EmployeeInformation/node_modules/inquirer/node_modules/ansi-regex/package.json</p> <p> Dependency Hierarchy: - cli-9.1.12.tgz (Root Library) - inquirer-7.1.0.tgz - strip-ansi-6.0.0.tgz - :x: **ansi-regex-5.0.0.tgz** (Vulnerable Library) </details> <details><summary><b>ansi-regex-4.1.0.tgz</b></p></summary> <p>Regular expression for matching ANSI escape codes</p> <p>Library home page: <a href="https://registry.npmjs.org/ansi-regex/-/ansi-regex-4.1.0.tgz">https://registry.npmjs.org/ansi-regex/-/ansi-regex-4.1.0.tgz</a></p> <p>Path to dependency file: EmployeeInformation/package.json</p> <p>Path to vulnerable library: EmployeeInformation/node_modules/npm/node_modules/cliui/node_modules/ansi-regex/package.json,EmployeeInformation/node_modules/@angular/compiler-cli/node_modules/ansi-regex/package.json,EmployeeInformation/node_modules/npm/node_modules/wrap-ansi/node_modules/ansi-regex/package.json,EmployeeInformation/node_modules/npm/node_modules/yargs/node_modules/ansi-regex/package.json</p> <p> Dependency Hierarchy: - compiler-cli-9.0.7.tgz (Root Library) - yargs-13.1.0.tgz - string-width-3.1.0.tgz - strip-ansi-5.2.0.tgz - :x: **ansi-regex-4.1.0.tgz** (Vulnerable Library) </details> <p>Found in base branch: <b>master</b></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary> <p> ansi-regex is vulnerable to Inefficient Regular Expression Complexity <p>Publish Date: 2021-09-17 <p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2021-3807>CVE-2021-3807</a></p> </p> </details> 
<p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>5.5</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: N/A - Attack Complexity: N/A - Privileges Required: N/A - User Interaction: N/A - Scope: N/A - Impact Metrics: - Confidentiality Impact: N/A - Integrity Impact: N/A - Availability Impact: N/A </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://huntr.dev/bounties/5b3cf33b-ede0-4398-9974-800876dfd994/">https://huntr.dev/bounties/5b3cf33b-ede0-4398-9974-800876dfd994/</a></p> <p>Release Date: 2021-09-17</p> <p>Fix Resolution: ansi-regex - 5.0.1,6.0.1</p> </p> </details> <p></p> *** Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
non_process
cve medium detected in multiple libraries cve medium severity vulnerability vulnerable libraries ansi regex tgz ansi regex tgz ansi regex tgz ansi regex tgz ansi regex tgz regular expression for matching ansi escape codes library home page a href path to dependency file employeeinformation package json path to vulnerable library employeeinformation node modules string width node modules ansi regex package json employeeinformation node modules npm node modules string width node modules ansi regex package json employeeinformation node modules cliui node modules ansi regex package json dependency hierarchy compiler cli tgz root library yargs tgz cliui tgz strip ansi tgz x ansi regex tgz vulnerable library ansi regex tgz regular expression for matching ansi escape codes library home page a href path to dependency file employeeinformation package json path to vulnerable library employeeinformation node modules npm node modules ansi regex package json employeeinformation node modules ansi regex package json dependency hierarchy build angular tgz root library webpack dev server tgz strip ansi tgz x ansi regex tgz vulnerable library ansi regex tgz regular expression for matching ansi escape codes library home page a href path to dependency file employeeinformation package json path to vulnerable library employeeinformation node modules ora node modules ansi regex package json employeeinformation node modules inquirer node modules ansi regex package json dependency hierarchy cli tgz root library inquirer tgz strip ansi tgz x ansi regex tgz vulnerable library ansi regex tgz regular expression for matching ansi escape codes library home page a href path to dependency file employeeinformation package json path to vulnerable library employeeinformation node modules npm node modules cliui node modules ansi regex package json employeeinformation node modules angular compiler cli node modules ansi regex package json employeeinformation node modules npm node modules wrap ansi node 
modules ansi regex package json employeeinformation node modules npm node modules yargs node modules ansi regex package json dependency hierarchy compiler cli tgz root library yargs tgz string width tgz strip ansi tgz x ansi regex tgz vulnerable library found in base branch master vulnerability details ansi regex is vulnerable to inefficient regular expression complexity publish date url a href cvss score details base score metrics exploitability metrics attack vector n a attack complexity n a privileges required n a user interaction n a scope n a impact metrics confidentiality impact n a integrity impact n a availability impact n a for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution ansi regex step up your open source security game with whitesource
0
59
2,518,724,462
IssuesEvent
2015-01-17 01:25:38
tinkerpop/tinkerpop3
https://api.github.com/repos/tinkerpop/tinkerpop3
closed
Traversal.getEnd() when there is no End to get.
bug process
Probably just need to return `EmptyStep`. ```groovy gremlin> g = TinkerFactory.createModern() ==>tinkergraph[vertices:6 edges:6] gremlin> g.V().repeat(__.repeat(__.out()).times(1)).times(1).values('name') -1 Display stack trace? [yN] y java.lang.ArrayIndexOutOfBoundsException: -1 at java.util.ArrayList.elementData(ArrayList.java:403) at java.util.ArrayList.get(ArrayList.java:416) at com.tinkerpop.gremlin.process.util.TraversalHelper.getEnd(TraversalHelper.java:99) at com.tinkerpop.gremlin.process.graph.step.branch.RepeatStep.addRepeatToTraversal(RepeatStep.java:175) at com.tinkerpop.gremlin.process.graph.GraphTraversal.repeat(GraphTraversal.java:516) at com.tinkerpop.gremlin.process.graph.AnonymousGraphTraversal.repeat(AnonymousGraphTraversal.java:425) at com.tinkerpop.gremlin.process.graph.AnonymousGraphTraversal$repeat.call(Unknown Source) at org.codehaus.groovy.runtime.callsite.CallSiteArray.defaultCall(CallSiteArray.java:45) at org.codehaus.groovy.runtime.callsite.AbstractCallSite.call(AbstractCallSite.java:108) at org.codehaus.groovy.runtime.callsite.AbstractCallSite.call(AbstractCallSite.java:116) at groovysh_evaluate.run(groovysh_evaluate:3) at org.codehaus.groovy.vmplugin.v7.IndyInterface.selectMethod(IndyInterface.java:215) at org.codehaus.groovy.tools.shell.Interpreter.evaluate(Interpreter.groovy:68) at org.codehaus.groovy.tools.shell.Groovysh.execute(Groovysh.groovy:159) at org.codehaus.groovy.tools.shell.Shell.leftShift(Shell.groovy:121) at org.codehaus.groovy.tools.shell.ShellRunner.work(ShellRunner.groovy:93) at org.codehaus.groovy.tools.shell.InteractiveShellRunner.super$2$work(InteractiveShellRunner.groovy) at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:483) at 
org.codehaus.groovy.reflection.CachedMethod.invoke(CachedMethod.java:90) at groovy.lang.MetaMethod.doMethodInvoke(MetaMethod.java:324) at groovy.lang.MetaClassImpl.invokeMethod(MetaClassImpl.java:1207) at org.codehaus.groovy.runtime.ScriptBytecodeAdapter.invokeMethodOnSuperN(ScriptBytecodeAdapter.java:130) at org.codehaus.groovy.runtime.ScriptBytecodeAdapter.invokeMethodOnSuper0(ScriptBytecodeAdapter.java:150) at org.codehaus.groovy.tools.shell.InteractiveShellRunner.work(InteractiveShellRunner.groovy:123) at org.codehaus.groovy.tools.shell.ShellRunner.run(ShellRunner.groovy:57) at org.codehaus.groovy.tools.shell.InteractiveShellRunner.super$2$run(InteractiveShellRunner.groovy) at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:483) at org.codehaus.groovy.reflection.CachedMethod.invoke(CachedMethod.java:90) at groovy.lang.MetaMethod.doMethodInvoke(MetaMethod.java:324) at groovy.lang.MetaClassImpl.invokeMethod(MetaClassImpl.java:1207) at org.codehaus.groovy.runtime.ScriptBytecodeAdapter.invokeMethodOnSuperN(ScriptBytecodeAdapter.java:130) at org.codehaus.groovy.runtime.ScriptBytecodeAdapter.invokeMethodOnSuper0(ScriptBytecodeAdapter.java:150) at org.codehaus.groovy.tools.shell.InteractiveShellRunner.run(InteractiveShellRunner.groovy:83) at org.codehaus.groovy.vmplugin.v7.IndyInterface.selectMethod(IndyInterface.java:215) at com.tinkerpop.gremlin.console.Console.<init>(Console.groovy:95) at org.codehaus.groovy.vmplugin.v7.IndyInterface.selectMethod(IndyInterface.java:215) at com.tinkerpop.gremlin.console.Console.main(Console.groovy:248) gremlin> ```
1.0
Traversal.getEnd() when there is no End to get. - Probably just need to return `EmptyStep`. ```groovy gremlin> g = TinkerFactory.createModern() ==>tinkergraph[vertices:6 edges:6] gremlin> g.V().repeat(__.repeat(__.out()).times(1)).times(1).values('name') -1 Display stack trace? [yN] y java.lang.ArrayIndexOutOfBoundsException: -1 at java.util.ArrayList.elementData(ArrayList.java:403) at java.util.ArrayList.get(ArrayList.java:416) at com.tinkerpop.gremlin.process.util.TraversalHelper.getEnd(TraversalHelper.java:99) at com.tinkerpop.gremlin.process.graph.step.branch.RepeatStep.addRepeatToTraversal(RepeatStep.java:175) at com.tinkerpop.gremlin.process.graph.GraphTraversal.repeat(GraphTraversal.java:516) at com.tinkerpop.gremlin.process.graph.AnonymousGraphTraversal.repeat(AnonymousGraphTraversal.java:425) at com.tinkerpop.gremlin.process.graph.AnonymousGraphTraversal$repeat.call(Unknown Source) at org.codehaus.groovy.runtime.callsite.CallSiteArray.defaultCall(CallSiteArray.java:45) at org.codehaus.groovy.runtime.callsite.AbstractCallSite.call(AbstractCallSite.java:108) at org.codehaus.groovy.runtime.callsite.AbstractCallSite.call(AbstractCallSite.java:116) at groovysh_evaluate.run(groovysh_evaluate:3) at org.codehaus.groovy.vmplugin.v7.IndyInterface.selectMethod(IndyInterface.java:215) at org.codehaus.groovy.tools.shell.Interpreter.evaluate(Interpreter.groovy:68) at org.codehaus.groovy.tools.shell.Groovysh.execute(Groovysh.groovy:159) at org.codehaus.groovy.tools.shell.Shell.leftShift(Shell.groovy:121) at org.codehaus.groovy.tools.shell.ShellRunner.work(ShellRunner.groovy:93) at org.codehaus.groovy.tools.shell.InteractiveShellRunner.super$2$work(InteractiveShellRunner.groovy) at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:483) at 
org.codehaus.groovy.reflection.CachedMethod.invoke(CachedMethod.java:90) at groovy.lang.MetaMethod.doMethodInvoke(MetaMethod.java:324) at groovy.lang.MetaClassImpl.invokeMethod(MetaClassImpl.java:1207) at org.codehaus.groovy.runtime.ScriptBytecodeAdapter.invokeMethodOnSuperN(ScriptBytecodeAdapter.java:130) at org.codehaus.groovy.runtime.ScriptBytecodeAdapter.invokeMethodOnSuper0(ScriptBytecodeAdapter.java:150) at org.codehaus.groovy.tools.shell.InteractiveShellRunner.work(InteractiveShellRunner.groovy:123) at org.codehaus.groovy.tools.shell.ShellRunner.run(ShellRunner.groovy:57) at org.codehaus.groovy.tools.shell.InteractiveShellRunner.super$2$run(InteractiveShellRunner.groovy) at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:483) at org.codehaus.groovy.reflection.CachedMethod.invoke(CachedMethod.java:90) at groovy.lang.MetaMethod.doMethodInvoke(MetaMethod.java:324) at groovy.lang.MetaClassImpl.invokeMethod(MetaClassImpl.java:1207) at org.codehaus.groovy.runtime.ScriptBytecodeAdapter.invokeMethodOnSuperN(ScriptBytecodeAdapter.java:130) at org.codehaus.groovy.runtime.ScriptBytecodeAdapter.invokeMethodOnSuper0(ScriptBytecodeAdapter.java:150) at org.codehaus.groovy.tools.shell.InteractiveShellRunner.run(InteractiveShellRunner.groovy:83) at org.codehaus.groovy.vmplugin.v7.IndyInterface.selectMethod(IndyInterface.java:215) at com.tinkerpop.gremlin.console.Console.<init>(Console.groovy:95) at org.codehaus.groovy.vmplugin.v7.IndyInterface.selectMethod(IndyInterface.java:215) at com.tinkerpop.gremlin.console.Console.main(Console.groovy:248) gremlin> ```
process
traversal getend when there is no end to get probably just need to return emptystep groovy gremlin g tinkerfactory createmodern tinkergraph gremlin g v repeat repeat out times times values name display stack trace y java lang arrayindexoutofboundsexception at java util arraylist elementdata arraylist java at java util arraylist get arraylist java at com tinkerpop gremlin process util traversalhelper getend traversalhelper java at com tinkerpop gremlin process graph step branch repeatstep addrepeattotraversal repeatstep java at com tinkerpop gremlin process graph graphtraversal repeat graphtraversal java at com tinkerpop gremlin process graph anonymousgraphtraversal repeat anonymousgraphtraversal java at com tinkerpop gremlin process graph anonymousgraphtraversal repeat call unknown source at org codehaus groovy runtime callsite callsitearray defaultcall callsitearray java at org codehaus groovy runtime callsite abstractcallsite call abstractcallsite java at org codehaus groovy runtime callsite abstractcallsite call abstractcallsite java at groovysh evaluate run groovysh evaluate at org codehaus groovy vmplugin indyinterface selectmethod indyinterface java at org codehaus groovy tools shell interpreter evaluate interpreter groovy at org codehaus groovy tools shell groovysh execute groovysh groovy at org codehaus groovy tools shell shell leftshift shell groovy at org codehaus groovy tools shell shellrunner work shellrunner groovy at org codehaus groovy tools shell interactiveshellrunner super work interactiveshellrunner groovy at sun reflect nativemethodaccessorimpl native method at sun reflect nativemethodaccessorimpl invoke nativemethodaccessorimpl java at sun reflect delegatingmethodaccessorimpl invoke delegatingmethodaccessorimpl java at java lang reflect method invoke method java at org codehaus groovy reflection cachedmethod invoke cachedmethod java at groovy lang metamethod domethodinvoke metamethod java at groovy lang metaclassimpl invokemethod metaclassimpl 
java at org codehaus groovy runtime scriptbytecodeadapter invokemethodonsupern scriptbytecodeadapter java at org codehaus groovy runtime scriptbytecodeadapter scriptbytecodeadapter java at org codehaus groovy tools shell interactiveshellrunner work interactiveshellrunner groovy at org codehaus groovy tools shell shellrunner run shellrunner groovy at org codehaus groovy tools shell interactiveshellrunner super run interactiveshellrunner groovy at sun reflect nativemethodaccessorimpl native method at sun reflect nativemethodaccessorimpl invoke nativemethodaccessorimpl java at sun reflect delegatingmethodaccessorimpl invoke delegatingmethodaccessorimpl java at java lang reflect method invoke method java at org codehaus groovy reflection cachedmethod invoke cachedmethod java at groovy lang metamethod domethodinvoke metamethod java at groovy lang metaclassimpl invokemethod metaclassimpl java at org codehaus groovy runtime scriptbytecodeadapter invokemethodonsupern scriptbytecodeadapter java at org codehaus groovy runtime scriptbytecodeadapter scriptbytecodeadapter java at org codehaus groovy tools shell interactiveshellrunner run interactiveshellrunner groovy at org codehaus groovy vmplugin indyinterface selectmethod indyinterface java at com tinkerpop gremlin console console console groovy at org codehaus groovy vmplugin indyinterface selectmethod indyinterface java at com tinkerpop gremlin console console main console groovy gremlin
1
202,302
7,046,488,610
IssuesEvent
2018-01-02 08:17:53
intel-analytics/BigDL
https://api.github.com/repos/intel-analytics/BigDL
opened
Model Quantize for mobilenet ssd cannot generate correct results
medium priority
Model Quantize for mobilenet ssd cannot generate correct results, map drops to 0
1.0
Model Quantize for mobilenet ssd cannot generate correct results - Model Quantize for mobilenet ssd cannot generate correct results, map drops to 0
non_process
model quantize for mobilenet ssd cannot generate correct results model quantize for mobilenet ssd cannot generate correct results map drops to
0
20,603
27,268,001,733
IssuesEvent
2023-02-22 19:42:29
cse442-at-ub/project_s23-cinco
https://api.github.com/repos/cse442-at-ub/project_s23-cinco
opened
Convert the homepage and login page into a single page, each being a single frame for seamless transition between the two frames
Processing Task Sprint 1
*Task Tests* test 1: - make sure you are under the "Website" page and there are two separate frames for the homepage and login page, such as: ![figma-homepage-and-loginpage-as-frame.PNG](https://images.zenhubusercontent.com/63e2d05608ee4c45e81f66f0/5119a849-21b6-43da-89c1-584c38c7ac3e) test 2: - check if the login button on the homepage is linked to the login page such as:![figma-login-button-connected-to-loginpage.PNG](https://images.zenhubusercontent.com/63e2d05608ee4c45e81f66f0/e16490da-337a-409f-816e-a5c469a364b2)
1.0
Convert the homepage and login page into a single page, each being a single frame for seamless transition between the two frames - *Task Tests* test 1: - make sure you are under the "Website" page and there are two separate frames for the homepage and login page, such as: ![figma-homepage-and-loginpage-as-frame.PNG](https://images.zenhubusercontent.com/63e2d05608ee4c45e81f66f0/5119a849-21b6-43da-89c1-584c38c7ac3e) test 2: - check if the login button on the homepage is linked to the login page such as:![figma-login-button-connected-to-loginpage.PNG](https://images.zenhubusercontent.com/63e2d05608ee4c45e81f66f0/e16490da-337a-409f-816e-a5c469a364b2)
process
convert the homepage and login page into a single page each being a single frame for seamless transition between the two frames task tests test make sure you are under the website page and there are two separate frames for the homepage and login page such as test check if the login button on the homepage is linked to the login page such as
1
18,394
24,532,003,504
IssuesEvent
2022-10-11 17:14:43
gitpod-io/gitpod
https://api.github.com/repos/gitpod-io/gitpod
closed
Allow dynamic workspace location configuration in .gitpod.yml
type: feature request user experience feature: gitpod yml team: IDE aspect: gitpod loading process
## Is your feature request related to a problem? Please describe <!-- A clear and concise description of what the problem is. Ex. I'm always frustrated when [...] --> I need a .gitpod.yml file that can dynamically read the repository name & set the workspace location to that name. The reason is that I am using a VSCode extension that only shows the UI when there exists a specific file in the repository. However, the default workspace location is `/workspace`, which is not what I want. What I want is: `/workspace/<foobar>`. At the moment, there is no way we can use any environment variables for the `workspaceLocation`. I can only use hard-coded string, which is frustrating. ## Describe the behaviour you'd like <!-- A clear and concise description of what you want to happen. --> Users can set their environment variables or use the pre-defined ones like `$GITPOD_REPO_ROOT` to setup the `workspaceLocation`. With this feature, the `.gitpod.yml` file can be reused in different projects without modifying anything. ## Describe alternatives you've considered <!-- A clear and concise description of any alternative solutions or features you've considered. --> ## Additional context <!-- Add any other context or screenshots about the feature request here. -->
1.0
Allow dynamic workspace location configuration in .gitpod.yml - ## Is your feature request related to a problem? Please describe <!-- A clear and concise description of what the problem is. Ex. I'm always frustrated when [...] --> I need a .gitpod.yml file that can dynamically read the repository name & set the workspace location to that name. The reason is that I am using a VSCode extension that only shows the UI when there exists a specific file in the repository. However, the default workspace location is `/workspace`, which is not what I want. What I want is: `/workspace/<foobar>`. At the moment, there is no way we can use any environment variables for the `workspaceLocation`. I can only use hard-coded string, which is frustrating. ## Describe the behaviour you'd like <!-- A clear and concise description of what you want to happen. --> Users can set their environment variables or use the pre-defined ones like `$GITPOD_REPO_ROOT` to setup the `workspaceLocation`. With this feature, the `.gitpod.yml` file can be reused in different projects without modifying anything. ## Describe alternatives you've considered <!-- A clear and concise description of any alternative solutions or features you've considered. --> ## Additional context <!-- Add any other context or screenshots about the feature request here. -->
process
allow dynamic workspace location configuration in gitpod yml is your feature request related to a problem please describe i need a gitpod yml file that can dynamically read the repository name set the workspace location to that name the reason is that i am using a vscode extension that only shows the ui when there exists a specific file in the repository however the default workspace location is workspace which is not what i want what i want is workspace at the moment there is no way we can use any environment variables for the workspacelocation i can only use hard coded string which is frustrating describe the behaviour you d like users can set their environment variables or use the pre defined ones like gitpod repo root to setup the workspacelocation with this feature the gitpod yml file can be reused in different projects without modifying anything describe alternatives you ve considered additional context
1
4,213
7,177,042,787
IssuesEvent
2018-01-31 12:13:42
zotero/zotero
https://api.github.com/repos/zotero/zotero
closed
Delayed citations: "c is undefined" when choosing item in Quick Format
Regression Word Processor Integration
Mac Word 16.9.1 1. Add/Edit Citation 2. Choose IEEE or Nature style 3. Search for an item, down arrow to focus it, and press Return. ``` Error: c is undefined Source File: chrome://zotero/content/xpcom/citeproc.js Line: 4932 ``` This happens before pressing Return again to close the Quick Format bar. Doesn't happen with, e.g., APA.
1.0
Delayed citations: "c is undefined" when choosing item in Quick Format - Mac Word 16.9.1 1. Add/Edit Citation 2. Choose IEEE or Nature style 3. Search for an item, down arrow to focus it, and press Return. ``` Error: c is undefined Source File: chrome://zotero/content/xpcom/citeproc.js Line: 4932 ``` This happens before pressing Return again to close the Quick Format bar. Doesn't happen with, e.g., APA.
process
delayed citations c is undefined when choosing item in quick format mac word add edit citation choose ieee or nature style search for an item down arrow to focus it and press return error c is undefined source file chrome zotero content xpcom citeproc js line this happens before pressing return again to close the quick format bar doesn t happen with e g apa
1
17,626
23,443,679,186
IssuesEvent
2022-08-15 17:20:51
pytorch/pytorch
https://api.github.com/repos/pytorch/pytorch
closed
frombuffer() returns oversized tensors when multiprocessing
module: multiprocessing triaged
### 🐛 Describe the bug Hi, I am trying to use torch.frombuffer() using a shared_memory buffer with multiprocessing. The function seems to work fine in the main process, but allocates oversized tensors when invoked in the child process: ``` import torch import multiprocessing as mp from multiprocessing import shared_memory def printShape(shm): tensor = torch.frombuffer(shm.buf, dtype = torch.float32) print(tensor) print(tensor.shape) if __name__ == '__main__': test_array = torch.tensor([1, 2, 3], dtype = torch.float32) test_shm = shared_memory.SharedMemory(create = True, size = test_array.element_size() * test_array.numel()) torch.frombuffer(test_shm.buf, dtype = torch.float32).copy_(test_array) printShape(test_shm) p = mp.Process(target = printShape, args = (test_shm,)) p.start() p.join() # Output: # tensor([1., 2., 3.]) # torch.Size([3]) # tensor([1., 2., 3., ..., 0., 0., 0.]) # torch.Size([1024]) ``` This only happens with tensors smaller than a certain size (1024?), large tensors have correct sizes. ### Versions PyTorch version: 1.12.0+cpu Is debug build: False CUDA used to build PyTorch: None ROCM used to build PyTorch: N/A OS: Microsoft Windows 10 Enterprise GCC version: (x86_64-posix-seh, Built by strawberryperl.com project) 8.3.0 Clang version: Could not collect CMake version: Could not collect Libc version: N/A Python version: 3.10.5 (tags/v3.10.5:f377153, Jun 6 2022, 16:14:13) [MSC v.1929 64 bit (AMD64)] (64-bit runtime) Python platform: Windows-10-10.0.19042-SP0 Is CUDA available: False CUDA runtime version: No CUDA GPU models and configuration: No CUDA Nvidia driver version: No CUDA cuDNN version: No CUDA HIP runtime version: N/A MIOpen runtime version: N/A Is XNNPACK available: True Versions of relevant libraries: [pip3] numpy==1.23.0 [pip3] torch==1.12.0 [pip3] torchaudio==0.12.0 [pip3] torchvision==0.13.0 [conda] Could not collect cc @VitalyFedyunin
1.0
frombuffer() returns oversized tensors when multiprocessing - ### 🐛 Describe the bug Hi, I am trying to use torch.frombuffer() using a shared_memory buffer with multiprocessing. The function seems to work fine in the main process, but allocates oversized tensors when invoked in the child process: ``` import torch import multiprocessing as mp from multiprocessing import shared_memory def printShape(shm): tensor = torch.frombuffer(shm.buf, dtype = torch.float32) print(tensor) print(tensor.shape) if __name__ == '__main__': test_array = torch.tensor([1, 2, 3], dtype = torch.float32) test_shm = shared_memory.SharedMemory(create = True, size = test_array.element_size() * test_array.numel()) torch.frombuffer(test_shm.buf, dtype = torch.float32).copy_(test_array) printShape(test_shm) p = mp.Process(target = printShape, args = (test_shm,)) p.start() p.join() # Output: # tensor([1., 2., 3.]) # torch.Size([3]) # tensor([1., 2., 3., ..., 0., 0., 0.]) # torch.Size([1024]) ``` This only happens with tensors smaller than a certain size (1024?), large tensors have correct sizes. ### Versions PyTorch version: 1.12.0+cpu Is debug build: False CUDA used to build PyTorch: None ROCM used to build PyTorch: N/A OS: Microsoft Windows 10 Enterprise GCC version: (x86_64-posix-seh, Built by strawberryperl.com project) 8.3.0 Clang version: Could not collect CMake version: Could not collect Libc version: N/A Python version: 3.10.5 (tags/v3.10.5:f377153, Jun 6 2022, 16:14:13) [MSC v.1929 64 bit (AMD64)] (64-bit runtime) Python platform: Windows-10-10.0.19042-SP0 Is CUDA available: False CUDA runtime version: No CUDA GPU models and configuration: No CUDA Nvidia driver version: No CUDA cuDNN version: No CUDA HIP runtime version: N/A MIOpen runtime version: N/A Is XNNPACK available: True Versions of relevant libraries: [pip3] numpy==1.23.0 [pip3] torch==1.12.0 [pip3] torchaudio==0.12.0 [pip3] torchvision==0.13.0 [conda] Could not collect cc @VitalyFedyunin
process
frombuffer returns oversized tensors when multiprocessing 🐛 describe the bug hi i am trying to use torch frombuffer using a shared memory buffer with multiprocessing the function seems to work fine in the main process but allocates oversized tensors when invoked in the child process import torch import multiprocessing as mp from multiprocessing import shared memory def printshape shm tensor torch frombuffer shm buf dtype torch print tensor print tensor shape if name main test array torch tensor dtype torch test shm shared memory sharedmemory create true size test array element size test array numel torch frombuffer test shm buf dtype torch copy test array printshape test shm p mp process target printshape args test shm p start p join output tensor torch size tensor torch size this only happens with tensors smaller than a certain size large tensors have correct sizes versions pytorch version cpu is debug build false cuda used to build pytorch none rocm used to build pytorch n a os microsoft windows enterprise gcc version posix seh built by strawberryperl com project clang version could not collect cmake version could not collect libc version n a python version tags jun bit runtime python platform windows is cuda available false cuda runtime version no cuda gpu models and configuration no cuda nvidia driver version no cuda cudnn version no cuda hip runtime version n a miopen runtime version n a is xnnpack available true versions of relevant libraries numpy torch torchaudio torchvision could not collect cc vitalyfedyunin
1
19,347
25,479,198,534
IssuesEvent
2022-11-25 17:56:09
googleapis/python-source-context
https://api.github.com/repos/googleapis/python-source-context
closed
Your .repo-metadata.json file has a problem 🤒
type: process repo-metadata: lint
You have a problem with your .repo-metadata.json file: Result of scan 📈: * api_shortname 'source' invalid in .repo-metadata.json ☝️ Once you address these problems, you can close this issue. ### Need help? * [Schema definition](https://github.com/googleapis/repo-automation-bots/blob/main/packages/repo-metadata-lint/src/repo-metadata-schema.json): lists valid options for each field. * [API index](https://github.com/googleapis/googleapis/blob/master/api-index-v1.json): for gRPC libraries **api_shortname** should match the subdomain of an API's **hostName**. * Reach out to **go/github-automation** if you have any questions.
1.0
Your .repo-metadata.json file has a problem 🤒 - You have a problem with your .repo-metadata.json file: Result of scan 📈: * api_shortname 'source' invalid in .repo-metadata.json ☝️ Once you address these problems, you can close this issue. ### Need help? * [Schema definition](https://github.com/googleapis/repo-automation-bots/blob/main/packages/repo-metadata-lint/src/repo-metadata-schema.json): lists valid options for each field. * [API index](https://github.com/googleapis/googleapis/blob/master/api-index-v1.json): for gRPC libraries **api_shortname** should match the subdomain of an API's **hostName**. * Reach out to **go/github-automation** if you have any questions.
process
your repo metadata json file has a problem 🤒 you have a problem with your repo metadata json file result of scan 📈 api shortname source invalid in repo metadata json ☝️ once you address these problems you can close this issue need help lists valid options for each field for grpc libraries api shortname should match the subdomain of an api s hostname reach out to go github automation if you have any questions
1
3,898
2,711,461,924
IssuesEvent
2015-04-09 06:28:37
PlayWithMagic/PlayWithMagic
https://api.github.com/repos/PlayWithMagic/PlayWithMagic
closed
Video of Routine
Design feature Icon Control Routine Upload Image/Video
We would need to worry about -- Video isn't spam -- Video is of routine -- Video is not inappropriate (though YouTube will take care of this for us) -- Reporting posted videos -- How do we handle multiple videos? -- Voting on quality of video?
1.0
Video of Routine - We would need to worry about -- Video isn't spam -- Video is of routine -- Video is not inappropriate (though YouTube will take care of this for us) -- Reporting posted videos -- How do we handle multiple videos? -- Voting on quality of video?
non_process
video of routine we would need to worry about video isn t spam video is of routine video is not inappropriate though youtube will take care of this for us reporting posted videos how do we handle multiple videos voting on quality of video
0
286
2,725,175,902
IssuesEvent
2015-04-14 22:03:52
hammerlab/pileup.js
https://api.github.com/repos/hammerlab/pileup.js
opened
Make a unified "watch" task
process
There's `grunt browserify:watchTest` and `grunt browserify:watchDist` and you have to pick one of them (or use two terminals). It would be nice if there were a simple "watch" task.
1.0
Make a unified "watch" task - There's `grunt browserify:watchTest` and `grunt browserify:watchDist` and you have to pick one of them (or use two terminals). It would be nice if there were a simple "watch" task.
process
make a unified watch task there s grunt browserify watchtest and grunt browserify watchdist and you have to pick one of them or use two terminals it would be nice if there were a simple watch task
1
1,586
4,178,050,139
IssuesEvent
2016-06-22 04:08:24
willdwyer/bcbsmaissuestracker
https://api.github.com/repos/willdwyer/bcbsmaissuestracker
closed
HighRoads Certificate Renewal
Component-UI Environment-Production Priority-ShowStopper Status- In-Process To_Be_Addressed_In_V1 Type-Task
Hello, The certificate that is used in Production to support SSO is expiring on June 27th. Can you please open a request with your support team to deploy the updated certificate? We would like to schedule this for either night of: Thursday, June 16th or Tuesday, June 21st (9 PM either day). Please let us know which night works for you.
1.0
HighRoads Certificate Renewal - Hello, The certificate that is used in Production to support SSO is expiring on June 27th. Can you please open a request with your support team to deploy the updated certificate? We would like to schedule this for either night of: Thursday, June 16th or Tuesday, June 21st (9 PM either day). Please let us know which night works for you.
process
highroads certificate renewal hello the certificate that is used in production to support sso is expiring on june can you please open a request with your support team to deploy the updated certificate we would like to schedule this for either night of thursday june or tuesday june pm either day please let us know which night works for you
1
11,375
3,487,916,726
IssuesEvent
2016-01-02 12:39:45
flycheck/flycheck
https://api.github.com/repos/flycheck/flycheck
closed
Code of conduct
documentation idea in progress
Before we open and announce a Gitter channel for Flycheck (see #796), I'd like to setup a code of conduct for Flycheck that defines what we consider acceptable and desired behaviour within our community and within our communication channels. Since I do not have a lot of experience with this topic, and since I'd not like to re-invent the wheel, I think it's best to copy from a successful community. Specifically, I've read [Rust's code of conduct](https://www.rust-lang.org/conduct.html). I like it because it's quite elaborate on acceptable communication; specifically it forbids insults and setups a reasonable policy on banning users. We didn't have any incidents yet, but better safe than sorry :)
1.0
Code of conduct - Before we open and announce a Gitter channel for Flycheck (see #796), I'd like to setup a code of conduct for Flycheck that defines what we consider acceptable and desired behaviour within our community and within our communication channels. Since I do not have a lot of experience with this topic, and since I'd not like to re-invent the wheel, I think it's best to copy from a successful community. Specifically, I've read [Rust's code of conduct](https://www.rust-lang.org/conduct.html). I like it because it's quite elaborate on acceptable communication; specifically it forbids insults and setups a reasonable policy on banning users. We didn't have any incidents yet, but better safe than sorry :)
non_process
code of conduct before we open and announce a gitter channel for flycheck see i d like to setup a code of conduct for flycheck that defines what we consider acceptable and desired behaviour within our community and within our communication channels since i do not have a lot of experience with this topic and since i d not like to re invent the wheel i think it s best to copy from a successful community specifically i ve read i like it because it s quite elaborate on acceptable communication specifically it forbids insults and setups a reasonable policy on banning users we didn t have any incidents yet but better safe than sorry
0
3,719
6,732,880,114
IssuesEvent
2017-10-18 13:10:14
lockedata/rcms
https://api.github.com/repos/lockedata/rcms
opened
Build agenda
conference team osem processes
## Detailed task - Create a schedule over multiple rooms (and days if required) - Publish agenda ## Assessing the task Try to perform the task. Use google and the system documentation to help - part of what we're trying to assess how easy it is for people to work out how to do tasks. Use a 👍 (`:+1:`) reaction to this task if you were able to perform the task. Use a 👎 (`:-1:`) reaction to the task if you could not complete it. Add a reply with any comments or feedback. ## Extra Info - Site: [osem](https://intense-shore-93790.herokuapp.com/) - System documentation: [osem docs](http://osem.io/) - Role: Conference team - Area: Processes
1.0
Build agenda - ## Detailed task - Create a schedule over multiple rooms (and days if required) - Publish agenda ## Assessing the task Try to perform the task. Use google and the system documentation to help - part of what we're trying to assess how easy it is for people to work out how to do tasks. Use a 👍 (`:+1:`) reaction to this task if you were able to perform the task. Use a 👎 (`:-1:`) reaction to the task if you could not complete it. Add a reply with any comments or feedback. ## Extra Info - Site: [osem](https://intense-shore-93790.herokuapp.com/) - System documentation: [osem docs](http://osem.io/) - Role: Conference team - Area: Processes
process
build agenda detailed task create a schedule over multiple rooms and days if required publish agenda assessing the task try to perform the task use google and the system documentation to help part of what we re trying to assess how easy it is for people to work out how to do tasks use a 👍 reaction to this task if you were able to perform the task use a 👎 reaction to the task if you could not complete it add a reply with any comments or feedback extra info site system documentation role conference team area processes
1
10,930
13,733,313,441
IssuesEvent
2020-10-05 06:52:37
symfony/symfony
https://api.github.com/repos/symfony/symfony
closed
symfony/process documentation does not mention it is included in symfony/symfony
Bug Process Status: Needs Review
**Symfony version(s) affected**: 3.4.0 **Description** The documentation does not make clear that the component is part of the symfony/symfony bundle. **How to reproduce** Create a package and use symfony/processs:^3.4 as a dependency. The package build works fine until you try and use it in a 3.4 project, you cannot get symfony/project to install. I spent a few hours chasing this problem because of other errors thinking it was a composer issue until I noticed this: ``` bash $ composer depends symfony/process my-project/proj my-test-branch requires symfony/process (^3.4) symfony/symfony v3.4.0 replaces symfony/process (self.version) ``` symfony/symfony replaces the independent bundle. If I had known this by looking at the documentation I could have been a little less grey haired! **Possible Solution** Please add a comment towards the top of the Process documentation that states that this package is actually part of the symfony/symfony package and does not not need to be included as a separate package if you are using this. In fact why is this actually listed as a separate package anyway? Either it's a component of Symfony or it's not! Making it both is confusing and wastes a lot of development time when you get it wrong.
1.0
symfony/process documentation does not mention it is included in symfony/symfony - **Symfony version(s) affected**: 3.4.0 **Description** The documentation does not make clear that the component is part of the symfony/symfony bundle. **How to reproduce** Create a package and use symfony/processs:^3.4 as a dependency. The package build works fine until you try and use it in a 3.4 project, you cannot get symfony/project to install. I spent a few hours chasing this problem because of other errors thinking it was a composer issue until I noticed this: ``` bash $ composer depends symfony/process my-project/proj my-test-branch requires symfony/process (^3.4) symfony/symfony v3.4.0 replaces symfony/process (self.version) ``` symfony/symfony replaces the independent bundle. If I had known this by looking at the documentation I could have been a little less grey haired! **Possible Solution** Please add a comment towards the top of the Process documentation that states that this package is actually part of the symfony/symfony package and does not not need to be included as a separate package if you are using this. In fact why is this actually listed as a separate package anyway? Either it's a component of Symfony or it's not! Making it both is confusing and wastes a lot of development time when you get it wrong.
process
symfony process documentation does not mention it is included in symfony symfony symfony version s affected description the documentation does not make clear that the component is part of the symfony symfony bundle how to reproduce create a package and use symfony processs as a dependency the package build works fine until you try and use it in a project you cannot get symfony project to install i spent a few hours chasing this problem because of other errors thinking it was a composer issue until i noticed this bash composer depends symfony process my project proj my test branch requires symfony process symfony symfony replaces symfony process self version symfony symfony replaces the independent bundle if i had known this by looking at the documentation i could have been a little less grey haired possible solution please add a comment towards the top of the process documentation that states that this package is actually part of the symfony symfony package and does not not need to be included as a separate package if you are using this in fact why is this actually listed as a separate package anyway either it s a component of symfony or it s not making it both is confusing and wastes a lot of development time when you get it wrong
1
1,271
3,800,630,903
IssuesEvent
2016-03-23 19:48:14
mapbox/mapbox-gl-js
https://api.github.com/repos/mapbox/mapbox-gl-js
closed
Replace prova with another test runner
meta testing & release process
Prova was unpublished from npm just now :skull: Unless this is a temporary glitch, we'll need to find a long-term replacement for ~~npm~~ Prova - https://www.npmjs.com/package/prova - https://github.com/azer/prova
1.0
Replace prova with another test runner - Prova was unpublished from npm just now :skull: Unless this is a temporary glitch, we'll need to find a long-term replacement for ~~npm~~ Prova - https://www.npmjs.com/package/prova - https://github.com/azer/prova
process
replace prova with another test runner prova was unpublished from npm just now skull unless this is a temporary glitch we ll need to find a long term replacement for npm prova
1
3,959
6,893,597,959
IssuesEvent
2017-11-23 05:17:28
dotnet/corefx
https://api.github.com/repos/dotnet/corefx
closed
System.Diagnostics.Process.StartTime can be improved in Windows
area-System.Diagnostics.Process enhancement
When Process instance is created using Process.GetProcesses() , it calls GetProcessInfos method inside. In WIndows it calls to internal [NtQuerySystemInformation](https://github.com/dotnet/corefx/blob/master/src/System.Diagnostics.Process/src/System/Diagnostics/ProcessManager.Win32.cs) which returns process creation time. [ProcessInfo](https://github.com/dotnet/corefx/blob/master/src/System.Diagnostics.Process/src/System/Diagnostics/ProcessInfo.cs) can be initialized using this value and pass it into Process. This allows getting StartTime property without any exception thrown.
1.0
System.Diagnostics.Process.StartTime can be improved in Windows - When Process instance is created using Process.GetProcesses() , it calls GetProcessInfos method inside. In WIndows it calls to internal [NtQuerySystemInformation](https://github.com/dotnet/corefx/blob/master/src/System.Diagnostics.Process/src/System/Diagnostics/ProcessManager.Win32.cs) which returns process creation time. [ProcessInfo](https://github.com/dotnet/corefx/blob/master/src/System.Diagnostics.Process/src/System/Diagnostics/ProcessInfo.cs) can be initialized using this value and pass it into Process. This allows getting StartTime property without any exception thrown.
process
system diagnostics process starttime can be improved in windows when process instance is created using process getprocesses it calls getprocessinfos method inside in windows it calls to internal which returns process creation time can be initialized using this value and pass it into process this allows getting starttime property without any exception thrown
1
31,454
8,698,457,053
IssuesEvent
2018-12-04 23:31:21
apache/incubator-mxnet
https://api.github.com/repos/apache/incubator-mxnet
closed
cpp_package instructions need clarification and examples need repair
Breaking Build C++ Example
I tried following the [instructions for building c++ examples](https://github.com/apache/incubator-mxnet/tree/master/cpp-package#building-c-examples-in-examples-folder). The resulting binaries don't work out of the box, and after troubleshooting some terminate abruptly, core dump or do nothing. ``` ./alexnet: error while loading shared libraries: libmxnet.so: cannot open shared object file: No such file or directory ``` I worked on the example C++ for image prediction (#12397) that has its own Makefile that points to libmxnet.so specifically, and was able to get it to run without hunting for the mxnet library. The Makefile for the `cpp-package/example` files does not mention the library at all. Maybe that's a problem? The steps in the README could use some clarification. Eventually after trying a bunch of things, I found a reference that led me to try this: ``` export LD_LIBRARY_PATH=~/incubator-mxnet/lib ``` And some of the examples started working! At least they're finding the library. I can add this line to the instructions if this is what we should recommend that people do after building from source with the cpp package flag turned on. Is it? But, then there's also the fact that several the examples core dump: ``` ./mlp_gpu [21:13:44] src/io/iter_mnist.cc:110: MNISTIter: load 60000 images, shuffle=1, shape=(100,784) [21:13:44] src/io/iter_mnist.cc:110: MNISTIter: load 10000 images, shuffle=1, shape=(100,784) terminate called after throwing an instance of 'dmlc::Error' what(): [21:13:44] ../include/mxnet-cpp/ndarray.hpp:54: Check failed: MXNDArrayCreate(shape.data(), shape.size(), context.GetDeviceType(), context.GetDeviceId(), delay_alloc, &handle) == 0 (-1 vs. 
0) ``` ``` ./inception_bn terminate called after throwing an instance of 'dmlc::Error' what(): [21:15:16] ../include/mxnet-cpp/symbol.hpp:219: Check failed: MXSymbolInferShape(GetHandle(), keys.size(), keys.data(), arg_ind_ptr.data(), arg_shape_data.data(), &in_shape_size, &in_shape_ndim, &in_shape_data, &out_shape_size, &out_shape_ndim, &out_shape_data, &aux_shape_size, &aux_shape_ndim, &aux_shape_data, &complete) == 0 (-1 vs. 0) ``` ``` ./test_score terminate called after throwing an instance of 'std::logic_error' what(): basic_string::_M_construct null not valid Aborted (core dumped) ``` Or do nothing: ``` ./test_optimizer ```
1.0
cpp_package instructions need clarification and examples need repair - I tried following the [instructions for building c++ examples](https://github.com/apache/incubator-mxnet/tree/master/cpp-package#building-c-examples-in-examples-folder). The resulting binaries don't work out of the box, and after troubleshooting some terminate abruptly, core dump or do nothing. ``` ./alexnet: error while loading shared libraries: libmxnet.so: cannot open shared object file: No such file or directory ``` I worked on the example C++ for image prediction (#12397) that has its own Makefile that points to libmxnet.so specifically, and was able to get it to run without hunting for the mxnet library. The Makefile for the `cpp-package/example` files does not mention the library at all. Maybe that's a problem? The steps in the README could use some clarification. Eventually after trying a bunch of things, I found a reference that led me to try this: ``` export LD_LIBRARY_PATH=~/incubator-mxnet/lib ``` And some of the examples started working! At least they're finding the library. I can add this line to the instructions if this is what we should recommend that people do after building from source with the cpp package flag turned on. Is it? But, then there's also the fact that several the examples core dump: ``` ./mlp_gpu [21:13:44] src/io/iter_mnist.cc:110: MNISTIter: load 60000 images, shuffle=1, shape=(100,784) [21:13:44] src/io/iter_mnist.cc:110: MNISTIter: load 10000 images, shuffle=1, shape=(100,784) terminate called after throwing an instance of 'dmlc::Error' what(): [21:13:44] ../include/mxnet-cpp/ndarray.hpp:54: Check failed: MXNDArrayCreate(shape.data(), shape.size(), context.GetDeviceType(), context.GetDeviceId(), delay_alloc, &handle) == 0 (-1 vs. 
0) ``` ``` ./inception_bn terminate called after throwing an instance of 'dmlc::Error' what(): [21:15:16] ../include/mxnet-cpp/symbol.hpp:219: Check failed: MXSymbolInferShape(GetHandle(), keys.size(), keys.data(), arg_ind_ptr.data(), arg_shape_data.data(), &in_shape_size, &in_shape_ndim, &in_shape_data, &out_shape_size, &out_shape_ndim, &out_shape_data, &aux_shape_size, &aux_shape_ndim, &aux_shape_data, &complete) == 0 (-1 vs. 0) ``` ``` ./test_score terminate called after throwing an instance of 'std::logic_error' what(): basic_string::_M_construct null not valid Aborted (core dumped) ``` Or do nothing: ``` ./test_optimizer ```
non_process
cpp package instructions need clarification and examples need repair i tried following the the resulting binaries don t work out of the box and after troubleshooting some terminate abruptly core dump or do nothing alexnet error while loading shared libraries libmxnet so cannot open shared object file no such file or directory i worked on the example c for image prediction that has its own makefile that points to libmxnet so specifically and was able to get it to run without hunting for the mxnet library the makefile for the cpp package example files does not mention the library at all maybe that s a problem the steps in the readme could use some clarification eventually after trying a bunch of things i found a reference that led me to try this export ld library path incubator mxnet lib and some of the examples started working at least they re finding the library i can add this line to the instructions if this is what we should recommend that people do after building from source with the cpp package flag turned on is it but then there s also the fact that several the examples core dump mlp gpu src io iter mnist cc mnistiter load images shuffle shape src io iter mnist cc mnistiter load images shuffle shape terminate called after throwing an instance of dmlc error what include mxnet cpp ndarray hpp check failed mxndarraycreate shape data shape size context getdevicetype context getdeviceid delay alloc handle vs inception bn terminate called after throwing an instance of dmlc error what include mxnet cpp symbol hpp check failed mxsymbolinfershape gethandle keys size keys data arg ind ptr data arg shape data data in shape size in shape ndim in shape data out shape size out shape ndim out shape data aux shape size aux shape ndim aux shape data complete vs test score terminate called after throwing an instance of std logic error what basic string m construct null not valid aborted core dumped or do nothing test optimizer
0
76,960
9,975,223,378
IssuesEvent
2019-07-09 12:36:40
JDASoftwareGroup/kartothek
https://api.github.com/repos/JDASoftwareGroup/kartothek
closed
Add documentation about store factories
documentation good first issue
We heavily rely on the usage of store factories to not (attempt to) serialize connections. We should add a section to the documentation explaining the reasoning, introduce examples and establish best practices (e.g. don't use lambdas)
1.0
Add documentation about store factories - We heavily rely on the usage of store factories to not (attempt to) serialize connections. We should add a section to the documentation explaining the reasoning, introduce examples and establish best practices (e.g. don't use lambdas)
non_process
add documentation about store factories we heavily rely on the usage of store factories to not attempt to serialize connections we should add a section to the documentation explaining the reasoning introduce examples and establish best practices e g don t use lambdas
0
120,987
10,145,397,513
IssuesEvent
2019-08-05 04:05:24
cockroachdb/cockroach
https://api.github.com/repos/cockroachdb/cockroach
closed
teamcity: failed test: TestNot
C-test-failure O-robot
The following tests appear to have failed on master (test): TestNot/1011101010111110111110101100111011001010111111101011101010111110 You may want to check [for open issues](https://github.com/cockroachdb/cockroach/issues?q=is%3Aissue+is%3Aopen+TestNot). [#1421766](https://teamcity.cockroachdb.com/viewLog.html?buildId=1421766): ``` TestNot/1011101010111110111110101100111011001010111111101011101010111110 --- FAIL: test/TestNot/1011101010111110111110101100111011001010111111101011101010111110 (0.000s) Test ended in panic. ``` Please assign, take a look and update the issue accordingly.
1.0
teamcity: failed test: TestNot - The following tests appear to have failed on master (test): TestNot/1011101010111110111110101100111011001010111111101011101010111110 You may want to check [for open issues](https://github.com/cockroachdb/cockroach/issues?q=is%3Aissue+is%3Aopen+TestNot). [#1421766](https://teamcity.cockroachdb.com/viewLog.html?buildId=1421766): ``` TestNot/1011101010111110111110101100111011001010111111101011101010111110 --- FAIL: test/TestNot/1011101010111110111110101100111011001010111111101011101010111110 (0.000s) Test ended in panic. ``` Please assign, take a look and update the issue accordingly.
non_process
teamcity failed test testnot the following tests appear to have failed on master test testnot you may want to check testnot fail test testnot test ended in panic please assign take a look and update the issue accordingly
0
243,505
7,858,706,530
IssuesEvent
2018-06-21 14:36:57
larsiusprime/tdrpg-bugs
https://api.github.com/repos/larsiusprime/tdrpg-bugs
closed
ET dying doesn't trigger the ending, waits forever
1 Battle DQ1 Priority HIGH in progress
2.0.8c - steam, win7sp1x64, HD mode@1704x960windowed ![dq-zelemir-waiting-forever](https://cloud.githubusercontent.com/assets/16719964/17647936/9456e814-6205-11e6-9383-cd8fad204dad.png) The game has been continuing several minutes at 16X. Everyone's animating, Zelemir is occasionally making everyone have a Frenzy icon, but nothing more is happening. I had just beaten ET by getting his last bearded self down to 0 HP, and he exploded into psi stars IIRC.
1.0
ET dying doesn't trigger the ending, waits forever - 2.0.8c - steam, win7sp1x64, HD mode@1704x960windowed ![dq-zelemir-waiting-forever](https://cloud.githubusercontent.com/assets/16719964/17647936/9456e814-6205-11e6-9383-cd8fad204dad.png) The game has been continuing several minutes at 16X. Everyone's animating, Zelemir is occasionally making everyone have a Frenzy icon, but nothing more is happening. I had just beaten ET by getting his last bearded self down to 0 HP, and he exploded into psi stars IIRC.
non_process
et dying doesn t trigger the ending waits forever steam hd mode the game has been continuing several minutes at everyone s animating zelemir is occasionally making everyone have a frenzy icon but nothing more is happening i had just beaten et by getting his last bearded self down to hp and he exploded into psi stars iirc
0
194,064
15,396,202,350
IssuesEvent
2021-03-03 20:18:15
enthought/enable
https://api.github.com/repos/enthought/enable
closed
Place Kiva docs before Enable docs?
component: documentation discussion
Since `Kiva` is the lower level library, it might make sense to have the documentation set up so that the kiva docs appear first in the table of contents and then the `Enable` docs follow (enable builds on kiva so good to understand kiva first?). I could however imagine arguments for leaving it as is. This is a very minor issue, but trivial to fix if we decide to do so.
1.0
Place Kiva docs before Enable docs? - Since `Kiva` is the lower level library, it might make sense to have the documentation set up so that the kiva docs appear first in the table of contents and then the `Enable` docs follow (enable builds on kiva so good to understand kiva first?). I could however imagine arguments for leaving it as is. This is a very minor issue, but trivial to fix if we decide to do so.
non_process
place kiva docs before enable docs since kiva is the lower level library it might make sense to have the documentation set up so that the kiva docs appear first in the table of contents and then the enable docs follow enable builds on kiva so good to understand kiva first i could however imagine arguments for leaving it as is this is a very minor issue but trivial to fix if we decide to do so
0
21,235
28,350,535,686
IssuesEvent
2023-04-12 02:00:07
lizhihao6/get-daily-arxiv-noti
https://api.github.com/repos/lizhihao6/get-daily-arxiv-noti
opened
New submissions for Wed, 12 Apr 23
event camera white balance isp compression image signal processing image signal process raw raw image events camera color contrast events AWB
## Keyword: events ### EvAC3D: From Event-based Apparent Contours to 3D Models via Continuous Visual Hulls - **Authors:** Ziyun Wang, Kenneth Chaney, Kostas Daniilidis - **Subjects:** Computer Vision and Pattern Recognition (cs.CV) - **Arxiv link:** https://arxiv.org/abs/2304.05296 - **Pdf link:** https://arxiv.org/pdf/2304.05296 - **Abstract** 3D reconstruction from multiple views is a successful computer vision field with multiple deployments in applications. State of the art is based on traditional RGB frames that enable optimization of photo-consistency cross views. In this paper, we study the problem of 3D reconstruction from event-cameras, motivated by the advantages of event-based cameras in terms of low power and latency as well as by the biological evidence that eyes in nature capture the same data and still perceive well 3D shape. The foundation of our hypothesis that 3D reconstruction is feasible using events lies in the information contained in the occluding contours and in the continuous scene acquisition with events. We propose Apparent Contour Events (ACE), a novel event-based representation that defines the geometry of the apparent contour of an object. We represent ACE by a spatially and temporally continuous implicit function defined in the event x-y-t space. Furthermore, we design a novel continuous Voxel Carving algorithm enabled by the high temporal resolution of the Apparent Contour Events. To evaluate the performance of the method, we collect MOEC-3D, a 3D event dataset of a set of common real-world objects. We demonstrate the ability of EvAC3D to reconstruct high-fidelity mesh surfaces from real event sequences while allowing the refinement of the 3D reconstruction for each individual event. 
### ELVIS: Empowering Locality of Vision Language Pre-training with Intra-modal Similarity
- **Authors:** Sumin Seo, JaeWoong Shin, Jaewoo Kang, Tae Soo Kim, Thijs Kooi
- **Subjects:** Computer Vision and Pattern Recognition (cs.CV); Computation and Language (cs.CL)
- **Arxiv link:** https://arxiv.org/abs/2304.05303
- **Pdf link:** https://arxiv.org/pdf/2304.05303
- **Abstract** Deep learning has shown great potential in assisting radiologists in reading chest X-ray (CXR) images, but its need for expensive annotations for improving performance prevents widespread clinical application. Visual language pre-training (VLP) can alleviate the burden and cost of annotation by leveraging routinely generated reports for radiographs, which exist in large quantities as well as in paired form (image-text pairs). Additionally, extensions to localization-aware VLPs are being proposed to address the need for accurate localization of abnormalities for CAD in CXR. However, we find that the formulation proposed by the locality-aware VLP literature actually leads to a loss of the spatial relationships required for downstream localization tasks. Therefore, we propose Empowering Locality of VLP with Intra-modal Similarity (ELVIS), a VLP method aware of intra-modal locality, to better preserve the locality within radiographs or reports, which enhances the ability to comprehend location references in text reports. Our locality-aware VLP method significantly outperforms state-of-the-art baselines in multiple segmentation tasks and the MS-CXR phrase grounding task. Qualitatively, ELVIS is able to focus well on regions of interest described in the report text compared to prior approaches, allowing for enhanced interpretability.
## Keyword: event camera

There is no result

## Keyword: events camera

There is no result

## Keyword: white balance

There is no result

## Keyword: color contrast

There is no result

## Keyword: AWB

### Multi-Graph Convolution Network for Pose Forecasting
- **Authors:** Hongwei Ren, Yuhong Shi, Kewei Liang
- **Subjects:** Computer Vision and Pattern Recognition (cs.CV)
- **Arxiv link:** https://arxiv.org/abs/2304.04956
- **Pdf link:** https://arxiv.org/pdf/2304.04956
- **Abstract** Recently, there has been a growing interest in predicting human motion, which involves forecasting future body poses based on observed pose sequences. This task is complex due to modeling spatial and temporal relationships. The most commonly used models for this task are autoregressive models, such as recurrent neural networks (RNNs) or variants, and Transformer networks. However, RNNs have several drawbacks, such as vanishing or exploding gradients. Other researchers have attempted to solve the communication problem in the spatial dimension by integrating Graph Convolutional Networks (GCN) and Long Short-Term Memory (LSTM) models. These works deal with temporal and spatial information separately, which limits their effectiveness. To fix this problem, we propose a novel approach called the multi-graph convolution network (MGCN) for 3D human pose forecasting. This model simultaneously captures spatial and temporal information by introducing an augmented graph for pose sequences. Multiple frames give multiple parts, joined together in a single graph instance. Furthermore, we also explore the influence of natural structure and sequence-aware attention on our model. In our experimental evaluation on the large-scale benchmark datasets Human3.6M, AMSS and 3DPW, MGCN outperforms the state-of-the-art in pose prediction.
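The augmented-graph idea above — multiple frames joined into a single graph instance — can be sketched as a block adjacency matrix. This is a hedged illustration, not MGCN's actual graph: the 3-joint chain, the frame count, and the choice of temporal edges between consecutive copies of the same joint are all assumptions.

```python
import numpy as np

def augmented_adjacency(spatial_edges, n_joints, n_frames):
    """Build one adjacency matrix over n_frames copies of a skeleton.

    Spatial edges connect joints within a frame; temporal edges connect
    the same joint in consecutive frames, so a single graph spans the
    whole pose sequence.
    """
    n = n_joints * n_frames
    A = np.zeros((n, n))
    for t in range(n_frames):
        off = t * n_joints
        for i, j in spatial_edges:            # intra-frame (spatial) edges
            A[off + i, off + j] = A[off + j, off + i] = 1
        if t + 1 < n_frames:                  # inter-frame (temporal) edges
            for j in range(n_joints):
                a, b = off + j, off + n_joints + j
                A[a, b] = A[b, a] = 1
    return A

# Toy 3-joint chain (hip-knee-ankle, say) over 4 frames.
A = augmented_adjacency([(0, 1), (1, 2)], n_joints=3, n_frames=4)
```

A graph convolution applied to this 12-node graph then mixes spatial and temporal information in one pass, instead of handling the two dimensions with separate modules.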
## Keyword: ISP

### Estimation of Vehicular Velocity based on Non-Intrusive stereo camera
- **Authors:** Bikram Adhikari, Prabin Bhandari
- **Subjects:** Computer Vision and Pattern Recognition (cs.CV); Machine Learning (cs.LG)
- **Arxiv link:** https://arxiv.org/abs/2304.05298
- **Pdf link:** https://arxiv.org/pdf/2304.05298
- **Abstract** The paper presents a modular approach for the estimation of a leading vehicle's velocity based on a non-intrusive stereo camera, where SiamMask is used for leading-vehicle tracking, a Kernel Density Estimate (KDE) is used to smooth the distance prediction from a disparity map, and LightGBM is used for leading-vehicle velocity estimation. Our approach yields an RMSE of 0.416, which outperforms the baseline RMSE of 0.582 for the SUBARU Image Recognition Challenge.

### Pinpointing Why Object Recognition Performance Degrades Across Income Levels and Geographies
- **Authors:** Laura Gustafson, Megan Richards, Melissa Hall, Caner Hazirbas, Diane Bouchacourt, Mark Ibrahim
- **Subjects:** Computer Vision and Pattern Recognition (cs.CV)
- **Arxiv link:** https://arxiv.org/abs/2304.05391
- **Pdf link:** https://arxiv.org/pdf/2304.05391
- **Abstract** Despite impressive advances in object recognition, deep learning systems' performance degrades significantly across geographies and lower income levels, raising pressing concerns of inequity. Addressing such performance gaps remains a challenge, as little is understood about why performance degrades across incomes or geographies. We take a step in this direction by annotating images from Dollar Street, a popular benchmark of geographically and economically diverse images, labeling each image with factors such as color, shape, and background. These annotations unlock a new granular view into how objects differ across incomes and regions. We then use these object differences to pinpoint model vulnerabilities across incomes and regions.
We study a range of modern vision models, finding that performance disparities are most associated with differences in texture, occlusion, and images with darker lighting. We illustrate how insights from our factor labels can surface mitigations to improve models' performance disparities. As an example, we show that mitigating a model's vulnerability to texture can improve performance on the lower income level. We release all the factor annotations along with an interactive dashboard to facilitate research into more equitable vision systems.

## Keyword: image signal processing

There is no result

## Keyword: image signal process

There is no result

## Keyword: compression

### Multi-scale Fusion Fault Diagnosis Method Based on Two-Dimensionaliztion Sequence in Complex Scenarios
- **Authors:** Weiyang Jin
- **Subjects:** Computer Vision and Pattern Recognition (cs.CV)
- **Arxiv link:** https://arxiv.org/abs/2304.05198
- **Pdf link:** https://arxiv.org/pdf/2304.05198
- **Abstract** Rolling bearings are critical components in rotating machinery, and their faults can cause severe damage. Early detection of abnormalities is crucial to prevent catastrophic accidents. Traditional and intelligent methods have been used to analyze time series data, but in real-life scenarios, sensor data is often noisy and cannot be accurately characterized in the time domain, leading to mode collapse in trained models. Two-dimensionalization methods such as the Gramian Angular Field (GAF) method or interval sampling have been proposed, but they lack mathematical derivation and interpretability. This paper proposes an improved GAF combined with grayscale images for convolution scenarios. The main contributions include illustrating the feasibility of the approach in complex scenarios, widening the data set, and introducing an improved convolutional neural network method with a multi-scale feature fusion diffusion model and deep learning compression techniques for deployment in industrial scenarios.
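For context, the classic GAF construction referenced above has a simple closed form; this sketch shows the textbook summation variant (GASF), not the paper's improved version: rescale the series to [-1, 1], take φ = arccos(x̃), and set G_ij = cos(φ_i + φ_j).

```python
import numpy as np

def gasf(x):
    """Gramian Angular Summation Field of a 1-D series."""
    x = np.asarray(x, dtype=float)
    # Min-max rescale to [-1, 1] so arccos is defined.
    x_t = 2 * (x - x.min()) / (x.max() - x.min()) - 1
    phi = np.arccos(np.clip(x_t, -1, 1))
    # cos(phi_i + phi_j) = x_i * x_j - sqrt(1 - x_i^2) * sqrt(1 - x_j^2)
    return np.cos(phi[:, None] + phi[None, :])

# Turn a 64-sample vibration-like signal into a 64x64 image for a CNN.
G = gasf(np.sin(np.linspace(0, 4 * np.pi, 64)))
```

The resulting symmetric image preserves temporal order along its diagonal, which is what makes the 2-D CNN treatment of 1-D sensor data plausible.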
## Keyword: RAW

### Multi-Object Tracking by Iteratively Associating Detections with Uniform Appearance for Trawl-Based Fishing Bycatch Monitoring
- **Authors:** Cheng-Yen Yang, Alan Yu Shyang Tan, Melanie J. Underwood, Charlotte Bodie, Zhongyu Jiang, Steve George, Karl Warr, Jenq-Neng Hwang, Emma Jones
- **Subjects:** Computer Vision and Pattern Recognition (cs.CV)
- **Arxiv link:** https://arxiv.org/abs/2304.04816
- **Pdf link:** https://arxiv.org/pdf/2304.04816
- **Abstract** The aim of in-trawl catch monitoring for use in fishing operations is to detect, track and classify fish targets in real-time from video footage. Information gathered could be used to release unwanted bycatch in real-time. However, traditional multi-object tracking (MOT) methods have limitations, as they are developed for tracking vehicles or pedestrians with linear motions and diverse appearances, which differ from scenarios such as livestock monitoring. Therefore, we propose a novel MOT method, built upon an existing observation-centric tracking algorithm, that adopts a new iterative association step to significantly boost the performance of tracking targets with a uniform appearance. The iterative association module is designed as an extendable component that can be merged into most existing tracking methods. Our method offers improved performance in tracking targets with uniform appearance and outperforms state-of-the-art techniques on our underwater fish datasets as well as the MOT17 dataset, without increasing latency or sacrificing accuracy as measured by the HOTA, MOTA, and IDF1 performance metrics.
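The iterative association step is described only at a high level above; one minimal interpretation — re-running assignment over the leftovers with progressively looser IoU gates, so uniform-appearance targets still get linked — might look like the following. The gate values and the plain-IoU cost are assumptions, not the paper's design.

```python
import numpy as np
from scipy.optimize import linear_sum_assignment

def iou(a, b):
    """Intersection-over-union of two [x1, y1, x2, y2] boxes."""
    x1, y1 = max(a[0], b[0]), max(a[1], b[1])
    x2, y2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0, x2 - x1) * max(0, y2 - y1)
    area = lambda r: (r[2] - r[0]) * (r[3] - r[1])
    return inter / (area(a) + area(b) - inter + 1e-9)

def iterative_associate(tracks, dets, gates=(0.7, 0.5, 0.3)):
    """Match tracks to detections in rounds with decreasing IoU gates."""
    matches = []
    t_left, d_left = list(range(len(tracks))), list(range(len(dets)))
    for gate in gates:
        if not t_left or not d_left:
            break
        cost = np.array([[1 - iou(tracks[t], dets[d]) for d in d_left]
                         for t in t_left])
        rows, cols = linear_sum_assignment(cost)   # Hungarian matching
        got_t, got_d = [], []
        for r, c in zip(rows, cols):
            if 1 - cost[r, c] >= gate:             # accept only above the gate
                matches.append((t_left[r], d_left[c]))
                got_t.append(t_left[r]); got_d.append(d_left[c])
        t_left = [t for t in t_left if t not in got_t]
        d_left = [d for d in d_left if d not in got_d]
    return matches, t_left, d_left

tracks = [[0, 0, 10, 10], [20, 20, 30, 30]]
dets = [[1, 1, 11, 11], [24, 24, 34, 34]]
m, ut, ud = iterative_associate(tracks, dets)
```

Here the first pair (IoU ≈ 0.68) is rejected at the 0.7 gate but recovered at 0.5, while the weak second pair stays unmatched — the later rounds rescue borderline matches without loosening the first, strict round.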
### Point-and-Shoot All-in-Focus Photo Synthesis from Smartphone Camera Pair
- **Authors:** Xianrui Luo, Juewen Peng, Weiyue Zhao, Ke Xian, Hao Lu, Zhiguo Cao
- **Subjects:** Computer Vision and Pattern Recognition (cs.CV)
- **Arxiv link:** https://arxiv.org/abs/2304.04917
- **Pdf link:** https://arxiv.org/pdf/2304.04917
- **Abstract** All-in-Focus (AIF) photography is expected to be a commercial selling point for modern smartphones. Standard AIF synthesis requires manual, time-consuming operations such as focal stack compositing, which is unfriendly to ordinary people. To achieve point-and-shoot AIF photography with a smartphone, we expect that an AIF photo can be generated from one shot of the scene, instead of from multiple photos captured by the same camera. Benefiting from the multi-camera module in modern smartphones, we introduce a new task of AIF synthesis from main (wide) and ultra-wide cameras. The goal is to recover sharp details from defocused regions in the main-camera photo with the help of the ultra-wide-camera one. The camera setting poses new challenges such as parallax-induced occlusions and inconsistent color between cameras. To overcome the challenges, we introduce a predict-and-refine network to mitigate occlusions and propose dynamic frequency-domain alignment for color correction. To enable effective training and evaluation, we also build an AIF dataset with 2686 unique scenes. Each scene includes two photos captured by the main camera, one photo captured by the ultra-wide camera, and a synthesized AIF photo. Results show that our solution, termed EasyAIF, can produce high-quality AIF photos and outperforms strong baselines quantitatively and qualitatively. For the first time, we demonstrate point-and-shoot AIF photo synthesis successfully from main and ultra-wide cameras.
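The paper's dynamic frequency-domain alignment is not specified in the abstract; as a generic stand-in, a static low-frequency swap (in the spirit of Fourier-based color transfer) shows why the frequency domain is a natural place for color correction: the lowest spatial frequencies carry the global color cast, so grafting them from a reference image shifts color statistics while leaving fine detail mostly intact. The `beta` band size and the per-channel swap below are assumptions, not EasyAIF's method.

```python
import numpy as np

def low_freq_transfer(src, ref, beta=0.1):
    """Replace the lowest spatial frequencies of `src` with those of `ref`.

    `src`, `ref` are (H, W, C) float images of the same size; `beta`
    controls the half-width of the swapped low-frequency block.
    """
    out = np.empty(src.shape, dtype=float)
    h, w = src.shape[:2]
    bh, bw = max(1, int(h * beta)), max(1, int(w * beta))
    cy, cx = h // 2, w // 2
    for c in range(src.shape[2]):
        Fs = np.fft.fftshift(np.fft.fft2(src[:, :, c]))
        Fr = np.fft.fftshift(np.fft.fft2(ref[:, :, c]))
        # Graft the centered low-frequency block from the reference.
        Fs[cy - bh:cy + bh, cx - bw:cx + bw] = \
            Fr[cy - bh:cy + bh, cx - bw:cx + bw]
        out[:, :, c] = np.real(np.fft.ifft2(np.fft.ifftshift(Fs)))
    return out

rng = np.random.default_rng(0)
src = rng.random((32, 32, 3))
ref = src + 0.5                    # same content, globally shifted color
aligned = low_freq_transfer(src, ref)
```

Because the DC term sits inside the swapped block, the aligned image inherits the reference's per-channel mean exactly, while high-frequency texture stays with the source.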
### Multi-Graph Convolution Network for Pose Forecasting
- **Authors:** Hongwei Ren, Yuhong Shi, Kewei Liang
- **Subjects:** Computer Vision and Pattern Recognition (cs.CV)
- **Arxiv link:** https://arxiv.org/abs/2304.04956
- **Pdf link:** https://arxiv.org/pdf/2304.04956
- **Abstract** Recently, there has been a growing interest in predicting human motion, which involves forecasting future body poses based on observed pose sequences. This task is complex due to modeling spatial and temporal relationships. The most commonly used models for this task are autoregressive models, such as recurrent neural networks (RNNs) or variants, and Transformer networks. However, RNNs have several drawbacks, such as vanishing or exploding gradients. Other researchers have attempted to solve the communication problem in the spatial dimension by integrating Graph Convolutional Networks (GCN) and Long Short-Term Memory (LSTM) models. These works deal with temporal and spatial information separately, which limits their effectiveness. To fix this problem, we propose a novel approach called the multi-graph convolution network (MGCN) for 3D human pose forecasting. This model simultaneously captures spatial and temporal information by introducing an augmented graph for pose sequences. Multiple frames give multiple parts, joined together in a single graph instance. Furthermore, we also explore the influence of natural structure and sequence-aware attention on our model. In our experimental evaluation on the large-scale benchmark datasets Human3.6M, AMSS and 3DPW, MGCN outperforms the state-of-the-art in pose prediction.
### One-Shot High-Fidelity Talking-Head Synthesis with Deformable Neural Radiance Field
- **Authors:** Weichuang Li, Longhao Zhang, Dong Wang, Bin Zhao, Zhigang Wang, Mulin Chen, Bang Zhang, Zhongjian Wang, Liefeng Bo, Xuelong Li
- **Subjects:** Computer Vision and Pattern Recognition (cs.CV)
- **Arxiv link:** https://arxiv.org/abs/2304.05097
- **Pdf link:** https://arxiv.org/pdf/2304.05097
- **Abstract** Talking head generation aims to generate faces that maintain the identity information of the source image and imitate the motion of the driving image. Most pioneering methods rely primarily on 2D representations and thus will inevitably suffer from face distortion when large head rotations are encountered. Recent works instead employ explicit 3D structural representations or implicit neural rendering to improve performance under large pose changes. Nevertheless, the fidelity of identity and expression is not so desirable, especially for novel-view synthesis. In this paper, we propose HiDe-NeRF, which achieves high-fidelity and free-view talking-head synthesis. Drawing on the recently proposed Deformable Neural Radiance Fields, HiDe-NeRF represents the 3D dynamic scene with a canonical appearance field and an implicit deformation field, where the former comprises the canonical source face and the latter models the driving pose and expression. In particular, we improve fidelity in two respects: (i) to enhance identity expressiveness, we design a generalized appearance module that leverages multi-scale volume features to preserve face shape and details; (ii) to improve expression preciseness, we propose a lightweight deformation module that explicitly decouples the pose and expression to enable precise expression modeling. Extensive experiments demonstrate that our proposed approach can generate better results than previous works.
Project page: https://www.waytron.net/hidenerf/

### Astroformer: More Data Might Not be All You Need for Classification
- **Authors:** Rishit Dagli
- **Subjects:** Computer Vision and Pattern Recognition (cs.CV)
- **Arxiv link:** https://arxiv.org/abs/2304.05350
- **Pdf link:** https://arxiv.org/pdf/2304.05350
- **Abstract** Recent advancements in areas such as natural language processing and computer vision rely on intricate and massive models that have been trained using vast amounts of unlabelled or partly labeled data, and training or deploying these state-of-the-art methods in resource-constrained environments has been a challenge. Galaxy morphologies are crucial to understanding the processes by which galaxies form and evolve. Efficient methods to classify galaxy morphologies are required to extract physical information from modern-day astronomy surveys. In this paper, we introduce methods to learn from smaller amounts of data. We propose using a hybrid transformer-convolutional architecture, drawing much inspiration from the success of CoAtNet and MaxViT. Concretely, we use the transformer-convolutional hybrid with a new stack design for the network and a different way of creating a relative self-attention layer, and pair it with a careful selection of data augmentation and regularization techniques. Our approach sets a new state-of-the-art on predicting galaxy morphologies from images on the Galaxy10 DECals dataset, a science objective, which consists of 17736 labeled images, achieving $94.86\%$ top-$1$ accuracy and beating the current state-of-the-art for this task by $4.62\%$. Furthermore, this approach also sets a new state-of-the-art on CIFAR-100 and Tiny ImageNet. We also find that models and training methods used for larger datasets often do not work well in the low-data regime. Our code and models will be released at a later date before the conference.

## Keyword: raw image

There is no result
2.0
New submissions for Wed, 12 Apr 23 - ## Keyword: events ### EvAC3D: From Event-based Apparent Contours to 3D Models via Continuous Visual Hulls - **Authors:** Ziyun Wang, Kenneth Chaney, Kostas Daniilidis - **Subjects:** Computer Vision and Pattern Recognition (cs.CV) - **Arxiv link:** https://arxiv.org/abs/2304.05296 - **Pdf link:** https://arxiv.org/pdf/2304.05296 - **Abstract** 3D reconstruction from multiple views is a successful computer vision field with multiple deployments in applications. State of the art is based on traditional RGB frames that enable optimization of photo-consistency cross views. In this paper, we study the problem of 3D reconstruction from event-cameras, motivated by the advantages of event-based cameras in terms of low power and latency as well as by the biological evidence that eyes in nature capture the same data and still perceive well 3D shape. The foundation of our hypothesis that 3D reconstruction is feasible using events lies in the information contained in the occluding contours and in the continuous scene acquisition with events. We propose Apparent Contour Events (ACE), a novel event-based representation that defines the geometry of the apparent contour of an object. We represent ACE by a spatially and temporally continuous implicit function defined in the event x-y-t space. Furthermore, we design a novel continuous Voxel Carving algorithm enabled by the high temporal resolution of the Apparent Contour Events. To evaluate the performance of the method, we collect MOEC-3D, a 3D event dataset of a set of common real-world objects. We demonstrate the ability of EvAC3D to reconstruct high-fidelity mesh surfaces from real event sequences while allowing the refinement of the 3D reconstruction for each individual event. 
### ELVIS: Empowering Locality of Vision Language Pre-training with Intra-modal Similarity - **Authors:** Sumin Seo, JaeWoong Shin, Jaewoo Kang, Tae Soo Kim, Thijs Kooi - **Subjects:** Computer Vision and Pattern Recognition (cs.CV); Computation and Language (cs.CL) - **Arxiv link:** https://arxiv.org/abs/2304.05303 - **Pdf link:** https://arxiv.org/pdf/2304.05303 - **Abstract** Deep learning has shown great potential in assisting radiologists in reading chest X-ray (CXR) images, but its need for expensive annotations for improving performance prevents widespread clinical application. Visual language pre-training (VLP) can alleviate the burden and cost of annotation by leveraging routinely generated reports for radiographs, which exist in large quantities as well as in paired form (imagetext pairs). Additionally, extensions to localization-aware VLPs are being proposed to address the needs of accurate localization of abnormalities for CAD in CXR. However, we find that the formulation proposed by locality-aware VLP literatures actually leads to loss in spatial relationships required for downstream localization tasks. Therefore, we propose Empowering Locality of VLP with Intra-modal Similarity, ELVIS, a VLP aware of intra-modal locality, to better preserve the locality within radiographs or reports, which enhances the ability to comprehend location references in text reports. Our locality-aware VLP method significantly outperforms state-of-the art baselines in multiple segmentation tasks and the MS-CXR phrase grounding task. Qualitatively, ELVIS is able to focus well on regions of interest described in the report text compared to prior approaches, allowing for enhanced interpretability. 
## Keyword: event camera There is no result ## Keyword: events camera There is no result ## Keyword: white balance There is no result ## Keyword: color contrast There is no result ## Keyword: AWB ### Multi-Graph Convolution Network for Pose Forecasting - **Authors:** Hongwei Ren, Yuhong Shi, Kewei Liang - **Subjects:** Computer Vision and Pattern Recognition (cs.CV) - **Arxiv link:** https://arxiv.org/abs/2304.04956 - **Pdf link:** https://arxiv.org/pdf/2304.04956 - **Abstract** Recently, there has been a growing interest in predicting human motion, which involves forecasting future body poses based on observed pose sequences. This task is complex due to modeling spatial and temporal relationships. The most commonly used models for this task are autoregressive models, such as recurrent neural networks (RNNs) or variants, and Transformer Networks. However, RNNs have several drawbacks, such as vanishing or exploding gradients. Other researchers have attempted to solve the communication problem in the spatial dimension by integrating Graph Convolutional Networks (GCN) and Long Short-Term Memory (LSTM) models. These works deal with temporal and spatial information separately, which limits the effectiveness. To fix this problem, we propose a novel approach called the multi-graph convolution network (MGCN) for 3D human pose forecasting. This model simultaneously captures spatial and temporal information by introducing an augmented graph for pose sequences. Multiple frames give multiple parts, joined together in a single graph instance. Furthermore, we also explore the influence of natural structure and sequence-aware attention to our model. In our experimental evaluation of the large-scale benchmark datasets, Human3.6M, AMSS and 3DPW, MGCN outperforms the state-of-the-art in pose prediction. 
## Keyword: ISP ### Estimation of Vehicular Velocity based on Non-Intrusive stereo camera - **Authors:** Bikram Adhikari, Prabin Bhandari - **Subjects:** Computer Vision and Pattern Recognition (cs.CV); Machine Learning (cs.LG) - **Arxiv link:** https://arxiv.org/abs/2304.05298 - **Pdf link:** https://arxiv.org/pdf/2304.05298 - **Abstract** The paper presents a modular approach for the estimation of a leading vehicle's velocity based on a non-intrusive stereo camera where SiamMask is used for leading vehicle tracking, Kernel Density estimate (KDE) is used to smooth the distance prediction from a disparity map, and LightGBM is used for leading vehicle velocity estimation. Our approach yields an RMSE of 0.416 which outperforms the baseline RMSE of 0.582 for the SUBARU Image Recognition Challenge ### Pinpointing Why Object Recognition Performance Degrades Across Income Levels and Geographies - **Authors:** Laura Gustafson, Megan Richards, Melissa Hall, Caner Hazirbas, Diane Bouchacourt, Mark Ibrahim - **Subjects:** Computer Vision and Pattern Recognition (cs.CV) - **Arxiv link:** https://arxiv.org/abs/2304.05391 - **Pdf link:** https://arxiv.org/pdf/2304.05391 - **Abstract** Despite impressive advances in object-recognition, deep learning systems' performance degrades significantly across geographies and lower income levels raising pressing concerns of inequity. Addressing such performance gaps remains a challenge, as little is understood about why performance degrades across incomes or geographies. We take a step in this direction by annotating images from Dollar Street, a popular benchmark of geographically and economically diverse images, labeling each image with factors such as color, shape, and background. These annotations unlock a new granular view into how objects differ across incomes and regions. We then use these object differences to pinpoint model vulnerabilities across incomes and regions. 
We study a range of modern vision models, finding that performance disparities are most associated with differences in texture, occlusion, and images with darker lighting. We illustrate how insights from our factor labels can surface mitigations to improve models' performance disparities. As an example, we show that mitigating a model's vulnerability to texture can improve performance on the lower income level. We release all the factor annotations along with an interactive dashboard to facilitate research into more equitable vision systems. ## Keyword: image signal processing There is no result ## Keyword: image signal process There is no result ## Keyword: compression ### Multi-scale Fusion Fault Diagnosis Method Based on Two-Dimensionaliztion Sequence in Complex Scenarios - **Authors:** Weiyang Jin - **Subjects:** Computer Vision and Pattern Recognition (cs.CV) - **Arxiv link:** https://arxiv.org/abs/2304.05198 - **Pdf link:** https://arxiv.org/pdf/2304.05198 - **Abstract** Rolling bearings are critical components in rotating machinery, and their faults can cause severe damage. Early detection of abnormalities is crucial to prevent catastrophic accidents. Traditional and intelligent methods have been used to analyze time series data, but in real-life scenarios, sensor data is often noisy and cannot be accurately characterized in the time domain, leading to mode collapse in trained models. Two-dimensionalization methods such as the Gram angle field method (GAF) or interval sampling have been proposed, but they lack mathematical derivation and interpretability. This paper proposes an improved GAF combined with grayscale images for convolution scenarios. The main contributions include illustrating the feasibility of the approach in complex scenarios, widening the data set, and introducing an improved convolutional neural network method with a multi-scale feature fusion diffusion model and deep learning compression techniques for deployment in industrial scenarios. 
## Keyword: RAW ### Multi-Object Tracking by Iteratively Associating Detections with Uniform Appearance for Trawl-Based Fishing Bycatch Monitoring - **Authors:** Cheng-Yen Yang, Alan Yu Shyang Tan, Melanie J. Underwood, Charlotte Bodie, Zhongyu Jiang, Steve George, Karl Warr, Jenq-Neng Hwang, Emma Jones - **Subjects:** Computer Vision and Pattern Recognition (cs.CV) - **Arxiv link:** https://arxiv.org/abs/2304.04816 - **Pdf link:** https://arxiv.org/pdf/2304.04816 - **Abstract** The aim of in-trawl catch monitoring for use in fishing operations is to detect, track and classify fish targets in real-time from video footage. Information gathered could be used to release unwanted bycatch in real-time. However, traditional multi-object tracking (MOT) methods have limitations, as they are developed for tracking vehicles or pedestrians with linear motions and diverse appearances, which are different from the scenarios such as livestock monitoring. Therefore, we propose a novel MOT method, built upon an existing observation-centric tracking algorithm, by adopting a new iterative association step to significantly boost the performance of tracking targets with a uniform appearance. The iterative association module is designed as an extendable component that can be merged into most existing tracking methods. Our method offers improved performance in tracking targets with uniform appearance and outperforms state-of-the-art techniques on our underwater fish datasets as well as the MOT17 dataset, without increasing latency nor sacrificing accuracy as measured by HOTA, MOTA, and IDF1 performance metrics. 
### Point-and-Shoot All-in-Focus Photo Synthesis from Smartphone Camera Pair - **Authors:** Xianrui Luo, Juewen Peng, Weiyue Zhao, Ke Xian, Hao Lu, Zhiguo Cao - **Subjects:** Computer Vision and Pattern Recognition (cs.CV) - **Arxiv link:** https://arxiv.org/abs/2304.04917 - **Pdf link:** https://arxiv.org/pdf/2304.04917 - **Abstract** All-in-Focus (AIF) photography is expected to be a commercial selling point for modern smartphones. Standard AIF synthesis requires manual, time-consuming operations such as focal stack compositing, which is unfriendly to ordinary people. To achieve point-and-shoot AIF photography with a smartphone, we expect that an AIF photo can be generated from one shot of the scene, instead of from multiple photos captured by the same camera. Benefiting from the multi-camera module in modern smartphones, we introduce a new task of AIF synthesis from main (wide) and ultra-wide cameras. The goal is to recover sharp details from defocused regions in the main-camera photo with the help of the ultra-wide-camera one. The camera setting poses new challenges such as parallax-induced occlusions and inconsistent color between cameras. To overcome the challenges, we introduce a predict-and-refine network to mitigate occlusions and propose dynamic frequency-domain alignment for color correction. To enable effective training and evaluation, we also build an AIF dataset with 2686 unique scenes. Each scene includes two photos captured by the main camera, one photo captured by the ultrawide camera, and a synthesized AIF photo. Results show that our solution, termed EasyAIF, can produce high-quality AIF photos and outperforms strong baselines quantitatively and qualitatively. For the first time, we demonstrate point-and-shoot AIF photo synthesis successfully from main and ultra-wide cameras. 
### Multi-Graph Convolution Network for Pose Forecasting

- **Authors:** Hongwei Ren, Yuhong Shi, Kewei Liang
- **Subjects:** Computer Vision and Pattern Recognition (cs.CV)
- **Arxiv link:** https://arxiv.org/abs/2304.04956
- **Pdf link:** https://arxiv.org/pdf/2304.04956
- **Abstract** Recently, there has been a growing interest in predicting human motion, which involves forecasting future body poses based on observed pose sequences. This task is complex due to modeling spatial and temporal relationships. The most commonly used models for this task are autoregressive models, such as recurrent neural networks (RNNs) or variants, and Transformer Networks. However, RNNs have several drawbacks, such as vanishing or exploding gradients. Other researchers have attempted to solve the communication problem in the spatial dimension by integrating Graph Convolutional Networks (GCN) and Long Short-Term Memory (LSTM) models. These works deal with temporal and spatial information separately, which limits the effectiveness. To fix this problem, we propose a novel approach called the multi-graph convolution network (MGCN) for 3D human pose forecasting. This model simultaneously captures spatial and temporal information by introducing an augmented graph for pose sequences. Multiple frames give multiple parts, joined together in a single graph instance. Furthermore, we also explore the influence of natural structure and sequence-aware attention to our model. In our experimental evaluation of the large-scale benchmark datasets, Human3.6M, AMSS and 3DPW, MGCN outperforms the state-of-the-art in pose prediction.

### One-Shot High-Fidelity Talking-Head Synthesis with Deformable Neural Radiance Field

- **Authors:** Weichuang Li, Longhao Zhang, Dong Wang, Bin Zhao, Zhigang Wang, Mulin Chen, Bang Zhang, Zhongjian Wang, Liefeng Bo, Xuelong Li
- **Subjects:** Computer Vision and Pattern Recognition (cs.CV)
- **Arxiv link:** https://arxiv.org/abs/2304.05097
- **Pdf link:** https://arxiv.org/pdf/2304.05097
- **Abstract** Talking head generation aims to generate faces that maintain the identity information of the source image and imitate the motion of the driving image. Most pioneering methods rely primarily on 2D representations and thus will inevitably suffer from face distortion when large head rotations are encountered. Recent works instead employ explicit 3D structural representations or implicit neural rendering to improve performance under large pose changes. Nevertheless, the fidelity of identity and expression is not so desirable, especially for novel-view synthesis. In this paper, we propose HiDe-NeRF, which achieves high-fidelity and free-view talking-head synthesis. Drawing on the recently proposed Deformable Neural Radiance Fields, HiDe-NeRF represents the 3D dynamic scene into a canonical appearance field and an implicit deformation field, where the former comprises the canonical source face and the latter models the driving pose and expression. In particular, we improve fidelity from two aspects: (i) to enhance identity expressiveness, we design a generalized appearance module that leverages multi-scale volume features to preserve face shape and details; (ii) to improve expression preciseness, we propose a lightweight deformation module that explicitly decouples the pose and expression to enable precise expression modeling. Extensive experiments demonstrate that our proposed approach can generate better results than previous works. Project page: https://www.waytron.net/hidenerf/

### Astroformer: More Data Might Not be All You Need for Classification

- **Authors:** Rishit Dagli
- **Subjects:** Computer Vision and Pattern Recognition (cs.CV)
- **Arxiv link:** https://arxiv.org/abs/2304.05350
- **Pdf link:** https://arxiv.org/pdf/2304.05350
- **Abstract** Recent advancements in areas such as natural language processing and computer vision rely on intricate and massive models that have been trained using vast amounts of unlabelled or partly labeled data, and training or deploying these state-of-the-art methods in resource-constrained environments has been a challenge. Galaxy morphologies are crucial to understanding the processes by which galaxies form and evolve. Efficient methods to classify galaxy morphologies are required to extract physical information from modern-day astronomy surveys. In this paper, we introduce methods to learn from less amounts of data. We propose using a hybrid transformer-convolutional architecture drawing much inspiration from the success of CoAtNet and MaxViT. Concretely, we use the transformer-convolutional hybrid with a new stack design for the network, a different way of creating a relative self-attention layer, and pair it with a careful selection of data augmentation and regularization techniques. Our approach sets a new state-of-the-art on predicting galaxy morphologies from images on the Galaxy10 DECals dataset, a science objective, which consists of 17736 labeled images achieving $94.86\%$ top-$1$ accuracy, beating the current state-of-the-art for this task by $4.62\%$. Furthermore, this approach also sets a new state-of-the-art on CIFAR-100 and Tiny ImageNet. We also find that models and training methods used for larger datasets would often not work very well in the low-data regime. Our code and models will be released at a later date before the conference.

## Keyword: raw image

There is no result
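The MGCN entry above hinges on one idea: stack T frames of J joints into a single augmented graph, so that one graph convolution mixes information across space and time at once. A minimal NumPy sketch of that construction (a toy 4-joint chain and random features; an illustration of the idea, not the authors' model):

```python
import numpy as np

# Hypothetical sizes; the paper uses full human skeletons.
J, T, F = 4, 3, 8          # joints per frame, frames, feature dim

# Spatial adjacency for one frame (toy 4-joint chain 0-1-2-3).
A_s = np.zeros((J, J))
for a, b in [(0, 1), (1, 2), (2, 3)]:
    A_s[a, b] = A_s[b, a] = 1

# Augmented graph over all T*J nodes: block-diagonal spatial edges
# plus temporal edges linking the same joint in consecutive frames.
N = T * J
A = np.zeros((N, N))
for t in range(T):
    A[t*J:(t+1)*J, t*J:(t+1)*J] = A_s
for t in range(T - 1):
    for j in range(J):
        u, v = t*J + j, (t+1)*J + j
        A[u, v] = A[v, u] = 1

# One graph-convolution step: X' = D^-1/2 (A + I) D^-1/2 X W
A_hat = A + np.eye(N)
d = A_hat.sum(axis=1)
A_norm = A_hat / np.sqrt(np.outer(d, d))
X = np.random.randn(N, F)            # per-node pose features
W = np.random.randn(F, F)
X_next = A_norm @ X @ W              # mixes joints and frames at once
print(X_next.shape)                  # (12, 8)
```

Because temporal edges sit in the same adjacency as skeletal edges, a single `A_norm @ X @ W` step already propagates a joint's features both to its neighbours within the frame and to itself in adjacent frames, which is the "simultaneously captures spatial and temporal information" claim in the abstract.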
process
new submissions for wed apr keyword events from event based apparent contours to models via continuous visual hulls authors ziyun wang kenneth chaney kostas daniilidis subjects computer vision and pattern recognition cs cv arxiv link pdf link abstract reconstruction from multiple views is a successful computer vision field with multiple deployments in applications state of the art is based on traditional rgb frames that enable optimization of photo consistency cross views in this paper we study the problem of reconstruction from event cameras motivated by the advantages of event based cameras in terms of low power and latency as well as by the biological evidence that eyes in nature capture the same data and still perceive well shape the foundation of our hypothesis that reconstruction is feasible using events lies in the information contained in the occluding contours and in the continuous scene acquisition with events we propose apparent contour events ace a novel event based representation that defines the geometry of the apparent contour of an object we represent ace by a spatially and temporally continuous implicit function defined in the event x y t space furthermore we design a novel continuous voxel carving algorithm enabled by the high temporal resolution of the apparent contour events to evaluate the performance of the method we collect moec a event dataset of a set of common real world objects we demonstrate the ability of to reconstruct high fidelity mesh surfaces from real event sequences while allowing the refinement of the reconstruction for each individual event elvis empowering locality of vision language pre training with intra modal similarity authors sumin seo jaewoong shin jaewoo kang tae soo kim thijs kooi subjects computer vision and pattern recognition cs cv computation and language cs cl arxiv link pdf link abstract deep learning has shown great potential in assisting radiologists in reading chest x ray cxr images but its need for expensive 
annotations for improving performance prevents widespread clinical application visual language pre training vlp can alleviate the burden and cost of annotation by leveraging routinely generated reports for radiographs which exist in large quantities as well as in paired form imagetext pairs additionally extensions to localization aware vlps are being proposed to address the needs of accurate localization of abnormalities for cad in cxr however we find that the formulation proposed by locality aware vlp literatures actually leads to loss in spatial relationships required for downstream localization tasks therefore we propose empowering locality of vlp with intra modal similarity elvis a vlp aware of intra modal locality to better preserve the locality within radiographs or reports which enhances the ability to comprehend location references in text reports our locality aware vlp method significantly outperforms state of the art baselines in multiple segmentation tasks and the ms cxr phrase grounding task qualitatively elvis is able to focus well on regions of interest described in the report text compared to prior approaches allowing for enhanced interpretability keyword event camera there is no result keyword events camera there is no result keyword white balance there is no result keyword color contrast there is no result keyword awb multi graph convolution network for pose forecasting authors hongwei ren yuhong shi kewei liang subjects computer vision and pattern recognition cs cv arxiv link pdf link abstract recently there has been a growing interest in predicting human motion which involves forecasting future body poses based on observed pose sequences this task is complex due to modeling spatial and temporal relationships the most commonly used models for this task are autoregressive models such as recurrent neural networks rnns or variants and transformer networks however rnns have several drawbacks such as vanishing or exploding gradients other researchers 
have attempted to solve the communication problem in the spatial dimension by integrating graph convolutional networks gcn and long short term memory lstm models these works deal with temporal and spatial information separately which limits the effectiveness to fix this problem we propose a novel approach called the multi graph convolution network mgcn for human pose forecasting this model simultaneously captures spatial and temporal information by introducing an augmented graph for pose sequences multiple frames give multiple parts joined together in a single graph instance furthermore we also explore the influence of natural structure and sequence aware attention to our model in our experimental evaluation of the large scale benchmark datasets amss and mgcn outperforms the state of the art in pose prediction keyword isp estimation of vehicular velocity based on non intrusive stereo camera authors bikram adhikari prabin bhandari subjects computer vision and pattern recognition cs cv machine learning cs lg arxiv link pdf link abstract the paper presents a modular approach for the estimation of a leading vehicle s velocity based on a non intrusive stereo camera where siammask is used for leading vehicle tracking kernel density estimate kde is used to smooth the distance prediction from a disparity map and lightgbm is used for leading vehicle velocity estimation our approach yields an rmse of which outperforms the baseline rmse of for the subaru image recognition challenge pinpointing why object recognition performance degrades across income levels and geographies authors laura gustafson megan richards melissa hall caner hazirbas diane bouchacourt mark ibrahim subjects computer vision and pattern recognition cs cv arxiv link pdf link abstract despite impressive advances in object recognition deep learning systems performance degrades significantly across geographies and lower income levels raising pressing concerns of inequity addressing such performance gaps remains 
a challenge as little is understood about why performance degrades across incomes or geographies we take a step in this direction by annotating images from dollar street a popular benchmark of geographically and economically diverse images labeling each image with factors such as color shape and background these annotations unlock a new granular view into how objects differ across incomes and regions we then use these object differences to pinpoint model vulnerabilities across incomes and regions we study a range of modern vision models finding that performance disparities are most associated with differences in texture occlusion and images with darker lighting we illustrate how insights from our factor labels can surface mitigations to improve models performance disparities as an example we show that mitigating a model s vulnerability to texture can improve performance on the lower income level we release all the factor annotations along with an interactive dashboard to facilitate research into more equitable vision systems keyword image signal processing there is no result keyword image signal process there is no result keyword compression multi scale fusion fault diagnosis method based on two dimensionaliztion sequence in complex scenarios authors weiyang jin subjects computer vision and pattern recognition cs cv arxiv link pdf link abstract rolling bearings are critical components in rotating machinery and their faults can cause severe damage early detection of abnormalities is crucial to prevent catastrophic accidents traditional and intelligent methods have been used to analyze time series data but in real life scenarios sensor data is often noisy and cannot be accurately characterized in the time domain leading to mode collapse in trained models two dimensionalization methods such as the gram angle field method gaf or interval sampling have been proposed but they lack mathematical derivation and interpretability this paper proposes an improved gaf combined 
with grayscale images for convolution scenarios the main contributions include illustrating the feasibility of the approach in complex scenarios widening the data set and introducing an improved convolutional neural network method with a multi scale feature fusion diffusion model and deep learning compression techniques for deployment in industrial scenarios keyword raw multi object tracking by iteratively associating detections with uniform appearance for trawl based fishing bycatch monitoring authors cheng yen yang alan yu shyang tan melanie j underwood charlotte bodie zhongyu jiang steve george karl warr jenq neng hwang emma jones subjects computer vision and pattern recognition cs cv arxiv link pdf link abstract the aim of in trawl catch monitoring for use in fishing operations is to detect track and classify fish targets in real time from video footage information gathered could be used to release unwanted bycatch in real time however traditional multi object tracking mot methods have limitations as they are developed for tracking vehicles or pedestrians with linear motions and diverse appearances which are different from the scenarios such as livestock monitoring therefore we propose a novel mot method built upon an existing observation centric tracking algorithm by adopting a new iterative association step to significantly boost the performance of tracking targets with a uniform appearance the iterative association module is designed as an extendable component that can be merged into most existing tracking methods our method offers improved performance in tracking targets with uniform appearance and outperforms state of the art techniques on our underwater fish datasets as well as the dataset without increasing latency nor sacrificing accuracy as measured by hota mota and performance metrics point and shoot all in focus photo synthesis from smartphone camera pair authors xianrui luo juewen peng weiyue zhao ke xian hao lu zhiguo cao subjects computer vision 
and pattern recognition cs cv arxiv link pdf link abstract all in focus aif photography is expected to be a commercial selling point for modern smartphones standard aif synthesis requires manual time consuming operations such as focal stack compositing which is unfriendly to ordinary people to achieve point and shoot aif photography with a smartphone we expect that an aif photo can be generated from one shot of the scene instead of from multiple photos captured by the same camera benefiting from the multi camera module in modern smartphones we introduce a new task of aif synthesis from main wide and ultra wide cameras the goal is to recover sharp details from defocused regions in the main camera photo with the help of the ultra wide camera one the camera setting poses new challenges such as parallax induced occlusions and inconsistent color between cameras to overcome the challenges we introduce a predict and refine network to mitigate occlusions and propose dynamic frequency domain alignment for color correction to enable effective training and evaluation we also build an aif dataset with unique scenes each scene includes two photos captured by the main camera one photo captured by the ultrawide camera and a synthesized aif photo results show that our solution termed easyaif can produce high quality aif photos and outperforms strong baselines quantitatively and qualitatively for the first time we demonstrate point and shoot aif photo synthesis successfully from main and ultra wide cameras multi graph convolution network for pose forecasting authors hongwei ren yuhong shi kewei liang subjects computer vision and pattern recognition cs cv arxiv link pdf link abstract recently there has been a growing interest in predicting human motion which involves forecasting future body poses based on observed pose sequences this task is complex due to modeling spatial and temporal relationships the most commonly used models for this task are autoregressive models such as 
recurrent neural networks rnns or variants and transformer networks however rnns have several drawbacks such as vanishing or exploding gradients other researchers have attempted to solve the communication problem in the spatial dimension by integrating graph convolutional networks gcn and long short term memory lstm models these works deal with temporal and spatial information separately which limits the effectiveness to fix this problem we propose a novel approach called the multi graph convolution network mgcn for human pose forecasting this model simultaneously captures spatial and temporal information by introducing an augmented graph for pose sequences multiple frames give multiple parts joined together in a single graph instance furthermore we also explore the influence of natural structure and sequence aware attention to our model in our experimental evaluation of the large scale benchmark datasets amss and mgcn outperforms the state of the art in pose prediction one shot high fidelity talking head synthesis with deformable neural radiance field authors weichuang li longhao zhang dong wang bin zhao zhigang wang mulin chen bang zhang zhongjian wang liefeng bo xuelong li subjects computer vision and pattern recognition cs cv arxiv link pdf link abstract talking head generation aims to generate faces that maintain the identity information of the source image and imitate the motion of the driving image most pioneering methods rely primarily on representations and thus will inevitably suffer from face distortion when large head rotations are encountered recent works instead employ explicit structural representations or implicit neural rendering to improve performance under large pose changes nevertheless the fidelity of identity and expression is not so desirable especially for novel view synthesis in this paper we propose hide nerf which achieves high fidelity and free view talking head synthesis drawing on the recently proposed deformable neural radiance fields 
hide nerf represents the dynamic scene into a canonical appearance field and an implicit deformation field where the former comprises the canonical source face and the latter models the driving pose and expression in particular we improve fidelity from two aspects i to enhance identity expressiveness we design a generalized appearance module that leverages multi scale volume features to preserve face shape and details ii to improve expression preciseness we propose a lightweight deformation module that explicitly decouples the pose and expression to enable precise expression modeling extensive experiments demonstrate that our proposed approach can generate better results than previous works project page astroformer more data might not be all you need for classification authors rishit dagli subjects computer vision and pattern recognition cs cv arxiv link pdf link abstract recent advancements in areas such as natural language processing and computer vision rely on intricate and massive models that have been trained using vast amounts of unlabelled or partly labeled data and training or deploying these state of the art methods to resource constraint environments has been a challenge galaxy morphologies are crucial to understanding the processes by which galaxies form and evolve efficient methods to classify galaxy morphologies are required to extract physical information from modern day astronomy surveys in this paper we introduce methods to learn from less amounts of data we propose using a hybrid transformer convolutional architecture drawing much inspiration from the success of coatnet and maxvit concretely we use the transformer convolutional hybrid with a new stack design for the network a different way of creating a relative self attention layer and pair it with a careful selection of data augmentation and regularization techniques our approach sets a new state of the art on predicting galaxy morphologies from images on the decals dataset a science objective 
which consists of labeled images achieving top accuracy beating the current state of the art for this task by furthermore this approach also sets a new state of the art on cifar and tiny imagenet we also find that models and training methods used for larger datasets would often not work very well in the low data regime our code and models will be released at a later date before the conference keyword raw image there is no result
1
13,573
16,109,432,362
IssuesEvent
2021-04-27 19:01:43
qgis/QGIS-Documentation
https://api.github.com/repos/qgis/QGIS-Documentation
closed
[feature][processing] New algorithm "Align points to features"
3.16 Automatic new feature Processing Alg
Original commit: https://github.com/qgis/QGIS/commit/95bd7b296bd6516fb09182da5ec4c74f5b962d93 by nyalldawson

This algorithm calculates the rotation required to align point features with their nearest feature from another reference layer. A new field is added to the output layer which is filled with the angle (in degrees, clockwise) to the nearest reference feature. Optionally, the output layer's symbology can be set to automatically use the calculated rotation field to rotate marker symbols. If desired, a maximum distance to use when aligning points can be set, to avoid aligning isolated points to distant features. Designed for use cases like aligning building point symbols to follow the nearest road direction!
1.0
[feature][processing] New algorithm "Align points to features" - Original commit: https://github.com/qgis/QGIS/commit/95bd7b296bd6516fb09182da5ec4c74f5b962d93 by nyalldawson This algorithm calculates the rotation required to align point features with their nearest feature from another reference layer. A new field is added to the output layer which is filled with the angle (in degrees, clockwise) to the nearest reference feature. Optionally, the output layer's symbology can be set to automatically use the calculated rotation field to rotate marker symbols. If desired, a maximum distance to use when aligning points can be set, to avoid aligning isolated points to distant features. Designed for use cases like aligning building point symbols to follow the nearest road direction!
process
new algorithm align points to features original commit by nyalldawson this algorithm calculates the rotation required to align point features with their nearest feature from another reference layer a new field is added to the output layer which is filled with the angle in degrees clockwise to the nearest reference feature optionally the output layer s symbology can be set to automatically use the calculated rotation field to rotate marker symbols if desired a maximum distance to use when aligning points can be set to avoid aligning isolated points to distant features designed for use cases like aligning building point symbols to follow the nearest road direction
1
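The algorithm record above boils down to a per-point computation: find the nearest reference feature, optionally reject it beyond a maximum distance, and store the bearing in degrees clockwise from north (the convention marker rotation uses). A small standalone sketch of that logic, using made-up coordinates and nearest-point references instead of QGIS geometries, so it illustrates the idea rather than the actual QGIS implementation:

```python
import math

def clockwise_bearing(px, py, qx, qy):
    """Angle from point P to Q, in degrees clockwise from north."""
    # atan2(dx, dy) (note the swapped arguments) yields
    # clockwise-from-north instead of counterclockwise-from-east.
    return math.degrees(math.atan2(qx - px, qy - py)) % 360.0

def align_point(point, references, max_distance=None):
    """Rotation for `point` toward its nearest reference point,
    or None if every reference lies farther than `max_distance`."""
    px, py = point
    nearest = min(references, key=lambda q: math.hypot(q[0] - px, q[1] - py))
    dist = math.hypot(nearest[0] - px, nearest[1] - py)
    if max_distance is not None and dist > max_distance:
        return None                  # isolated point: leave unrotated
    return clockwise_bearing(px, py, *nearest)

print(align_point((0, 0), [(0, 10), (5, 0)]))   # nearest is (5, 0) -> 90.0
```

Storing this value in a field and wiring the marker symbol's rotation to that field gives the "building symbols follow the nearest road" effect the issue describes.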
2,580
5,343,268,691
IssuesEvent
2017-02-17 10:48:29
jlm2017/jlm-video-subtitles
https://api.github.com/repos/jlm2017/jlm-video-subtitles
opened
[subtitles] [eng] NOUVEAU TRAITÉ EUROPÉEN : LA FINANCE AVANT LE PEUPLE !
Language: English Process: [1] Writing in progress
# Video title NOUVEAU TRAITÉ EUROPÉEN : LA FINANCE AVANT LE PEUPLE ! # URL https://www.youtube.com/watch?v=YlBapm_yXRI # Youtube subtitles language Anglais # Duration 01:04 # Subtitles URL https://www.youtube.com/timedtext_editor?action_mde_edit_form=1&tab=captions&lang=en&v=YlBapm_yXRI&bl=vmp&ui=hd&ref=player
1.0
[subtitles] [eng] NOUVEAU TRAITÉ EUROPÉEN : LA FINANCE AVANT LE PEUPLE ! - # Video title NOUVEAU TRAITÉ EUROPÉEN : LA FINANCE AVANT LE PEUPLE ! # URL https://www.youtube.com/watch?v=YlBapm_yXRI # Youtube subtitles language Anglais # Duration 01:04 # Subtitles URL https://www.youtube.com/timedtext_editor?action_mde_edit_form=1&tab=captions&lang=en&v=YlBapm_yXRI&bl=vmp&ui=hd&ref=player
process
nouveau traité européen la finance avant le peuple video title nouveau traité européen la finance avant le peuple url youtube subtitles language anglais duration subtitles url
1
626,152
19,786,473,721
IssuesEvent
2022-01-18 07:30:54
milvus-io/milvus
https://api.github.com/repos/milvus-io/milvus
closed
[Bug]: [CI] Connect get TypeError: __init__() missing 1 required positional argument: 'message'
kind/bug priority/urgent
### Is there an existing issue for this? - [X] I have searched the existing issues ### Environment ```markdown - Milvus version: - Deployment mode(standalone or cluster): cluster - SDK version(e.g. pymilvus v2.0.0rc2): pymilvus==2.0.0rc10.dev6 - OS(Ubuntu or CentOS): - CPU/Memory: - GPU: - Others: ``` ### Current Behavior https://ci.milvus.io:18080/jenkins/blue/organizations/jenkins/milvus-ha-ci/detail/PR-15226/2/pipeline ``` =================================== FAILURES =================================== [2022-01-15T01:58:56.212Z] ___ TestCollectionSearch.test_search_with_dup_primary_key[128-True-False-3] ____ [2022-01-15T01:58:56.212Z] [gw0] linux -- Python 3.6.8 /usr/local/bin/python3 [2022-01-15T01:58:56.212Z] [gw0] linux -- Python 3.6.8 /usr/local/bin/python3[gw0] linux -- Python 3.6.8 /usr/local/bin/python3 [2022-01-15T01:58:56.212Z] [2022-01-15T01:58:56.212Z] self = <test_search_20.TestCollectionSearch object at 0x7fe4c93ee208>, dim = 128 [2022-01-15T01:58:56.212Z] auto_id = True, _async = False, dup_times = 3 [2022-01-15T01:58:56.212Z] [2022-01-15T01:58:56.212Z] @pytest.mark.tags(CaseLabel.L1) [2022-01-15T01:58:56.212Z] @pytest.mark.parametrize("dup_times", [1, 2, 3]) [2022-01-15T01:58:56.212Z] def test_search_with_dup_primary_key(self, dim, auto_id, _async, dup_times): [2022-01-15T01:58:56.212Z] """ [2022-01-15T01:58:56.212Z] target: test search with duplicate primary key [2022-01-15T01:58:56.212Z] method: 1.insert same data twice [2022-01-15T01:58:56.212Z] 2.search [2022-01-15T01:58:56.212Z] expected: search results are de-duplicated [2022-01-15T01:58:56.212Z] """ [2022-01-15T01:58:56.212Z] # initialize with data [2022-01-15T01:58:56.212Z] nb = ct.default_nb [2022-01-15T01:58:56.212Z] nq = ct.default_nq [2022-01-15T01:58:56.212Z] collection_w, insert_data, _, insert_ids = self.init_collection_general(prefix, True, nb, [2022-01-15T01:58:56.212Z] auto_id=auto_id, [2022-01-15T01:58:56.212Z] > dim=dim)[0:4] [2022-01-15T01:58:56.212Z] 
[2022-01-15T01:58:56.212Z] testcases/test_search_20.py:889: [2022-01-15T01:58:56.212Z] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ [2022-01-15T01:58:56.212Z] base/client_base.py:150: in init_collection_general [2022-01-15T01:58:56.212Z] self._connect() [2022-01-15T01:58:56.212Z] base/client_base.py:104: in _connect [2022-01-15T01:58:56.212Z] port=param_info.param_port) [2022-01-15T01:58:56.212Z] base/connections_wrapper.py:35: in connect [2022-01-15T01:58:56.212Z] check_result = ResponseChecker(response, func_name, check_task, check_items, succ, alias=alias, **kwargs).run() [2022-01-15T01:58:56.212Z] check/func_check.py:31: in run [2022-01-15T01:58:56.212Z] result = self.assert_succ(self.succ, True) [2022-01-15T01:58:56.212Z] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ [2022-01-15T01:58:56.212Z] [2022-01-15T01:58:56.212Z] actual = False, expect = True [2022-01-15T01:58:56.212Z] [2022-01-15T01:58:56.212Z] @staticmethod [2022-01-15T01:58:56.212Z] def assert_succ(actual, expect): [2022-01-15T01:58:56.213Z] > assert actual is expect [2022-01-15T01:58:56.213Z] E AssertionError [2022-01-15T01:58:56.213Z] [2022-01-15T01:58:56.213Z] check/func_check.py:73: AssertionError [2022-01-15T01:58:56.213Z] ------------------------------ Captured log setup ------------------------------ [2022-01-15T01:58:56.213Z] [2022-01-15 01:58:42 - INFO - ci_test]: *********************************** setup *********************************** (client_base.py:51) [2022-01-15T01:58:56.213Z] [2022-01-15 01:58:42 - INFO - ci_test]: [setup_method] Start setup test case test_search_with_dup_primary_key. 
(client_base.py:52) [2022-01-15T01:58:56.213Z] ------------------------------ Captured log call ------------------------------- [2022-01-15T01:58:56.213Z] [2022-01-15 01:58:42 - INFO - ci_test]: Test case of search interface: initialize before test case (client_base.py:149) [2022-01-15T01:58:56.213Z] [2022-01-15 01:58:42 - DEBUG - ci_test]: (api_request) : [Connections.connect] args: ['default'], kwargs: {'host': 'ms-15226-2-pr-milvus.milvus-ci', 'port': '19530'} (api_request.py:55) [2022-01-15T01:58:56.213Z] [2022-01-15 01:58:45 - ERROR - ci_test]: Traceback (most recent call last): [2022-01-15T01:58:56.213Z] File "/usr/local/lib/python3.6/site-packages/pymilvus/client/grpc_handler.py", line 90, in _wait_for_channel_ready [2022-01-15T01:58:56.213Z] grpc.channel_ready_future(self._channel).result(timeout=3) [2022-01-15T01:58:56.213Z] File "/usr/local/lib/python3.6/site-packages/grpc/_utilities.py", line 140, in result [2022-01-15T01:58:56.213Z] self._block(timeout) [2022-01-15T01:58:56.213Z] File "/usr/local/lib/python3.6/site-packages/grpc/_utilities.py", line 86, in _block [2022-01-15T01:58:56.213Z] raise grpc.FutureTimeoutError() [2022-01-15T01:58:56.213Z] grpc.FutureTimeoutError [2022-01-15T01:58:56.213Z] [2022-01-15T01:58:56.213Z] During handling of the above exception, another exception occurred: [2022-01-15T01:58:56.213Z] [2022-01-15T01:58:56.213Z] Traceback (most recent call last): [2022-01-15T01:58:56.213Z] File "/home/jenkins/agent/workspace/tests/python_client/utils/api_request.py", line 22, in inner_wrapper [2022-01-15T01:58:56.213Z] res = func(*args, **kwargs) [2022-01-15T01:58:56.213Z] File "/home/jenkins/agent/workspace/tests/python_client/utils/api_request.py", line 56, in api_request [2022-01-15T01:58:56.213Z] return func(*arg, **kwargs) [2022-01-15T01:58:56.213Z] File "/usr/local/lib/python3.6/site-packages/pymilvus/orm/connections.py", line 156, in connect [2022-01-15T01:58:56.213Z] conn = connect_milvus(**kwargs) [2022-01-15T01:58:56.213Z] File 
"/usr/local/lib/python3.6/site-packages/pymilvus/orm/connections.py", line 144, in connect_milvus [2022-01-15T01:58:56.213Z] gh._wait_for_channel_ready() [2022-01-15T01:58:56.213Z] File "/usr/local/lib/python3.6/site-packages/pymilvus/client/grpc_handler.py", line 93, in _wait_for_channel_ready [2022-01-15T01:58:56.213Z] raise BaseException(f'Fail connecting to server on {self._uri}. Timeout') [2022-01-15T01:58:56.213Z] TypeError: __init__() missing 1 required positional argument: 'message' [2022-01-15T01:58:56.213Z] (api_request.py:35) [2022-01-15T01:58:56.213Z] [2022-01-15 01:58:45 - ERROR - ci_test]: (api_response) : __init__() missing 1 required positional argument: 'message' (api_request.py:36) ``` ### Expected Behavior _No response_ ### Steps To Reproduce _No response_ ### Anything else? _No response_
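The `TypeError: __init__() missing 1 required positional argument: 'message'` at the bottom of this log is the usual symptom of raising an exception class whose `__init__` requires more arguments than the raise site passes: constructing the exception itself fails, so the TypeError surfaces instead of the intended error. A minimal, hypothetical reproduction (class name and signature invented for illustration; this is not pymilvus code):

```python
class ConnectError(Exception):
    """Hypothetical stand-in for the custom exception in the traceback;
    its __init__ requires two positional arguments."""
    def __init__(self, code, message):
        super().__init__(message)
        self.code = code

def connect(uri):
    # The raise site passes only one argument, as in the log above,
    # so building the exception raises TypeError before the raise happens.
    raise ConnectError(f"Fail connecting to server on {uri}. Timeout")

try:
    connect("localhost:19530")
except TypeError as err:
    caught = str(err)

print(caught)  # e.g. "... missing 1 required positional argument: 'message'"
```

Giving `message` a default value, or passing both arguments at the raise site, lets the intended connection-timeout error surface instead of the TypeError.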
1.0
[Bug]: [CI] Connect get TypeError: __init__() missing 1 required positional argument: 'message' - ### Is there an existing issue for this? - [X] I have searched the existing issues ### Environment ```markdown - Milvus version: - Deployment mode(standalone or cluster): cluster - SDK version(e.g. pymilvus v2.0.0rc2): pymilvus==2.0.0rc10.dev6 - OS(Ubuntu or CentOS): - CPU/Memory: - GPU: - Others: ``` ### Current Behavior https://ci.milvus.io:18080/jenkins/blue/organizations/jenkins/milvus-ha-ci/detail/PR-15226/2/pipeline ``` =================================== FAILURES =================================== [2022-01-15T01:58:56.212Z] ___ TestCollectionSearch.test_search_with_dup_primary_key[128-True-False-3] ____ [2022-01-15T01:58:56.212Z] [gw0] linux -- Python 3.6.8 /usr/local/bin/python3 [2022-01-15T01:58:56.212Z] [gw0] linux -- Python 3.6.8 /usr/local/bin/python3[gw0] linux -- Python 3.6.8 /usr/local/bin/python3 [2022-01-15T01:58:56.212Z] [2022-01-15T01:58:56.212Z] self = <test_search_20.TestCollectionSearch object at 0x7fe4c93ee208>, dim = 128 [2022-01-15T01:58:56.212Z] auto_id = True, _async = False, dup_times = 3 [2022-01-15T01:58:56.212Z] [2022-01-15T01:58:56.212Z] @pytest.mark.tags(CaseLabel.L1) [2022-01-15T01:58:56.212Z] @pytest.mark.parametrize("dup_times", [1, 2, 3]) [2022-01-15T01:58:56.212Z] def test_search_with_dup_primary_key(self, dim, auto_id, _async, dup_times): [2022-01-15T01:58:56.212Z] """ [2022-01-15T01:58:56.212Z] target: test search with duplicate primary key [2022-01-15T01:58:56.212Z] method: 1.insert same data twice [2022-01-15T01:58:56.212Z] 2.search [2022-01-15T01:58:56.212Z] expected: search results are de-duplicated [2022-01-15T01:58:56.212Z] """ [2022-01-15T01:58:56.212Z] # initialize with data [2022-01-15T01:58:56.212Z] nb = ct.default_nb [2022-01-15T01:58:56.212Z] nq = ct.default_nq [2022-01-15T01:58:56.212Z] collection_w, insert_data, _, insert_ids = self.init_collection_general(prefix, True, nb, [2022-01-15T01:58:56.212Z] 
auto_id=auto_id, [2022-01-15T01:58:56.212Z] > dim=dim)[0:4] [2022-01-15T01:58:56.212Z] [2022-01-15T01:58:56.212Z] testcases/test_search_20.py:889: [2022-01-15T01:58:56.212Z] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ [2022-01-15T01:58:56.212Z] base/client_base.py:150: in init_collection_general [2022-01-15T01:58:56.212Z] self._connect() [2022-01-15T01:58:56.212Z] base/client_base.py:104: in _connect [2022-01-15T01:58:56.212Z] port=param_info.param_port) [2022-01-15T01:58:56.212Z] base/connections_wrapper.py:35: in connect [2022-01-15T01:58:56.212Z] check_result = ResponseChecker(response, func_name, check_task, check_items, succ, alias=alias, **kwargs).run() [2022-01-15T01:58:56.212Z] check/func_check.py:31: in run [2022-01-15T01:58:56.212Z] result = self.assert_succ(self.succ, True) [2022-01-15T01:58:56.212Z] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ [2022-01-15T01:58:56.212Z] [2022-01-15T01:58:56.212Z] actual = False, expect = True [2022-01-15T01:58:56.212Z] [2022-01-15T01:58:56.212Z] @staticmethod [2022-01-15T01:58:56.212Z] def assert_succ(actual, expect): [2022-01-15T01:58:56.213Z] > assert actual is expect [2022-01-15T01:58:56.213Z] E AssertionError [2022-01-15T01:58:56.213Z] [2022-01-15T01:58:56.213Z] check/func_check.py:73: AssertionError [2022-01-15T01:58:56.213Z] ------------------------------ Captured log setup ------------------------------ [2022-01-15T01:58:56.213Z] [2022-01-15 01:58:42 - INFO - ci_test]: *********************************** setup *********************************** (client_base.py:51) [2022-01-15T01:58:56.213Z] [2022-01-15 01:58:42 - INFO - ci_test]: [setup_method] Start setup test case test_search_with_dup_primary_key. 
(client_base.py:52) [2022-01-15T01:58:56.213Z] ------------------------------ Captured log call ------------------------------- [2022-01-15T01:58:56.213Z] [2022-01-15 01:58:42 - INFO - ci_test]: Test case of search interface: initialize before test case (client_base.py:149) [2022-01-15T01:58:56.213Z] [2022-01-15 01:58:42 - DEBUG - ci_test]: (api_request) : [Connections.connect] args: ['default'], kwargs: {'host': 'ms-15226-2-pr-milvus.milvus-ci', 'port': '19530'} (api_request.py:55) [2022-01-15T01:58:56.213Z] [2022-01-15 01:58:45 - ERROR - ci_test]: Traceback (most recent call last): [2022-01-15T01:58:56.213Z] File "/usr/local/lib/python3.6/site-packages/pymilvus/client/grpc_handler.py", line 90, in _wait_for_channel_ready [2022-01-15T01:58:56.213Z] grpc.channel_ready_future(self._channel).result(timeout=3) [2022-01-15T01:58:56.213Z] File "/usr/local/lib/python3.6/site-packages/grpc/_utilities.py", line 140, in result [2022-01-15T01:58:56.213Z] self._block(timeout) [2022-01-15T01:58:56.213Z] File "/usr/local/lib/python3.6/site-packages/grpc/_utilities.py", line 86, in _block [2022-01-15T01:58:56.213Z] raise grpc.FutureTimeoutError() [2022-01-15T01:58:56.213Z] grpc.FutureTimeoutError [2022-01-15T01:58:56.213Z] [2022-01-15T01:58:56.213Z] During handling of the above exception, another exception occurred: [2022-01-15T01:58:56.213Z] [2022-01-15T01:58:56.213Z] Traceback (most recent call last): [2022-01-15T01:58:56.213Z] File "/home/jenkins/agent/workspace/tests/python_client/utils/api_request.py", line 22, in inner_wrapper [2022-01-15T01:58:56.213Z] res = func(*args, **kwargs) [2022-01-15T01:58:56.213Z] File "/home/jenkins/agent/workspace/tests/python_client/utils/api_request.py", line 56, in api_request [2022-01-15T01:58:56.213Z] return func(*arg, **kwargs) [2022-01-15T01:58:56.213Z] File "/usr/local/lib/python3.6/site-packages/pymilvus/orm/connections.py", line 156, in connect [2022-01-15T01:58:56.213Z] conn = connect_milvus(**kwargs) [2022-01-15T01:58:56.213Z] File 
"/usr/local/lib/python3.6/site-packages/pymilvus/orm/connections.py", line 144, in connect_milvus [2022-01-15T01:58:56.213Z] gh._wait_for_channel_ready() [2022-01-15T01:58:56.213Z] File "/usr/local/lib/python3.6/site-packages/pymilvus/client/grpc_handler.py", line 93, in _wait_for_channel_ready [2022-01-15T01:58:56.213Z] raise BaseException(f'Fail connecting to server on {self._uri}. Timeout') [2022-01-15T01:58:56.213Z] TypeError: __init__() missing 1 required positional argument: 'message' [2022-01-15T01:58:56.213Z] (api_request.py:35) [2022-01-15T01:58:56.213Z] [2022-01-15 01:58:45 - ERROR - ci_test]: (api_response) : __init__() missing 1 required positional argument: 'message' (api_request.py:36) ``` ### Expected Behavior _No response_ ### Steps To Reproduce _No response_ ### Anything else? _No response_
non_process
connect get typeerror init missing required positional argument message is there an existing issue for this i have searched the existing issues environment markdown milvus version deployment mode standalone or cluster cluster sdk version e g pymilvus pymilvus os ubuntu or centos cpu memory gpu others current behavior failures testcollectionsearch test search with dup primary key linux python usr local bin linux python usr local bin linux python usr local bin self dim auto id true async false dup times pytest mark tags caselabel pytest mark parametrize dup times def test search with dup primary key self dim auto id async dup times target test search with duplicate primary key method insert same data twice search expected search results are de duplicated initialize with data nb ct default nb nq ct default nq collection w insert data insert ids self init collection general prefix true nb auto id auto id dim dim testcases test search py base client base py in init collection general self connect base client base py in connect port param info param port base connections wrapper py in connect check result responsechecker response func name check task check items succ alias alias kwargs run check func check py in run result self assert succ self succ true actual false expect true staticmethod def assert succ actual expect assert actual is expect e assertionerror check func check py assertionerror captured log setup setup client base py start setup test case test search with dup primary key client base py captured log call test case of search interface initialize before test case client base py api request args kwargs host ms pr milvus milvus ci port api request py traceback most recent call last file usr local lib site packages pymilvus client grpc handler py line in wait for channel ready grpc channel ready future self channel result timeout file usr local lib site packages grpc utilities py line in result self block timeout file usr local lib site packages grpc 
utilities py line in block raise grpc futuretimeouterror grpc futuretimeouterror during handling of the above exception another exception occurred traceback most recent call last file home jenkins agent workspace tests python client utils api request py line in inner wrapper res func args kwargs file home jenkins agent workspace tests python client utils api request py line in api request return func arg kwargs file usr local lib site packages pymilvus orm connections py line in connect conn connect milvus kwargs file usr local lib site packages pymilvus orm connections py line in connect milvus gh wait for channel ready file usr local lib site packages pymilvus client grpc handler py line in wait for channel ready raise baseexception f fail connecting to server on self uri timeout typeerror init missing required positional argument message api request py api response init missing required positional argument message api request py expected behavior no response steps to reproduce no response anything else no response
0
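The traceback in the record above ends in `TypeError: __init__() missing 1 required positional argument: 'message'` even though the code appears to call `BaseException` with one string. A plausible minimal reproduction, assuming (this is an assumption, not confirmed by the log) that the name `BaseException` at the raise site is shadowed by a library-defined exception class whose constructor requires two arguments:

```python
# Sketch of the failure mode in the traceback above. Assumption: the raise
# site sees a custom class named BaseException (shadowing the builtin)
# whose __init__ requires (code, message), as an SDK's error type might.

class BaseException(Exception):  # hypothetical library class, not the builtin
    def __init__(self, code, message):
        super().__init__(message)
        self.code = code
        self.message = message

def connect():
    # Only one positional argument is supplied, so instantiation at the
    # raise statement fails with TypeError instead of raising the
    # intended connection-timeout exception.
    raise BaseException("Fail connecting to server. Timeout")

try:
    connect()
except TypeError as err:
    msg = str(err)

print(msg)
```

The symptom matches the log: the caller sees a `TypeError` about the constructor rather than the timeout error the code meant to raise.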
12,326
7,852,409,497
IssuesEvent
2018-06-20 14:32:35
chapel-lang/chapel
https://api.github.com/repos/chapel-lang/chapel
closed
Optimize blocking network atomic optimizations
type: Performance
Even for the blocking histogram version of bale we lag behind the reference version by ~20%. See what we can do to optimize blocking network atomics. Some potential ideas: - [x] avoid duplicate `mreg_for_remote_addr()` calls (#9868) - [x] specialize`do_nic_amo()` for fetching vs non-fetching to avoid extra branches (#9873) - [x] specialize `do_nic_amo()` for 4 vs 8 byte to avoid extra branches? (#9873) - [x] create a specialized `post_fma_and_wait_amo()` that doesn't yield with every attempt (#9876)
True
Optimize blocking network atomic optimizations - Even for the blocking histogram version of bale we lag behind the reference version by ~20%. See what we can do to optimize blocking network atomics. Some potential ideas: - [x] avoid duplicate `mreg_for_remote_addr()` calls (#9868) - [x] specialize`do_nic_amo()` for fetching vs non-fetching to avoid extra branches (#9873) - [x] specialize `do_nic_amo()` for 4 vs 8 byte to avoid extra branches? (#9873) - [x] create a specialized `post_fma_and_wait_amo()` that doesn't yield with every attempt (#9876)
non_process
optimize blocking network atomic optimizations even for the blocking histogram version of bale we lag behind the reference version by see what we can do to optimize blocking network atomics some potential ideas avoid duplicate mreg for remote addr calls specialize do nic amo for fetching vs non fetching to avoid extra branches specialize do nic amo for vs byte to avoid extra branches create a specialized post fma and wait amo that doesn t yield with every attempt
0
9,154
12,214,963,624
IssuesEvent
2020-05-01 11:31:26
qgis/QGIS
https://api.github.com/repos/qgis/QGIS
closed
Autocomplete algorithms in Processing Modeller
Feature Request Processing
Author Name: **Tom Chadwin** (@tomchadwin) Original Redmine Issue: [19284](https://issues.qgis.org/issues/19284) Redmine category:processing/modeller --- This is a UX suggestion from an FME user: "I use the modeler quite a lot and I must admit I find myself typing in the model canvas expecting to see a list of processing tools come up like they do in FME sometimes. GUI improvements and customisation for the modeler would be lovely."
1.0
Autocomplete algorithms in Processing Modeller - Author Name: **Tom Chadwin** (@tomchadwin) Original Redmine Issue: [19284](https://issues.qgis.org/issues/19284) Redmine category:processing/modeller --- This is a UX suggestion from an FME user: "I use the modeler quite a lot and I must admit I find myself typing in the model canvas expecting to see a list of processing tools come up like they do in FME sometimes. GUI improvements and customisation for the modeler would be lovely."
process
autocomplete algorithms in processing modeller author name tom chadwin tomchadwin original redmine issue redmine category processing modeller this is a ux suggestion from an fme user i use the modeler quite a lot and i must admit i find myself typing in the model canvas expecting to see a list of processing tools come up like they do in fme sometimes gui improvements and customisation for the modeler would be lovely
1
430,714
12,464,808,826
IssuesEvent
2020-05-28 13:06:26
UltimateCodeMonkeys/CodeMonkeysMVVM
https://api.github.com/repos/UltimateCodeMonkeys/CodeMonkeysMVVM
closed
Support modal navigation for Xamarin.Forms
Navigation.Forms Priority: High Type: Enhancement
Modal navigation is currently not supported due to the interface being platform-independent.
1.0
Support modal navigation for Xamarin.Forms - Modal navigation is currently not supported due to the interface being platform-independent.
non_process
support modal navigation for xamarin forms modal navigation is currently not supported due to the interface being platform independent
0
3,874
6,812,087,744
IssuesEvent
2017-11-06 00:16:19
learn-anything/maps
https://api.github.com/repos/learn-anything/maps
closed
best path for learning sentiment analysis
natural language processing study plan
Take a look [here](https://my.mindnode.com/BTNhqptfyxynrT4NFoaLAysgHnkKgpC98YGuKqRA#148.7,-175.8,2). If you think there is a better way one can learn sentiment analysis or you think the way the nodes are structured is wrong, please say it here. Also if you think there are some really amazing resources on sentiment analysis that are missing or you wish something was added, you can say it here.
1.0
best path for learning sentiment analysis - Take a look [here](https://my.mindnode.com/BTNhqptfyxynrT4NFoaLAysgHnkKgpC98YGuKqRA#148.7,-175.8,2). If you think there is a better way one can learn sentiment analysis or you think the way the nodes are structured is wrong, please say it here. Also if you think there are some really amazing resources on sentiment analysis that are missing or you wish something was added, you can say it here.
process
best path for learning sentiment analysis take a look if you think there is a better way one can learn sentiment analysis or you think the way the nodes are structured is wrong please say it here also if you think there are some really amazing resources on sentiment analysis that are missing or you wish something was added you can say it here
1
207,101
15,792,672,092
IssuesEvent
2021-04-02 07:33:45
itzg/docker-minecraft-server
https://api.github.com/repos/itzg/docker-minecraft-server
closed
Debian-based images need git to build Spigot from source
status/ready to test
I'm trying to run this in Azure but [this bizarre issue](https://gitlab.alpinelinux.org/alpine/aports/-/issues/10960) is causing problems with the Alpine-based images so I'm trying Debian instead. That's not working either though because it's missing git, which is needed to build Spigot from source. ``` [init] Changing ownership of /data to 1000 ... [init] Running as uid=1000 gid=1000 with /data as 'drwxrwxrwx 2 0 0 0 Mar 30 09:36 /data' [init] Resolved version given LATEST into 1.16.5 [init] Resolving type given SPIGOT [init] Building Spigot 1.16.5 from source, might take a while, get some coffee [init] ....[init] done mv: cannot stat 'spigot-*.jar': No such file or directory [init] ERR failed to build Spigot Loading BuildTools version: git-BuildTools-36c11e9-126 (#126) Java Version: Java 11 Current Path: /data/temp/. Could not successfully run git. Please ensure it is installed and functioning. Cannot run program "git" (in directory "."): error=2, No such file or directory ```
1.0
Debian-based images need git to build Spigot from source - I'm trying to run this in Azure but [this bizarre issue](https://gitlab.alpinelinux.org/alpine/aports/-/issues/10960) is causing problems with the Alpine-based images so I'm trying Debian instead. That's not working either though because it's missing git, which is needed to build Spigot from source. ``` [init] Changing ownership of /data to 1000 ... [init] Running as uid=1000 gid=1000 with /data as 'drwxrwxrwx 2 0 0 0 Mar 30 09:36 /data' [init] Resolved version given LATEST into 1.16.5 [init] Resolving type given SPIGOT [init] Building Spigot 1.16.5 from source, might take a while, get some coffee [init] ....[init] done mv: cannot stat 'spigot-*.jar': No such file or directory [init] ERR failed to build Spigot Loading BuildTools version: git-BuildTools-36c11e9-126 (#126) Java Version: Java 11 Current Path: /data/temp/. Could not successfully run git. Please ensure it is installed and functioning. Cannot run program "git" (in directory "."): error=2, No such file or directory ```
non_process
debian based images need git to build spigot from source i m trying to run this in azure but is causing problems with the alpine based images so i m trying debian instead that s not working either though because it s missing git which is needed to build spigot from source changing ownership of data to running as uid gid with data as drwxrwxrwx mar data resolved version given latest into resolving type given spigot building spigot from source might take a while get some coffee done mv cannot stat spigot jar no such file or directory err failed to build spigot loading buildtools version git buildtools java version java current path data temp could not successfully run git please ensure it is installed and functioning cannot run program git in directory error no such file or directory
0
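Since BuildTools shells out to `git` to clone the Spigot sources ("Could not successfully run git" in the record above), a Debian-based image needs the package installed. A hypothetical fix layer, sketched for a Debian base (package names and cleanup steps assumed, not taken from the actual image):

```dockerfile
# Assumed fix for a Debian-based image: BuildTools needs a working
# `git` on PATH before it can build Spigot from source.
RUN apt-get update \
    && apt-get install -y --no-install-recommends git \
    && rm -rf /var/lib/apt/lists/*
```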
13,535
16,067,041,986
IssuesEvent
2021-04-23 20:59:12
Jeffail/benthos
https://api.github.com/repos/Jeffail/benthos
closed
Raw messages beginning with a number are incorrectly parsed as JSON
bug processors
It looks as though a message of the form `<raw JSON value><some other stuff>` is incorrectly parsed as the JSON contents `<raw JSON value>`, so the raw value `123foo` would be parsed as `123`. This is a problem when using processors that attempt to extract a structured form from messages (such as the `branch` processor), as it'll result in the message being mutated.
1.0
Raw messages beginning with a number are incorrectly parsed as JSON - It looks as though a message of the form `<raw JSON value><some other stuff>` is incorrectly parsed as the JSON contents `<raw JSON value>`, so the raw value `123foo` would be parsed as `123`. This is a problem when using processors that attempt to extract a structured form from messages (such as the `branch` processor), as it'll result in the message being mutated.
process
raw messages beginning with a number are incorrectly parsed as json it looks as though a message of the form is incorrectly parsed as the json contents so the raw value would be parsed as this is a problem when using processors that attempt to extract a structured form from messages such as the branch processor as it ll result in the message being mutated
1
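The failure mode in the record above — a leading JSON value with trailing bytes being accepted, so `123foo` parses as `123` — is characteristic of any parser that stops after the first complete value instead of requiring the whole input to be consumed. A hedged illustration using Python's `json` module (not Benthos's actual Go parser):

```python
import json

decoder = json.JSONDecoder()

# raw_decode() parses the first JSON value and reports where it stopped,
# silently ignoring trailing bytes -- the lenient behaviour behind the bug.
value, end = decoder.raw_decode("123foo")
print(value, end)  # the number 123 is accepted; "foo" is left over

# A strict parse of the same input rejects it, because the full payload
# must be exactly one valid JSON document.
try:
    json.loads("123foo")
    strict_ok = True
except json.JSONDecodeError:
    strict_ok = False
print(strict_ok)
```

The fix direction implied by the bug report is the strict behaviour: only treat a message as structured JSON if the entire raw payload parses.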
133,520
18,894,776,386
IssuesEvent
2021-11-15 16:40:55
department-of-veterans-affairs/va.gov-team
https://api.github.com/repos/department-of-veterans-affairs/va.gov-team
closed
Document Established Form Patterns: Release Plan
vsp-design-system-team
## Guidance _This Release Plan Template is intended to help ensure your new VSP product or feature is ready for launch and will deliver the desired user outcomes._ * _In Phase 1, you will test your product to learn how it'll actually be used, what problems it might create, and then fix/adjust if necessary prior to going live._ * _In Phase 2, you will define the launch communications plan and post-launch KPIs. **You can close this ticket once you have completed Phase 2**._ * _Phase 3 is optional - you may use it to document post-launch results._ _Fill out the **Planning** sections to start, then the **Results** sections as you complete each phase of testing._ --- ## Phase I: User Acceptance Testing Form patterns documentation will be reviewed by Kevin Hoffman and Carol Wong ### Planning: - Desired date range or test duration: each form pattern document will be reviewed over during the sprint - Desired number of users: n/a - The group(s) from which you'll recruit test users: VFS/VSP/DEPO - How you'll recruit test users: n/a - How you'll give test users access to the product: n/a - What "Success" criteria will you look at before launch? By the end of Q1 2021 there will be 10 form patterns documented. - How you'll measure your "Success" criteria? 10 form patterns documents have been added to the design system. ### Results: - Number of users: x - Results per your "Success" criteria: - Number of bugs identified / fixed: x/x - Any UX changes necessary based on the logs, user feedback, or VA challenges? yes/no - If yes, what: lorem ipsum ## Phase II: Go Live! ### Planning: - Desired launch date: - Post-launch KPI 1: - Post-launch KPI 2: - Post-launch KPI 3: ## Phase 3: Post-launch Results [Optional] ### 1-week results: - Number of unique users: - Post-launch KPI 1 actual: - Post-launch KPI 2 actual: - Post-launch KPI 3 actual: - Any issues with VSP handling/processing?: yes/no - If yes, what: - Any UX changes necessary based on the logs, user feedback, or VA challenges? 
yes/no - If yes, what: ### 1-month results: - Number of unique users: - Post-launch KPI 1 actual: - Post-launch KPI 2 actual: - Post-launch KPI 3 actual: - Any issues with VSP handling/processing?: yes/no - If yes, what: - Any UX changes necessary based on the logs, user feedback, or VA challenges? yes/no - If yes, what: ## Post-launch Questions *To be completed once you have gathered your initial set of data, as outlined above.* 1. How do the KPIs you gathered compare to your pre-launch definition(s) of "success"? 2. What qualitative feedback have you gathered from users or other stakeholders, if any? 3. Which of the assumptions you listed in your product outline were/were not validated? 4. How might your product evolve now or in the future based on these results?
1.0
Document Established Form Patterns: Release Plan - ## Guidance _This Release Plan Template is intended to help ensure your new VSP product or feature is ready for launch and will deliver the desired user outcomes._ * _In Phase 1, you will test your product to learn how it'll actually be used, what problems it might create, and then fix/adjust if necessary prior to going live._ * _In Phase 2, you will define the launch communications plan and post-launch KPIs. **You can close this ticket once you have completed Phase 2**._ * _Phase 3 is optional - you may use it to document post-launch results._ _Fill out the **Planning** sections to start, then the **Results** sections as you complete each phase of testing._ --- ## Phase I: User Acceptance Testing Form patterns documentation will be reviewed by Kevin Hoffman and Carol Wong ### Planning: - Desired date range or test duration: each form pattern document will be reviewed over during the sprint - Desired number of users: n/a - The group(s) from which you'll recruit test users: VFS/VSP/DEPO - How you'll recruit test users: n/a - How you'll give test users access to the product: n/a - What "Success" criteria will you look at before launch? By the end of Q1 2021 there will be 10 form patterns documented. - How you'll measure your "Success" criteria? 10 form patterns documents have been added to the design system. ### Results: - Number of users: x - Results per your "Success" criteria: - Number of bugs identified / fixed: x/x - Any UX changes necessary based on the logs, user feedback, or VA challenges? yes/no - If yes, what: lorem ipsum ## Phase II: Go Live! 
### Planning: - Desired launch date: - Post-launch KPI 1: - Post-launch KPI 2: - Post-launch KPI 3: ## Phase 3: Post-launch Results [Optional] ### 1-week results: - Number of unique users: - Post-launch KPI 1 actual: - Post-launch KPI 2 actual: - Post-launch KPI 3 actual: - Any issues with VSP handling/processing?: yes/no - If yes, what: - Any UX changes necessary based on the logs, user feedback, or VA challenges? yes/no - If yes, what: ### 1-month results: - Number of unique users: - Post-launch KPI 1 actual: - Post-launch KPI 2 actual: - Post-launch KPI 3 actual: - Any issues with VSP handling/processing?: yes/no - If yes, what: - Any UX changes necessary based on the logs, user feedback, or VA challenges? yes/no - If yes, what: ## Post-launch Questions *To be completed once you have gathered your initial set of data, as outlined above.* 1. How do the KPIs you gathered compare to your pre-launch definition(s) of "success"? 2. What qualitative feedback have you gathered from users or other stakeholders, if any? 3. Which of the assumptions you listed in your product outline were/were not validated? 4. How might your product evolve now or in the future based on these results?
non_process
document established form patterns release plan guidance this release plan template is intended to help ensure your new vsp product or feature is ready for launch and will deliver the desired user outcomes in phase you will test your product to learn how it ll actually be used what problems it might create and then fix adjust if necessary prior to going live in phase you will define the launch communications plan and post launch kpis you can close this ticket once you have completed phase phase is optional you may use it to document post launch results fill out the planning sections to start then the results sections as you complete each phase of testing phase i user acceptance testing form patterns documentation will be reviewed by kevin hoffman and carol wong planning desired date range or test duration each form pattern document will be reviewed over during the sprint desired number of users n a the group s from which you ll recruit test users vfs vsp depo how you ll recruit test users n a how you ll give test users access to the product n a what success criteria will you look at before launch by the end of there will be form patterns documented how you ll measure your success criteria form patterns documents have been added to the design system results number of users x results per your success criteria number of bugs identified fixed x x any ux changes necessary based on the logs user feedback or va challenges yes no if yes what lorem ipsum phase ii go live planning desired launch date post launch kpi post launch kpi post launch kpi phase post launch results week results number of unique users post launch kpi actual post launch kpi actual post launch kpi actual any issues with vsp handling processing yes no if yes what any ux changes necessary based on the logs user feedback or va challenges yes no if yes what month results number of unique users post launch kpi actual post launch kpi actual post launch kpi actual any issues with vsp handling processing yes no 
if yes what any ux changes necessary based on the logs user feedback or va challenges yes no if yes what post launch questions to be completed once you have gathered your initial set of data as outlined above how do the kpis you gathered compare to your pre launch definition s of success what qualitative feedback have you gathered from users or other stakeholders if any which of the assumptions you listed in your product outline were were not validated how might your product evolve now or in the future based on these results
0
3,469
6,550,965,631
IssuesEvent
2017-09-05 13:14:39
geneontology/go-ontology
https://api.github.com/repos/geneontology/go-ontology
closed
Regulation terms request for 'GO:0043131 erythrocyte enucleation'
cellular processes in progress
Hi , Could I please ask for regulation terms ( particularly the positive one) for 'GO:0043131 erythrocyte enucleation' based on PMID: 25241935? In Figure 5, the authors shown that Trim58 regulates nuclear extrusion during erythropoiesis. Suggested definition: Any process that modulates the frequency, rate or extent of nucleated precursor cells losing their nucleus during erythrocyte maturation. Thanks, Penelope
1.0
Regulation terms request for 'GO:0043131 erythrocyte enucleation' - Hi , Could I please ask for regulation terms ( particularly the positive one) for 'GO:0043131 erythrocyte enucleation' based on PMID: 25241935? In Figure 5, the authors shown that Trim58 regulates nuclear extrusion during erythropoiesis. Suggested definition: Any process that modulates the frequency, rate or extent of nucleated precursor cells losing their nucleus during erythrocyte maturation. Thanks, Penelope
process
regulation terms request for go erythrocyte enucleation hi could i please ask for regulation terms particularly the positive one for go erythrocyte enucleation based on pmid in figure the authors shown that regulates nuclear extrusion during erythropoiesis suggested definition any process that modulates the frequency rate or extent of nucleated precursor cells losing their nucleus during erythrocyte maturation thanks penelope
1
171,024
13,213,105,091
IssuesEvent
2020-08-16 11:03:04
yobnytech/queueone_issues_tracker
https://api.github.com/repos/yobnytech/queueone_issues_tracker
closed
Search in partner app causes unexpected changes in Token view
Fixed coordinator_android_app test and close
I added 3 tokens, searched using phone no of one of them, it filters that token .. but it creates multiple copies of Token it searched ![Screenshot_20200815-233909](https://user-images.githubusercontent.com/61919696/90318970-c6455900-df51-11ea-94b2-5bfa3d6a3d62.jpg)
1.0
Search in partner app causes unexpected changes in Token view - I added 3 tokens, searched using phone no of one of them, it filters that token .. but it creates multiple copies of Token it searched ![Screenshot_20200815-233909](https://user-images.githubusercontent.com/61919696/90318970-c6455900-df51-11ea-94b2-5bfa3d6a3d62.jpg)
non_process
search in partner app causes unexpected changes in token view i added tokens searched using phone no of one of them it filters that token but it creates multiple copies of token it searched
0
17,248
23,033,049,797
IssuesEvent
2022-07-22 15:38:40
open-telemetry/opentelemetry-collector-contrib
https://api.github.com/repos/open-telemetry/opentelemetry-collector-contrib
closed
[processor/resourcedetection] instance hostname missing on GKE with new detector
bug comp:google processor/resourcedetection
**Describe the bug** Before #10347, someone using the `gke` and `gce` detectors would have the instance hostname on GKE. After this change, the new `gcp` detector would not add this attribute. This is a regression on v0.55.0, and also makes it difficult for people to transition to the new detector. Datadog needs this attribute in order to correctly correlate telemetry reported by the Collector with telemetry reported by the Datadog GCP integration, which uses this as part of the hostname. **Steps to reproduce** Add the `gke` and `gce` detectors to your Collector on a GKE cluster. **What did you expect to see?** The `host.name` attribute would be present as it happened on v0.54.0 and below **What did you see instead?** The `host.name` attribute is missing. **What version did you use?** v0.55.0 **What config did you use?** ``` receivers: hostmetrics: scrapers: load: collection_interval: 10s processors: resourcedetection: detectors: [gke, gce] exporters: logging: loglevel: debug service: pipelines: metrics: receivers: [hostmetrics] processors: [resourcedetection] exporters: [logging] ``` **Environment** v0.55.0 Docker image
1.0
[processor/resourcedetection] instance hostname missing on GKE with new detector - **Describe the bug** Before #10347, someone using the `gke` and `gce` detectors would have the instance hostname on GKE. After this change, the new `gcp` detector would not add this attribute. This is a regression on v0.55.0, and also makes it difficult for people to transition to the new detector. Datadog needs this attribute in order to correctly correlate telemetry reported by the Collector with telemetry reported by the Datadog GCP integration, which uses this as part of the hostname. **Steps to reproduce** Add the `gke` and `gce` detectors to your Collector on a GKE cluster. **What did you expect to see?** The `host.name` attribute would be present as it happened on v0.54.0 and below **What did you see instead?** The `host.name` attribute is missing. **What version did you use?** v0.55.0 **What config did you use?** ``` receivers: hostmetrics: scrapers: load: collection_interval: 10s processors: resourcedetection: detectors: [gke, gce] exporters: logging: loglevel: debug service: pipelines: metrics: receivers: [hostmetrics] processors: [resourcedetection] exporters: [logging] ``` **Environment** v0.55.0 Docker image
process
instance hostname missing on gke with new detector describe the bug before someone using the gke and gce detectors would have the instance hostname on gke after this change the new gcp detector would not add this attribute this is a regression on and also makes it difficult for people to transition to the new detector datadog needs this attribute in order to correctly correlate telemetry reported by the collector with telemetry reported by the datadog gcp integration which uses this as part of the hostname steps to reproduce add the gke and gce detectors to your collector on a gke cluster what did you expect to see the host name attribute would be present as it happened on and below what did you see instead the host name attribute is missing what version did you use what config did you use receivers hostmetrics scrapers load collection interval processors resourcedetection detectors exporters logging loglevel debug service pipelines metrics receivers processors exporters environment docker image
1
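Until the `gcp` detector reports `host.name` again, the missing attribute could be set explicitly with the collector's `resource` processor chained after detection. A sketch extending the config quoted in the record above (the hostname value is an assumed placeholder, and this workaround is a suggestion, not part of the original report):

```yaml
processors:
  resourcedetection:
    detectors: [gcp]
  resource:
    attributes:
      - key: host.name
        value: my-gke-node   # assumed placeholder; supply the real node name
        action: upsert

service:
  pipelines:
    metrics:
      receivers: [hostmetrics]
      processors: [resourcedetection, resource]
      exporters: [logging]
```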
213,057
7,245,356,961
IssuesEvent
2018-02-14 17:49:02
geneontology/go-ontology
https://api.github.com/repos/geneontology/go-ontology
closed
Cellular component assembly/disassembly - add missing logical definitions
Other term-related request logical definitions low priority time unknown (external dependencies etc)
Though logical definitions (aka ‘cross-products’) have been added broadly to the ontology, a few stray terms have escaped the effort. Making a note here as my task to look at the branches under ‘cellular component assembly’ and ‘cellular component disassembly’ and add the missing defs. (Note for self: previous work was recorded in a Jira ticket https://www.ebi.ac.uk/panda/jira/browse/GO-178 now closed, but may wish to look into its more recent comments.)
1.0
Cellular component assembly/disassembly - add missing logical definitions - Though logical definitions (aka ‘cross-products’) have been added broadly to the ontology, a few stray terms have escaped the effort. Making a note here as my task to look at the branches under ‘cellular component assembly’ and ‘cellular component disassembly’ and add the missing defs. (Note for self: previous work was recorded in a Jira ticket https://www.ebi.ac.uk/panda/jira/browse/GO-178 now closed, but may wish to look into its more recent comments.)
non_process
cellular component assembly disassembly add missing logical definitions though logical definitions aka ‘cross products’ have been added broadly to the ontology a few stray terms have escaped the effort making a note here as my task to look at the branches under ‘cellular component assembly’ and ‘cellular component disassembly’ and add the missing defs note for self previous work was recorded in a jira ticket now closed but may wish to look into its more recent comments
0
5,812
8,649,338,667
IssuesEvent
2018-11-26 19:06:53
googleapis/google-cloud-python
https://api.github.com/repos/googleapis/google-cloud-python
closed
[Firestore] SDK doesn’t add the google-cloud-resource header.
api: firestore triaged for GA type: process
The Python SDK doesn’t add the google-cloud-resource header.
1.0
[Firestore] SDK doesn’t add the google-cloud-resource header. - The Python SDK doesn’t add the google-cloud-resource header.
process
sdk doesn’t add the google cloud resource header the python sdk doesn’t add the google cloud resource header
1
313,737
23,489,853,843
IssuesEvent
2022-08-17 17:35:39
rust-lang/crates.io
https://api.github.com/repos/rust-lang/crates.io
closed
Is the crates.io policies page outdated when it refers to "Mozilla Legal"?
C-documentation
The [crates.io policies page](https://crates.io/policies) says: > For issues such as DMCA violations, trademark and copyright infringement, Crates.io will respect Mozilla Legal’s decisions with regards to content that is hosted. Should this refer to the Rust Foundation instead? The Foundation [FAQ](https://github.com/rust-lang/foundation-faq-2020/blob/main/FAQ.md) states that legal advice *"such as on how to handle DMCAs for the crates.io team"* is in-scope.
1.0
Is the crates.io policies page outdated when it refers to "Mozilla Legal"? - The [crates.io policies page](https://crates.io/policies) says: > For issues such as DMCA violations, trademark and copyright infringement, Crates.io will respect Mozilla Legal’s decisions with regards to content that is hosted. Should this refer to the Rust Foundation instead? The Foundation [FAQ](https://github.com/rust-lang/foundation-faq-2020/blob/main/FAQ.md) states that legal advice *"such as on how to handle DMCAs for the crates.io team"* is in-scope.
non_process
is the crates io policies page outdated when it refers to mozilla legal the says for issues such as dmca violations trademark and copyright infringement crates io will respect mozilla legal’s decisions with regards to content that is hosted should this refer to the rust foundation instead the foundation states that legal advice such as on how to handle dmcas for the crates io team is in scope
0
625,892
19,769,802,648
IssuesEvent
2022-01-17 08:52:33
merico-dev/lake
https://api.github.com/repos/merico-dev/lake
closed
Redesign config-UI for 'trigger collection'
proposal priority: medium
## Description Users can easily define the scope of data to collect by choosing jira board/gitlab projects/github projects instead of operating Json ## Describe the solution you'd like - [ ] config-ui design - [ ] config-ui implementation
1.0
Redesign config-UI for 'trigger collection' - ## Description Users can easily define the scope of data to collect by choosing jira board/gitlab projects/github projects instead of operating Json ## Describe the solution you'd like - [ ] config-ui design - [ ] config-ui implementation
non_process
redesign config ui for trigger collection description users can easily define the scope of data to collect by choosing jira board gitlab projects github projects instead of operating json describe the solution you d like config ui design config ui implementation
0
29,044
8,269,027,442
IssuesEvent
2018-09-15 00:32:17
WordPress/gutenberg
https://api.github.com/repos/WordPress/gutenberg
closed
Can't npm install. Outdated version of Babel Core.
Build Tooling
**Description** When running npm install i get the following error: `STATUS: Installing and updating NPM packages... npm ERR! code E404 npm ERR! 404 Not Found: @babel/runtime-corejs2@7.0.0-beta.56` It appears that this beta version of babel core is no longer available, as the latest is 7.0.0-rc.1 (https://github.com/babel/babel/releases) **To Reproduce** Steps to reproduce the behavior: Run ./bin/setup-local-env.sh **Expected behavior** Successful npm install with no errors. **Screenshots** If applicable, add screenshots to help explain your problem. **Desktop (please complete the following information):** - OS: Mac OS - Browser: Chome - Version: 67
1.0
Can't npm install. Outdated version of Babel Core. - **Description** When running npm install i get the following error: `STATUS: Installing and updating NPM packages... npm ERR! code E404 npm ERR! 404 Not Found: @babel/runtime-corejs2@7.0.0-beta.56` It appears that this beta version of babel core is no longer available, as the latest is 7.0.0-rc.1 (https://github.com/babel/babel/releases) **To Reproduce** Steps to reproduce the behavior: Run ./bin/setup-local-env.sh **Expected behavior** Successful npm install with no errors. **Screenshots** If applicable, add screenshots to help explain your problem. **Desktop (please complete the following information):** - OS: Mac OS - Browser: Chome - Version: 67
non_process
can t npm install outdated version of babel core description when running npm install i get the following error status installing and updating npm packages npm err code npm err not found babel runtime beta it appears that this beta version of babel core is no longer available as the latest is rc to reproduce steps to reproduce the behavior run bin setup local env sh expected behavior successful npm install with no errors screenshots if applicable add screenshots to help explain your problem desktop please complete the following information os mac os browser chome version
0
5,464
8,328,330,549
IssuesEvent
2018-09-27 00:11:00
ArctosDB/new-collections
https://api.github.com/repos/ArctosDB/new-collections
closed
Ohio Wesleyan University - Notify Arctos Working Group
Application in process MOU draft needed
Forward questionnaire to [Arctos Working Group](arctos-working-group@googlegroups.com) and request volunteers for collection mentor. AWG member can volunteer to act as primary contact, especially if they have similar collections or specific knowledge about a collection; can serve as ‘in kind support’ for collections to help offset costs
1.0
Ohio Wesleyan University - Notify Arctos Working Group - Forward questionnaire to [Arctos Working Group](arctos-working-group@googlegroups.com) and request volunteers for collection mentor. AWG member can volunteer to act as primary contact, especially if they have similar collections or specific knowledge about a collection; can serve as ‘in kind support’ for collections to help offset costs
process
ohio wesleyan university notify arctos working group forward questionnaire to arctos working group googlegroups com and request volunteers for collection mentor awg member can volunteer to act as primary contact especially if they have similar collections or specific knowledge about a collection can serve as ‘in kind support’ for collections to help offset costs
1
4,118
7,059,474,908
IssuesEvent
2018-01-05 01:52:53
nodejs/node
https://api.github.com/repos/nodejs/node
closed
Set windowsHide flag when call child_process.fork in cluster.fork
child_process feature request windows
I'm not very sure about this, actually i'm a newbie to node's cluster, but it doesn't make a lot of sense to me to show many console windows when build a cluster. For a real world scenario, I'm using pm2 with the --instances argument. It opens too many black console windows, which really annoying.
1.0
Set windowsHide flag when call child_process.fork in cluster.fork - I'm not very sure about this, actually i'm a newbie to node's cluster, but it doesn't make a lot of sense to me to show many console windows when build a cluster. For a real world scenario, I'm using pm2 with the --instances argument. It opens too many black console windows, which really annoying.
process
set windowshide flag when call child process fork in cluster fork i m not very sure about this actually i m a newbie to node s cluster but it doesn t make a lot of sense to me to show many console windows when build a cluster for a real world scenario i m using with the instances argument it opens too many black console windows which really annoying
1
21,295
28,494,236,422
IssuesEvent
2023-04-18 13:15:06
GIScience/sketch-map-tool
https://api.github.com/repos/GIScience/sketch-map-tool
closed
Preserve uploaded file name in result raster files
component:upload-processing
Currently after uploading photos/scans of a Sketch Map the raster result will be a zip with all the cropped map frames as GeoTIFF's with numbers as file names. E.g. `0.geotiff`, `1.geotiff`. Expected: Preserve file name of uploaded files to be able to relate raster result files to uploaded files.
1.0
Preserve uploaded file name in result raster files - Currently after uploading photos/scans of a Sketch Map the raster result will be a zip with all the cropped map frames as GeoTIFF's with numbers as file names. E.g. `0.geotiff`, `1.geotiff`. Expected: Preserve file name of uploaded files to be able to relate raster result files to uploaded files.
process
preserve uploaded file name in result raster files currently after uploading photos scans of a sketch map the raster result will be a zip with all the cropped map frames as geotiff s with numbers as file names e g geotiff geotiff expected preserve file name of uploaded files to be able to relate raster result files to uploaded files
1
112,175
17,080,011,530
IssuesEvent
2021-07-08 02:48:13
faizulho/gatsby-starter-docz-netlifycms-1
https://api.github.com/repos/faizulho/gatsby-starter-docz-netlifycms-1
opened
CVE-2020-7656 (Medium) detected in jquery-1.7.2.min.js
security vulnerability
## CVE-2020-7656 - Medium Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>jquery-1.7.2.min.js</b></p></summary> <p>JavaScript library for DOM operations</p> <p>Library home page: <a href="https://cdnjs.cloudflare.com/ajax/libs/jquery/1.7.2/jquery.min.js">https://cdnjs.cloudflare.com/ajax/libs/jquery/1.7.2/jquery.min.js</a></p> <p>Path to dependency file: gatsby-starter-docz-netlifycms-1/node_modules/marked/www/demo.html</p> <p>Path to vulnerable library: gatsby-starter-docz-netlifycms-1/node_modules/marked/www/demo.html</p> <p> Dependency Hierarchy: - :x: **jquery-1.7.2.min.js** (Vulnerable Library) <p>Found in HEAD commit: <a href="https://github.com/faizulho/gatsby-starter-docz-netlifycms-1/commit/70a9e87b1e68c0bef6964284e0899376209b0f3d">70a9e87b1e68c0bef6964284e0899376209b0f3d</a></p> <p>Found in base branch: <b>master</b></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary> <p> jquery prior to 1.9.0 allows Cross-site Scripting attacks via the load method. The load method fails to recognize and remove "<script>" HTML tags that contain a whitespace character, i.e: "</script >", which results in the enclosed script logic to be executed. 
<p>Publish Date: 2020-05-19 <p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-7656>CVE-2020-7656</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>6.1</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: Low - Privileges Required: None - User Interaction: Required - Scope: Changed - Impact Metrics: - Confidentiality Impact: Low - Integrity Impact: Low - Availability Impact: None </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://github.com/advisories/GHSA-q4m3-2j7h-f7xw">https://github.com/advisories/GHSA-q4m3-2j7h-f7xw</a></p> <p>Release Date: 2020-05-28</p> <p>Fix Resolution: jquery - 1.9.0</p> </p> </details> <p></p> *** Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
True
CVE-2020-7656 (Medium) detected in jquery-1.7.2.min.js - ## CVE-2020-7656 - Medium Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>jquery-1.7.2.min.js</b></p></summary> <p>JavaScript library for DOM operations</p> <p>Library home page: <a href="https://cdnjs.cloudflare.com/ajax/libs/jquery/1.7.2/jquery.min.js">https://cdnjs.cloudflare.com/ajax/libs/jquery/1.7.2/jquery.min.js</a></p> <p>Path to dependency file: gatsby-starter-docz-netlifycms-1/node_modules/marked/www/demo.html</p> <p>Path to vulnerable library: gatsby-starter-docz-netlifycms-1/node_modules/marked/www/demo.html</p> <p> Dependency Hierarchy: - :x: **jquery-1.7.2.min.js** (Vulnerable Library) <p>Found in HEAD commit: <a href="https://github.com/faizulho/gatsby-starter-docz-netlifycms-1/commit/70a9e87b1e68c0bef6964284e0899376209b0f3d">70a9e87b1e68c0bef6964284e0899376209b0f3d</a></p> <p>Found in base branch: <b>master</b></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary> <p> jquery prior to 1.9.0 allows Cross-site Scripting attacks via the load method. The load method fails to recognize and remove "<script>" HTML tags that contain a whitespace character, i.e: "</script >", which results in the enclosed script logic to be executed. 
<p>Publish Date: 2020-05-19 <p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-7656>CVE-2020-7656</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>6.1</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: Low - Privileges Required: None - User Interaction: Required - Scope: Changed - Impact Metrics: - Confidentiality Impact: Low - Integrity Impact: Low - Availability Impact: None </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://github.com/advisories/GHSA-q4m3-2j7h-f7xw">https://github.com/advisories/GHSA-q4m3-2j7h-f7xw</a></p> <p>Release Date: 2020-05-28</p> <p>Fix Resolution: jquery - 1.9.0</p> </p> </details> <p></p> *** Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
non_process
cve medium detected in jquery min js cve medium severity vulnerability vulnerable library jquery min js javascript library for dom operations library home page a href path to dependency file gatsby starter docz netlifycms node modules marked www demo html path to vulnerable library gatsby starter docz netlifycms node modules marked www demo html dependency hierarchy x jquery min js vulnerable library found in head commit a href found in base branch master vulnerability details jquery prior to allows cross site scripting attacks via the load method the load method fails to recognize and remove html tags that contain a whitespace character i e which results in the enclosed script logic to be executed publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction required scope changed impact metrics confidentiality impact low integrity impact low availability impact none for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution jquery step up your open source security game with whitesource
0
6,251
3,354,850,262
IssuesEvent
2015-11-18 14:16:01
IQSS/dataverse
https://api.github.com/repos/IQSS/dataverse
opened
Improve automated testing
Component: Code Infrastructure Status: Dev
We'd like to make improvements to our automated testing. I'm going to iterate on this issue description to jot down some ideas which may spawn child issues. - Unit Tests - [ ] Increase code coverage. As of Dataverse 4.2.1 code coverage was 4.54%: https://coveralls.io/builds/4021183 - [ ] Review pull requests from @bencomp for ideas for approaches to testing: https://github.com/IQSS/dataverse/pulls?q=is%3Apr+author%3Abencomp - [ ] Come up with a way to test commands: http://irclog.iq.harvard.edu/dataverse/2015-11-04#i_26750 - Integration Tests - API testing - [ ] Reinstate periodic (at least nightly) runs of integration tests a la https://build.hmdc.harvard.edu:8443/job/apitest.dataverse.org-apitester/ - [ ] Attempt to use @openscholar approach for running integration tests using Travis https://github.com/openscholar/openscholar/blob/SCHOLAR-3.x/.travis.yml - [ ] Work with @leeper on testing the R client: https://github.com/IQSS/dataverse-client-r - [ ] Work with @CenterForOpenScience on testing the Python client: https://github.com/IQSS/dataverse-client-python/issues/10 - Automated browser testing - [ ] Revisit Selenium/Open Sauce: https://github.com/IQSS/dataverse/commit/8a26404 - Documentation - [ ] Update http://guides.dataverse.org/en/latest/developers/testing.html to be more clear about how much automated testing is expected from developers.
1.0
Improve automated testing - We'd like to make improvements to our automated testing. I'm going to iterate on this issue description to jot down some ideas which may spawn child issues. - Unit Tests - [ ] Increase code coverage. As of Dataverse 4.2.1 code coverage was 4.54%: https://coveralls.io/builds/4021183 - [ ] Review pull requests from @bencomp for ideas for approaches to testing: https://github.com/IQSS/dataverse/pulls?q=is%3Apr+author%3Abencomp - [ ] Come up with a way to test commands: http://irclog.iq.harvard.edu/dataverse/2015-11-04#i_26750 - Integration Tests - API testing - [ ] Reinstate periodic (at least nightly) runs of integration tests a la https://build.hmdc.harvard.edu:8443/job/apitest.dataverse.org-apitester/ - [ ] Attempt to use @openscholar approach for running integration tests using Travis https://github.com/openscholar/openscholar/blob/SCHOLAR-3.x/.travis.yml - [ ] Work with @leeper on testing the R client: https://github.com/IQSS/dataverse-client-r - [ ] Work with @CenterForOpenScience on testing the Python client: https://github.com/IQSS/dataverse-client-python/issues/10 - Automated browser testing - [ ] Revisit Selenium/Open Sauce: https://github.com/IQSS/dataverse/commit/8a26404 - Documentation - [ ] Update http://guides.dataverse.org/en/latest/developers/testing.html to be more clear about how much automated testing is expected from developers.
non_process
improve automated testing we d like to make improvements to our automated testing i m going to iterate on this issue description to jot down some ideas which may spawn child issues unit tests increase code coverage as of dataverse code coverage was review pull requests from bencomp for ideas for approaches to testing come up with a way to test commands integration tests api testing reinstate periodic at least nightly runs of integration tests a la attempt to use openscholar approach for running integration tests using travis work with leeper on testing the r client work with centerforopenscience on testing the python client automated browser testing revisit selenium open sauce documentation update to be more clear about how much automated testing is expected from developers
0
250,664
18,901,963,085
IssuesEvent
2021-11-16 02:43:37
PeernetOfficial/Cmd
https://api.github.com/repos/PeernetOfficial/Cmd
closed
Test with non-admin account on Windows
documentation
Test on Windows with non-admin account and document results.
1.0
Test with non-admin account on Windows - Test on Windows with non-admin account and document results.
non_process
test with non admin account on windows test on windows with non admin account and document results
0
13,938
16,706,448,721
IssuesEvent
2021-06-09 10:32:15
paul-buerkner/brms
https://api.github.com/repos/paul-buerkner/brms
closed
brms' GAMs smooth term not supported by emmeans
feature good first issue post-processing
Hi Paul, Opening this issue following Russell's suggestion (https://github.com/rvlenth/emmeans/issues/198#issuecomment-632844645), connected to https://github.com/paul-buerkner/brms/issues/418 Essentially, the smooth term from brms models is not "extracted" (?) correctly, as it does for instance with mgcv: ``` r model <- mgcv::gam(Sepal.Length ~ s(Petal.Length), data=iris) emmeans::ref_grid(model, at = list("Petal.Length"=c(2, 3, 4))) #> 'emmGrid' object with variables: #> Petal.Length = 2, 3, 4 model <- brms::brm(Sepal.Length ~ s(Petal.Length), data=iris, chains=2, iter=200, refresh=0) emmeans::ref_grid(model, at = list("Petal.Length"=c(2, 3, 4))) #> 'emmGrid' object with variables: #> 1 = 1 ``` Russell mentioned a potential resolution: > if it's like the gam method, it just needs an appropriate call to a predict method (see emmeans:::emm_basis.gam) Thanks!
1.0
brms' GAMs smooth term not supported by emmeans - Hi Paul, Opening this issue following Russell's suggestion (https://github.com/rvlenth/emmeans/issues/198#issuecomment-632844645), connected to https://github.com/paul-buerkner/brms/issues/418 Essentially, the smooth term from brms models is not "extracted" (?) correctly, as it does for instance with mgcv: ``` r model <- mgcv::gam(Sepal.Length ~ s(Petal.Length), data=iris) emmeans::ref_grid(model, at = list("Petal.Length"=c(2, 3, 4))) #> 'emmGrid' object with variables: #> Petal.Length = 2, 3, 4 model <- brms::brm(Sepal.Length ~ s(Petal.Length), data=iris, chains=2, iter=200, refresh=0) emmeans::ref_grid(model, at = list("Petal.Length"=c(2, 3, 4))) #> 'emmGrid' object with variables: #> 1 = 1 ``` Russell mentioned a potential resolution: > if it's like the gam method, it just needs an appropriate call to a predict method (see emmeans:::emm_basis.gam) Thanks!
process
brms gams smooth term not supported by emmeans hi paul opening this issue following russell s suggestion connected to essentially the smooth term from brms models is not extracted correctly as it does for instance with mgcv r model mgcv gam sepal length s petal length data iris emmeans ref grid model at list petal length c emmgrid object with variables petal length model brms brm sepal length s petal length data iris chains iter refresh emmeans ref grid model at list petal length c emmgrid object with variables russell mentioned a potential resolution if it s like the gam method it just needs an appropriate call to a predict method see emmeans emm basis gam thanks
1
6,930
10,095,830,419
IssuesEvent
2019-07-27 12:44:47
shirou/gopsutil
https://api.github.com/repos/shirou/gopsutil
closed
process.PidExists() is inefficient, should only check given PID
package:process
Instead of [listing every processes and iterating over each of them](https://github.com/shirou/gopsutil/blob/48177ef5f8809fc72b716c414435f2d4cff8e24d/process/process.go#L134-L147) it should use the OS capabilities to only check the given pid: * Linux: check if `/proc/$PID` exists * BSDs/darwin: check the result of `ps -o pid= -p $PID` * [Windows](https://stackoverflow.com/a/592788): OpenProcess + GetExitCodeProcess * or Linux+BSDs+Darwin: [try `kill -s 0 $PID`, and check the error returned](https://github.com/giampaolo/psutil/blob/a3d6a28be2631cae7f78287b0742bba36338a745/psutil/_psposix.py#L24-L49) This might let us systematically check process existence at the beginning of every process function like Python psutil does/when an error is encountered, for better error messages. As discussed in https://github.com/shirou/gopsutil/pull/605#issuecomment-437794162 _Note: depending of the implementation, it might still be more efficient to keep the current "get all running processes and iterate over them" approach to check for PID existence, this should always be benchmarked._
1.0
process.PidExists() is inefficient, should only check given PID - Instead of [listing every processes and iterating over each of them](https://github.com/shirou/gopsutil/blob/48177ef5f8809fc72b716c414435f2d4cff8e24d/process/process.go#L134-L147) it should use the OS capabilities to only check the given pid: * Linux: check if `/proc/$PID` exists * BSDs/darwin: check the result of `ps -o pid= -p $PID` * [Windows](https://stackoverflow.com/a/592788): OpenProcess + GetExitCodeProcess * or Linux+BSDs+Darwin: [try `kill -s 0 $PID`, and check the error returned](https://github.com/giampaolo/psutil/blob/a3d6a28be2631cae7f78287b0742bba36338a745/psutil/_psposix.py#L24-L49) This might let us systematically check process existence at the beginning of every process function like Python psutil does/when an error is encountered, for better error messages. As discussed in https://github.com/shirou/gopsutil/pull/605#issuecomment-437794162 _Note: depending of the implementation, it might still be more efficient to keep the current "get all running processes and iterate over them" approach to check for PID existence, this should always be benchmarked._
process
process pidexists is inefficient should only check given pid instead of it should use the os capabilities to only check the given pid linux check if proc pid exists bsds darwin check the result of ps o pid p pid openprocess getexitcodeprocess or linux bsds darwin this might let us systematically check process existence at the beginning of every process function like python psutil does when an error is encountered for better error messages as discussed in note depending of the implementation it might still be more efficient to keep the current get all running processes and iterate over them approach to check for pid existence this should always be benchmarked
1
16,954
22,307,749,137
IssuesEvent
2022-06-13 14:23:02
corona-warn-app/cwa-wishlist
https://api.github.com/repos/corona-warn-app/cwa-wishlist
closed
Add test result of my children
mirrored-to-jira Test/Share process ready-to-close Fix 2.21
<!-- Thanks for requesting a feature 🙌 ❤️ Before opening a new issue, please make sure that we do not have any duplicates already open. You can ensure this by searching the issue list for this repository. If there is a duplicate, please close your issue and add a comment to the existing issue instead. --> ## Feature description As a parent I would like to add the test result of my children. If my child had a positive result I would became a person with a higher risk and I would like to warn people I have met. <!--- Provide a detailed description of the feature or improvement you are proposing. What specific solution would you like? What is the expected behaviour? Add any other context, screenshots, or code snippets about the feature request here as well. --> Test if children so not have a qr cose, there it it not possible to add it to the app. It is most likely that if my children have a positive test result that I will have it , too. But that might be just minimum two days later. ## Problem and motivation <!--- Why is this change important to you? What is the problem this feature would solve? How would you use it? How can it benefit other users? --> So in order to break the chain this could help to inform people a lot faster. And reduce the spreading. ## Is this something you're interested in working on <!--- Yes or No --> No, I have not the skill to add it to the app, unfortunately. Related issues: --- #109 --- Internal Tracking ID: [EXPOSUREAPP-2068](https://jira-ibs.wbs.net.sap/browse/EXPOSUREAPP-2068)
1.0
Add test result of my children - <!-- Thanks for requesting a feature 🙌 ❤️ Before opening a new issue, please make sure that we do not have any duplicates already open. You can ensure this by searching the issue list for this repository. If there is a duplicate, please close your issue and add a comment to the existing issue instead. --> ## Feature description As a parent I would like to add the test result of my children. If my child had a positive result I would became a person with a higher risk and I would like to warn people I have met. <!--- Provide a detailed description of the feature or improvement you are proposing. What specific solution would you like? What is the expected behaviour? Add any other context, screenshots, or code snippets about the feature request here as well. --> Test if children so not have a qr cose, there it it not possible to add it to the app. It is most likely that if my children have a positive test result that I will have it , too. But that might be just minimum two days later. ## Problem and motivation <!--- Why is this change important to you? What is the problem this feature would solve? How would you use it? How can it benefit other users? --> So in order to break the chain this could help to inform people a lot faster. And reduce the spreading. ## Is this something you're interested in working on <!--- Yes or No --> No, I have not the skill to add it to the app, unfortunately. Related issues: --- #109 --- Internal Tracking ID: [EXPOSUREAPP-2068](https://jira-ibs.wbs.net.sap/browse/EXPOSUREAPP-2068)
process
add test result of my children thanks for requesting a feature 🙌 ❤️ before opening a new issue please make sure that we do not have any duplicates already open you can ensure this by searching the issue list for this repository if there is a duplicate please close your issue and add a comment to the existing issue instead feature description as a parent i would like to add the test result of my children if my child had a positive result i would became a person with a higher risk and i would like to warn people i have met provide a detailed description of the feature or improvement you are proposing what specific solution would you like what is the expected behaviour add any other context screenshots or code snippets about the feature request here as well test if children so not have a qr cose there it it not possible to add it to the app it is most likely that if my children have a positive test result that i will have it too but that might be just minimum two days later problem and motivation why is this change important to you what is the problem this feature would solve how would you use it how can it benefit other users so in order to break the chain this could help to inform people a lot faster and reduce the spreading is this something you re interested in working on no i have not the skill to add it to the app unfortunately related issues internal tracking id
1
31,878
2,740,796,128
IssuesEvent
2015-04-21 06:21:14
creativedisturbance/public-website
https://api.github.com/repos/creativedisturbance/public-website
closed
RSS Feed for Specific Channels
Disturbances High Priority
Try to do this in LibSyn. Benefits us because we can push to iTunes. We need feeds for both individual channels and all Creative Disturbance podcasts.
1.0
RSS Feed for Specific Channels - Try to do this in LibSyn. Benefits us because we can push to iTunes. We need feeds for both individual channels and all Creative Disturbance podcasts.
non_process
rss feed for specific channels try to do this in libsyn benefits us because we can push to itunes we need feeds for both individual channels and all creative disturbance podcasts
0
246,839
20,919,996,146
IssuesEvent
2022-03-24 16:30:53
microsoft/playwright
https://api.github.com/repos/microsoft/playwright
closed
[Feature] Add option to ignoreHTTPSErrors in webServer
feature-test-runner v1.22
I noticed that playwright cannot correctly specify if the server is running or not and starts tests too fast when there are problems with https. When I have added check for URL and port using opening new window and ignoring HTTPS errors it works fine, so adding an option to ignoreHTTPSErrors in webserver config would do the job.
1.0
[Feature] Add option to ignoreHTTPSErrors in webServer - I noticed that playwright cannot correctly specify if the server is running or not and starts tests too fast when there are problems with https. When I have added check for URL and port using opening new window and ignoring HTTPS errors it works fine, so adding an option to ignoreHTTPSErrors in webserver config would do the job.
non_process
add option to ignorehttpserrors in webserver i noticed that playwright cannot correctly specify if the server is running or not and starts tests too fast when there are problems with https when i have added check for url and port using opening new window and ignoring https errors it works fine so adding an option to ignorehttpserrors in webserver config would do the job
0
3,141
6,195,002,652
IssuesEvent
2017-07-05 11:25:38
geneontology/go-ontology
https://api.github.com/repos/geneontology/go-ontology
closed
GO:0000963 mitochondrial RNA processing parent
matrix PomBase RNA processes
based on this https://github.com/geneontology/go-ontology/issues/11677 presumably GO:0000963 mitochondrial RNA processing should also not be part_of GO:0007005 mitochondrion organization
1.0
GO:0000963 mitochondrial RNA processing parent - based on this https://github.com/geneontology/go-ontology/issues/11677 presumably GO:0000963 mitochondrial RNA processing should also not be part_of GO:0007005 mitochondrion organization
process
go mitochondrial rna processing parent based on this presumably go mitochondrial rna processing should also not be part of go mitochondrion organization
1
15,269
19,249,976,774
IssuesEvent
2021-12-09 03:13:12
pytorch/pytorch
https://api.github.com/repos/pytorch/pytorch
opened
DISABLED test_fs_pool (__main__.TestMultiprocessing)
module: multiprocessing module: flaky-tests
This test has been determined flaky through reruns in CI and its instances are reported in our flaky_tests table here https://metrics.pytorch.org/d/L0r6ErGnk/github-status?orgId=1&from=1636426818307&to=1639018818307&viewPanel=57. ``` ====================================================================== FAIL [5.016s]: test_fs_pool (__main__.TestMultiprocessing) ---------------------------------------------------------------------- Traceback (most recent call last): File "test_multiprocessing.py", line 355, in test_fs_pool self._test_pool(repeat=TEST_REPEATS) File "test_multiprocessing.py", line 327, in _test_pool do_test() File "test_multiprocessing.py", line 206, in __exit__ self.test_case.assertFalse(self.has_shm_files()) AssertionError: True is not false ``` Please look at the table for details from the past 30 days such as * number of failed instances * an example url * which platforms it failed on * the number of times it failed on trunk vs on PRs.
1.0
DISABLED test_fs_pool (__main__.TestMultiprocessing) - This test has been determined flaky through reruns in CI and its instances are reported in our flaky_tests table here https://metrics.pytorch.org/d/L0r6ErGnk/github-status?orgId=1&from=1636426818307&to=1639018818307&viewPanel=57. ``` ====================================================================== FAIL [5.016s]: test_fs_pool (__main__.TestMultiprocessing) ---------------------------------------------------------------------- Traceback (most recent call last): File "test_multiprocessing.py", line 355, in test_fs_pool self._test_pool(repeat=TEST_REPEATS) File "test_multiprocessing.py", line 327, in _test_pool do_test() File "test_multiprocessing.py", line 206, in __exit__ self.test_case.assertFalse(self.has_shm_files()) AssertionError: True is not false ``` Please look at the table for details from the past 30 days such as * number of failed instances * an example url * which platforms it failed on * the number of times it failed on trunk vs on PRs.
process
disabled test fs pool main testmultiprocessing this test has been determined flaky through reruns in ci and its instances are reported in our flaky tests table here fail test fs pool main testmultiprocessing traceback most recent call last file test multiprocessing py line in test fs pool self test pool repeat test repeats file test multiprocessing py line in test pool do test file test multiprocessing py line in exit self test case assertfalse self has shm files assertionerror true is not false please look at the table for details from the past days such as number of failed instances an example url which platforms it failed on the number of times it failed on trunk vs on prs
1
7,957
11,137,566,387
IssuesEvent
2019-12-20 19:42:57
openopps/openopps-platform
https://api.github.com/repos/openopps/openopps-platform
closed
Add "I do not have work experience" option on application
Apply Process Requirements Ready State Dept.
Who: State applicants What: Ability to select that they do not have work experience Why: To allow state to understand if someone didn't fill out info on purpose Issue: State has noted that many students did not enter work experience and think it's because they missed the section. They would like to make work experience required. Acceptance criteria: - Add a checkbox to allow the applicant to indicate "I do not wish to provide or do not have any work experience" - The applicant must either have selected that checkbox or entered at least one row of work experience to continue. (required) - Error messaging will indicate that work experience is required - remove the (optional) beside work experience - Update the intro sentence to read, "Please list your federal, military, professional, and volunteer experience here." Placement: - Add the checkbox under the "Add work experience" link - If the box is selected, gray out the "Add work experience" link - If at least one work experience row is added, do not display the checkbox Work experience not entered yet (current screen shot): ![image](https://user-images.githubusercontent.com/18709918/68772050-21c76c80-05f7-11ea-86cf-7582e86a724f.png) Work experience entered (current screen shot): ![image](https://user-images.githubusercontent.com/18709918/68772085-3146b580-05f7-11ea-8153-81bba8d8981d.png) USAJOBS Resume Builder Checkbox: ![image.png](https://images.zenhubusercontent.com/59ee08f1a468affe6df7cd6f/5f98bbab-2cf9-4d2d-9d44-eeb5874e98c6)
1.0
Add "I do not have work experience" option on application - Who: State applicants What: Ability to select that they do not have work experience Why: To allow state to understand if someone didn't fill out info on purpose Issue: State has noted that many students did not enter work experience and think it's because they missed the section. They would like to make work experience required. Acceptance criteria: - Add a checkbox to allow the applicant to indicate "I do not wish to provide or do not have any work experience" - The applicant must either have selected that checkbox or entered at least one row of work experience to continue. (required) - Error messaging will indicate that work experience is required - remove the (optional) beside work experience - Update the intro sentence to read, "Please list your federal, military, professional, and volunteer experience here." Placement: - Add the checkbox under the "Add work experience" link - If the box is selected, gray out the "Add work experience" link - If at least one work experience row is added, do not display the checkbox Work experience not entered yet (current screen shot): ![image](https://user-images.githubusercontent.com/18709918/68772050-21c76c80-05f7-11ea-86cf-7582e86a724f.png) Work experience entered (current screen shot): ![image](https://user-images.githubusercontent.com/18709918/68772085-3146b580-05f7-11ea-8153-81bba8d8981d.png) USAJOBS Resume Builder Checkbox: ![image.png](https://images.zenhubusercontent.com/59ee08f1a468affe6df7cd6f/5f98bbab-2cf9-4d2d-9d44-eeb5874e98c6)
process
add i do not have work experience option on application who state applicants what ability to select that they do not have work experience why to allow state to understand if someone didn t fill out info on purpose issue state has noted that many students did not enter work experience and think it s because they missed the section they would like to make work experience required acceptance criteria add a checkbox to allow the applicant to indicate i do not wish to provide or do not have any work experience the applicant must either have selected that checkbox or entered at least one row of work experience to continue required error messaging will indicate that work experience is required remove the optional beside work experience update the intro sentence to read please list your federal military professional and volunteer experience here placement add the checkbox under the add work experience link if the box is selected gray out the add work experience link if at least one work experience row is added do not display the checkbox work experience not entered yet current screen shot work experience entered current screen shot usajobs resume builder checkbox
1
16,432
21,315,163,779
IssuesEvent
2022-04-16 06:25:17
MicrosoftDocs/azure-docs
https://api.github.com/repos/MicrosoftDocs/azure-docs
closed
Inability to use the python automationassets module on a windows hybrid runbook worker is a disadvantage
automation/svc triaged cxp doc-enhancement process-automation/subsvc Pri1
At least according to the [pages on managing Python packages](https://docs.microsoft.com/en-us/azure/automation/python-3-packages), the automationassets module can't be installed on a windows machine, so it can't be installed on a windows hybrid runbook worker. If that is indeed the case, it should be called out here as a disadvantage for Python. --- #### Document Details ⚠ *Do not edit this section. It is required for docs.microsoft.com ➟ GitHub issue linking.* * ID: 8081200f-2bf4-db58-c957-c8ab7af5f90b * Version Independent ID: b135cf1a-c391-03e5-41e7-e13571351e91 * Content: [Azure Automation runbook types](https://docs.microsoft.com/en-us/azure/automation/automation-runbook-types) * Content Source: [articles/automation/automation-runbook-types.md](https://github.com/MicrosoftDocs/azure-docs/blob/main/articles/automation/automation-runbook-types.md) * Service: **automation** * Sub-service: **process-automation** * GitHub Login: @SGSneha * Microsoft Alias: **v-ssudhir**
1.0
Inability to use the python automationassets module on a windows hybrid runbook worker is a disadvantage - At least according to the [pages on managing Python packages](https://docs.microsoft.com/en-us/azure/automation/python-3-packages), the automationassets module can't be installed on a windows machine, so it can't be installed on a windows hybrid runbook worker. If that is indeed the case, it should be called out here as a disadvantage for Python. --- #### Document Details ⚠ *Do not edit this section. It is required for docs.microsoft.com ➟ GitHub issue linking.* * ID: 8081200f-2bf4-db58-c957-c8ab7af5f90b * Version Independent ID: b135cf1a-c391-03e5-41e7-e13571351e91 * Content: [Azure Automation runbook types](https://docs.microsoft.com/en-us/azure/automation/automation-runbook-types) * Content Source: [articles/automation/automation-runbook-types.md](https://github.com/MicrosoftDocs/azure-docs/blob/main/articles/automation/automation-runbook-types.md) * Service: **automation** * Sub-service: **process-automation** * GitHub Login: @SGSneha * Microsoft Alias: **v-ssudhir**
process
inability to use the python automationassets module on a windows hybrid runbook worker is a disadvantage at least according to the the automationassets module can t be installed on a windows machine so it can t be installed on a windows hybrid runbook worker if that is indeed the case it should be called out here as a disadvantage for python document details ⚠ do not edit this section it is required for docs microsoft com ➟ github issue linking id version independent id content content source service automation sub service process automation github login sgsneha microsoft alias v ssudhir
1
9,110
12,192,972,926
IssuesEvent
2020-04-29 13:45:26
naoki-shigehisa/paper
https://api.github.com/repos/naoki-shigehisa/paper
opened
Decomposing feature-level variation with Covariate Gaussian Process Latent Variable Models
2018 Gaussian Process
## 0. 論文 タイトル:[Decomposing feature-level variation with Covariate Gaussian Process Latent Variable Models](https://arxiv.org/abs/1810.06983) 著者: ![スクリーンショット 2020-04-29 22 44 07](https://user-images.githubusercontent.com/43877096/80603193-077d3000-8a6b-11ea-9017-1e2614edb907.png) arXiv投稿日:2018/10/16 学会/ジャーナル:ICML 2019 ## 1. どんなもの? ## 2. 先行研究と比べてどこがすごい? ## 3. 技術や手法のキモはどこ? ## 4. どうやって有効だと検証した? ## 5. 議論はある? ## 6. 次に読むべき論文は?
1.0
Decomposing feature-level variation with Covariate Gaussian Process Latent Variable Models - ## 0. 論文 タイトル:[Decomposing feature-level variation with Covariate Gaussian Process Latent Variable Models](https://arxiv.org/abs/1810.06983) 著者: ![スクリーンショット 2020-04-29 22 44 07](https://user-images.githubusercontent.com/43877096/80603193-077d3000-8a6b-11ea-9017-1e2614edb907.png) arXiv投稿日:2018/10/16 学会/ジャーナル:ICML 2019 ## 1. どんなもの? ## 2. 先行研究と比べてどこがすごい? ## 3. 技術や手法のキモはどこ? ## 4. どうやって有効だと検証した? ## 5. 議論はある? ## 6. 次に読むべき論文は?
process
decomposing feature level variation with covariate gaussian process latent variable models 論文 タイトル: 著者: arxiv投稿日: 学会 ジャーナル:icml どんなもの? 先行研究と比べてどこがすごい? 技術や手法のキモはどこ? どうやって有効だと検証した? 議論はある? 次に読むべき論文は?
1
625,637
19,759,343,452
IssuesEvent
2022-01-16 05:58:56
os-climate/os_c_data_commons
https://api.github.com/repos/os-climate/os_c_data_commons
opened
Rebuild Data Commons PROD / DEV on separate clusters
high priority
Going forward we want proper isolation between DEV and PROD environments for Data Commons, which means: - Configure two separate clusters and trino instances for DEV and PROD. This also means having two sets of configuration-as-code under os-climate/os_c_data_commons repository to drive deployment separately. Note: iceberg not being a component of ODH, it means we should drive the icerberg deployment fully from os-c. - Have a single catalog for DEV and PROD, based on an iceberg volume. - Re-build and re-run all data pipelines on the new DEV environment - Develop and implement a proper promotion process to PROD for data pipelines @MichaelTiemannOSC @erikerlandson
1.0
Rebuild Data Commons PROD / DEV on separate clusters - Going forward we want proper isolation between DEV and PROD environments for Data Commons, which means: - Configure two separate clusters and trino instances for DEV and PROD. This also means having two sets of configuration-as-code under os-climate/os_c_data_commons repository to drive deployment separately. Note: iceberg not being a component of ODH, it means we should drive the icerberg deployment fully from os-c. - Have a single catalog for DEV and PROD, based on an iceberg volume. - Re-build and re-run all data pipelines on the new DEV environment - Develop and implement a proper promotion process to PROD for data pipelines @MichaelTiemannOSC @erikerlandson
non_process
rebuild data commons prod dev on separate clusters going forward we want proper isolation between dev and prod environments for data commons which means configure two separate clusters and trino instances for dev and prod this also means having two sets of configuration as code under os climate os c data commons repository to drive deployment separately note iceberg not being a component of odh it means we should drive the icerberg deployment fully from os c have a single catalog for dev and prod based on an iceberg volume re build and re run all data pipelines on the new dev environment develop and implement a proper promotion process to prod for data pipelines michaeltiemannosc erikerlandson
0
14,869
3,428,000,329
IssuesEvent
2015-12-10 06:33:55
dotnet/wcf
https://api.github.com/repos/dotnet/wcf
closed
VerifyServiceIdentityMatchDnsEndpointIdentity test fails CoreCLR ToF
test bug TOF
Running Test: IdentityTests.VerifyServiceIdentityMatchDnsEndpointIdentity Caught Unexpected exception:System.NotSupportedException: The requested security protocol is not supported. at System.Net.SecurityProtocol.ThrowOnNotAllowed(SslProtocols protocols, Boolean allowNone) in E:\ProjectK\src\ndp\fxcore\Open\src\Common\src\System\Net\SecurityProtocol.cs:line 21 at System.Net.Security.SslStream.AuthenticateAsClientAsync(String targetHost, X509CertificateCollection clientCertificates, SslProtocols enabledSslProtocols, Boolean checkCertificateRevocation) in E:\ProjectK\src\ndp\fxcore\Open\src\System.Net.Security\src\System\Net\SecureProtocols\SslStream.cs:line 201 at System.ServiceModel.Channels.SslStreamSecurityUpgradeInitiator.d__19.MoveNext() in E:\ProjectK\src\ndp\fxcore\WcfOpen\src\System.Private.ServiceModel\src\System\ServiceModel\Channels\SslStreamSecurityUpgradeProvider.cs:line 602 --- End of stack trace from previous location where exception was thrown --- at System.Runtime.CompilerServices.TaskAwaiter.ThrowForNonSuccess(Task task) in E:\ProjectK\src\NDP\clr\src\mscorlib\src\System\Runtime\CompilerServices\TaskAwaiter.cs:line 172 at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task) in E:\ProjectK\src\NDP\clr\src\mscorlib\src\System\Runtime\CompilerServices\TaskAwaiter.cs:line 152 at System.ServiceModel.Channels.StreamSecurityUpgradeInitiatorBase.d__14.MoveNext() in E:\ProjectK\src\ndp\fxcore\WcfOpen\src\System.Private.ServiceModel\src\System\ServiceModel\Channels\StreamSecurityUpgradeInitiatorBase.cs:line 91 --- End of stack trace from previous location where exception was thrown --- at System.Runtime.CompilerServices.TaskAwaiter.ThrowForNonSuccess(Task task) in E:\ProjectK\src\NDP\clr\src\mscorlib\src\System\Runtime\CompilerServices\TaskAwaiter.cs:line 172 at System.ServiceModel.Channels.ConnectionUpgradeHelper.d__3.MoveNext() in 
E:\ProjectK\src\ndp\fxcore\WcfOpen\src\System.Private.ServiceModel\src\System\ServiceModel\Channels\FramingChannels.cs:line 581 --- End of stack trace from previous location where exception was thrown --- at System.Runtime.CompilerServices.TaskAwaiter.ThrowForNonSuccess(Task task) in E:\ProjectK\src\NDP\clr\src\mscorlib\src\System\Runtime\CompilerServices\TaskAwaiter.cs:line 172 at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task) in E:\ProjectK\src\NDP\clr\src\mscorlib\src\System\Runtime\CompilerServices\TaskAwaiter.cs:line 152 at System.Runtime.CompilerServices.TaskAwaiter.ValidateEnd(Task task) in E:\ProjectK\src\NDP\clr\src\mscorlib\src\System\Runtime\CompilerServices\TaskAwaiter.cs:line 124 at System.ServiceModel.Channels.ClientFramingDuplexSessionChannel.d__10.MoveNext() in E:\ProjectK\src\ndp\fxcore\WcfOpen\src\System.Private.ServiceModel\src\System\ServiceModel\Channels\FramingChannels.cs:line 256 --- End of stack trace from previous location where exception was thrown --- at System.Runtime.CompilerServices.TaskAwaiter.ThrowForNonSuccess(Task task) in E:\ProjectK\src\NDP\clr\src\mscorlib\src\System\Runtime\CompilerServices\TaskAwaiter.cs:line 172 at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task) in E:\ProjectK\src\NDP\clr\src\mscorlib\src\System\Runtime\CompilerServices\TaskAwaiter.cs:line 152 at System.ServiceModel.Channels.ConnectionPoolHelper.d__15.MoveNext() in E:\ProjectK\src\ndp\fxcore\WcfOpen\src\System.Private.ServiceModel\src\System\ServiceModel\Channels\ConnectionPoolHelper.cs:line 113 --- End of stack trace from previous location where exception was thrown --- at System.Runtime.CompilerServices.TaskAwaiter.ThrowForNonSuccess(Task task) in E:\ProjectK\src\NDP\clr\src\mscorlib\src\System\Runtime\CompilerServices\TaskAwaiter.cs:line 172 at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task) in 
E:\ProjectK\src\NDP\clr\src\mscorlib\src\System\Runtime\CompilerServices\TaskAwaiter.cs:line 152 at System.ServiceModel.Channels.ClientFramingDuplexSessionChannel.d__12.MoveNext() in E:\ProjectK\src\ndp\fxcore\WcfOpen\src\System.Private.ServiceModel\src\System\ServiceModel\Channels\FramingChannels.cs:line 319 --- End of stack trace from previous location where exception was thrown --- at System.Runtime.CompilerServices.TaskAwaiter.ThrowForNonSuccess(Task task) in E:\ProjectK\src\NDP\clr\src\mscorlib\src\System\Runtime\CompilerServices\TaskAwaiter.cs:line 172 at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task) in E:\ProjectK\src\NDP\clr\src\mscorlib\src\System\Runtime\CompilerServices\TaskAwaiter.cs:line 152 at System.ServiceModel.Channels.CommunicationObject.d__71.MoveNext() in E:\ProjectK\src\ndp\fxcore\WcfOpen\src\System.Private.ServiceModel\src\System\ServiceModel\Channels\CommunicationObject.cs:line 348 --- End of stack trace from previous location where exception was thrown --- at System.Runtime.CompilerServices.TaskAwaiter.ThrowForNonSuccess(Task task) in E:\ProjectK\src\NDP\clr\src\mscorlib\src\System\Runtime\CompilerServices\TaskAwaiter.cs:line 172 at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task) in E:\ProjectK\src\NDP\clr\src\mscorlib\src\System\Runtime\CompilerServices\TaskAwaiter.cs:line 152 at System.ServiceModel.Channels.ServiceChannel.d__149.MoveNext() in E:\ProjectK\src\ndp\fxcore\WcfOpen\src\System.Private.ServiceModel\src\System\ServiceModel\Channels\ServiceChannel.cs:line 1351 --- End of stack trace from previous location where exception was thrown --- at System.Runtime.CompilerServices.TaskAwaiter.ThrowForNonSuccess(Task task) in E:\ProjectK\src\NDP\clr\src\mscorlib\src\System\Runtime\CompilerServices\TaskAwaiter.cs:line 172 at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task) in 
E:\ProjectK\src\NDP\clr\src\mscorlib\src\System\Runtime\CompilerServices\TaskAwaiter.cs:line 152 at System.ServiceModel.Channels.CommunicationObject.d__71.MoveNext() in E:\ProjectK\src\ndp\fxcore\WcfOpen\src\System.Private.ServiceModel\src\System\ServiceModel\Channels\CommunicationObject.cs:line 348 --- End of stack trace from previous location where exception was thrown --- at System.Runtime.CompilerServices.TaskAwaiter.ThrowForNonSuccess(Task task) in E:\ProjectK\src\NDP\clr\src\mscorlib\src\System\Runtime\CompilerServices\TaskAwaiter.cs:line 172 at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task) in E:\ProjectK\src\NDP\clr\src\mscorlib\src\System\Runtime\CompilerServices\TaskAwaiter.cs:line 152 at System.ServiceModel.Channels.CommunicationObject.d__70.MoveNext() in E:\ProjectK\src\ndp\fxcore\WcfOpen\src\System.Private.ServiceModel\src\System\ServiceModel\Channels\CommunicationObject.cs:line 323 --- End of stack trace from previous location where exception was thrown --- at System.Runtime.CompilerServices.TaskAwaiter.ThrowForNonSuccess(Task task) in E:\ProjectK\src\NDP\clr\src\mscorlib\src\System\Runtime\CompilerServices\TaskAwaiter.cs:line 172 at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task) in E:\ProjectK\src\NDP\clr\src\mscorlib\src\System\Runtime\CompilerServices\TaskAwaiter.cs:line 152 at System.ServiceModel.Channels.ServiceChannel.CallOpenOnce.System.ServiceModel.Channels.ServiceChannel.ICallOnce.Call(ServiceChannel channel, TimeSpan timeout) in E:\ProjectK\src\ndp\fxcore\WcfOpen\src\System.Private.ServiceModel\src\System\ServiceModel\Channels\ServiceChannel.cs:line 1976 at System.ServiceModel.Channels.ServiceChannel.CallOnceManager.CallOnce(TimeSpan timeout, CallOnceManager cascade) in E:\ProjectK\src\ndp\fxcore\WcfOpen\src\System.Private.ServiceModel\src\System\ServiceModel\Channels\ServiceChannel.cs:line 2043 at 
System.ServiceModel.Channels.ServiceChannel.EnsureOpened(TimeSpan timeout) in E:\ProjectK\src\ndp\fxcore\WcfOpen\src\System.Private.ServiceModel\src\System\ServiceModel\Channels\ServiceChannel.cs:line 502 at System.ServiceModel.Channels.ServiceChannel.Call(String action, Boolean oneway, ProxyOperationRuntime operation, Object[] ins, Object[] outs, TimeSpan timeout) in E:\ProjectK\src\ndp\fxcore\WcfOpen\src\System.Private.ServiceModel\src\System\ServiceModel\Channels\ServiceChannel.cs:line 747 at System.ServiceModel.Channels.ServiceChannelProxy.InvokeService(MethodCall methodCall, ProxyOperationRuntime operation) in E:\ProjectK\src\ndp\fxcore\WcfOpen\src\System.Private.ServiceModel\src\System\ServiceModel\Channels\ServiceChannelProxy.cs:line 371 at System.ServiceModel.Channels.ServiceChannelProxy.Invoke(MethodInfo targetMethod, Object[] args) in E:\ProjectK\src\ndp\fxcore\WcfOpen\src\System.Private.ServiceModel\src\System\ServiceModel\Channels\ServiceChannelProxy.cs:line 136 --- End of stack trace from previous location where exception was thrown --- at System.Reflection.DispatchProxyGenerator.Invoke(Object[] args) in E:\ProjectK\src\ndp\fxcore\Open\src\System.Reflection.DispatchProxy\src\System\Reflection\DispatchProxyGenerator.cs:line 161 at generatedProxy_2.Echo(String ) at IdentityTests.VerifyServiceIdentityMatchDnsEndpointIdentity() in e:\ProjectK\src\NDP\FxCore\WcfOpen\src\System.Private.ServiceModel\tests\Scenarios\Security\TransportSecurity\Tcp\IdentityTests.cs:line 28 at EntryPointMain.b__13() in e:\ProjectK\src\QA\ToF\PK\x86\rel\IL\FX\Conformance\System.Private.ServiceModel\4.0.0.0\WcfOpen\Scenarios\Security\TransportSecurity\Security.TransportSecurity.Tests\Security.TransportSecurity.Tests.main.cs:line 32 at CoreFXTestLibrary.Internal.Runner.RunTestMethod(TestInfo t) in e:\ProjectK\src\QA\ToF\tests\FX\Conformance\Common\Xunit\Internal\Runner.cs:line 170 at CoreFXTestLibrary.Internal.Runner.RunTest(TestInfo t) in 
e:\ProjectK\src\QA\ToF\tests\FX\Conformance\Common\Xunit\Internal\Runner.cs:line 95 ---- Test FAILED ---------------
1.0
VerifyServiceIdentityMatchDnsEndpointIdentity test fails CoreCLR ToF - Running Test: IdentityTests.VerifyServiceIdentityMatchDnsEndpointIdentity Caught Unexpected exception:System.NotSupportedException: The requested security protocol is not supported. at System.Net.SecurityProtocol.ThrowOnNotAllowed(SslProtocols protocols, Boolean allowNone) in E:\ProjectK\src\ndp\fxcore\Open\src\Common\src\System\Net\SecurityProtocol.cs:line 21 at System.Net.Security.SslStream.AuthenticateAsClientAsync(String targetHost, X509CertificateCollection clientCertificates, SslProtocols enabledSslProtocols, Boolean checkCertificateRevocation) in E:\ProjectK\src\ndp\fxcore\Open\src\System.Net.Security\src\System\Net\SecureProtocols\SslStream.cs:line 201 at System.ServiceModel.Channels.SslStreamSecurityUpgradeInitiator.d__19.MoveNext() in E:\ProjectK\src\ndp\fxcore\WcfOpen\src\System.Private.ServiceModel\src\System\ServiceModel\Channels\SslStreamSecurityUpgradeProvider.cs:line 602 --- End of stack trace from previous location where exception was thrown --- at System.Runtime.CompilerServices.TaskAwaiter.ThrowForNonSuccess(Task task) in E:\ProjectK\src\NDP\clr\src\mscorlib\src\System\Runtime\CompilerServices\TaskAwaiter.cs:line 172 at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task) in E:\ProjectK\src\NDP\clr\src\mscorlib\src\System\Runtime\CompilerServices\TaskAwaiter.cs:line 152 at System.ServiceModel.Channels.StreamSecurityUpgradeInitiatorBase.d__14.MoveNext() in E:\ProjectK\src\ndp\fxcore\WcfOpen\src\System.Private.ServiceModel\src\System\ServiceModel\Channels\StreamSecurityUpgradeInitiatorBase.cs:line 91 --- End of stack trace from previous location where exception was thrown --- at System.Runtime.CompilerServices.TaskAwaiter.ThrowForNonSuccess(Task task) in E:\ProjectK\src\NDP\clr\src\mscorlib\src\System\Runtime\CompilerServices\TaskAwaiter.cs:line 172 at System.ServiceModel.Channels.ConnectionUpgradeHelper.d__3.MoveNext() in 
E:\ProjectK\src\ndp\fxcore\WcfOpen\src\System.Private.ServiceModel\src\System\ServiceModel\Channels\FramingChannels.cs:line 581 --- End of stack trace from previous location where exception was thrown --- at System.Runtime.CompilerServices.TaskAwaiter.ThrowForNonSuccess(Task task) in E:\ProjectK\src\NDP\clr\src\mscorlib\src\System\Runtime\CompilerServices\TaskAwaiter.cs:line 172 at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task) in E:\ProjectK\src\NDP\clr\src\mscorlib\src\System\Runtime\CompilerServices\TaskAwaiter.cs:line 152 at System.Runtime.CompilerServices.TaskAwaiter.ValidateEnd(Task task) in E:\ProjectK\src\NDP\clr\src\mscorlib\src\System\Runtime\CompilerServices\TaskAwaiter.cs:line 124 at System.ServiceModel.Channels.ClientFramingDuplexSessionChannel.d__10.MoveNext() in E:\ProjectK\src\ndp\fxcore\WcfOpen\src\System.Private.ServiceModel\src\System\ServiceModel\Channels\FramingChannels.cs:line 256 --- End of stack trace from previous location where exception was thrown --- at System.Runtime.CompilerServices.TaskAwaiter.ThrowForNonSuccess(Task task) in E:\ProjectK\src\NDP\clr\src\mscorlib\src\System\Runtime\CompilerServices\TaskAwaiter.cs:line 172 at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task) in E:\ProjectK\src\NDP\clr\src\mscorlib\src\System\Runtime\CompilerServices\TaskAwaiter.cs:line 152 at System.ServiceModel.Channels.ConnectionPoolHelper.d__15.MoveNext() in E:\ProjectK\src\ndp\fxcore\WcfOpen\src\System.Private.ServiceModel\src\System\ServiceModel\Channels\ConnectionPoolHelper.cs:line 113 --- End of stack trace from previous location where exception was thrown --- at System.Runtime.CompilerServices.TaskAwaiter.ThrowForNonSuccess(Task task) in E:\ProjectK\src\NDP\clr\src\mscorlib\src\System\Runtime\CompilerServices\TaskAwaiter.cs:line 172 at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task) in 
E:\ProjectK\src\NDP\clr\src\mscorlib\src\System\Runtime\CompilerServices\TaskAwaiter.cs:line 152 at System.ServiceModel.Channels.ClientFramingDuplexSessionChannel.d__12.MoveNext() in E:\ProjectK\src\ndp\fxcore\WcfOpen\src\System.Private.ServiceModel\src\System\ServiceModel\Channels\FramingChannels.cs:line 319 --- End of stack trace from previous location where exception was thrown --- at System.Runtime.CompilerServices.TaskAwaiter.ThrowForNonSuccess(Task task) in E:\ProjectK\src\NDP\clr\src\mscorlib\src\System\Runtime\CompilerServices\TaskAwaiter.cs:line 172 at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task) in E:\ProjectK\src\NDP\clr\src\mscorlib\src\System\Runtime\CompilerServices\TaskAwaiter.cs:line 152 at System.ServiceModel.Channels.CommunicationObject.d__71.MoveNext() in E:\ProjectK\src\ndp\fxcore\WcfOpen\src\System.Private.ServiceModel\src\System\ServiceModel\Channels\CommunicationObject.cs:line 348 --- End of stack trace from previous location where exception was thrown --- at System.Runtime.CompilerServices.TaskAwaiter.ThrowForNonSuccess(Task task) in E:\ProjectK\src\NDP\clr\src\mscorlib\src\System\Runtime\CompilerServices\TaskAwaiter.cs:line 172 at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task) in E:\ProjectK\src\NDP\clr\src\mscorlib\src\System\Runtime\CompilerServices\TaskAwaiter.cs:line 152 at System.ServiceModel.Channels.ServiceChannel.d__149.MoveNext() in E:\ProjectK\src\ndp\fxcore\WcfOpen\src\System.Private.ServiceModel\src\System\ServiceModel\Channels\ServiceChannel.cs:line 1351 --- End of stack trace from previous location where exception was thrown --- at System.Runtime.CompilerServices.TaskAwaiter.ThrowForNonSuccess(Task task) in E:\ProjectK\src\NDP\clr\src\mscorlib\src\System\Runtime\CompilerServices\TaskAwaiter.cs:line 172 at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task) in 
E:\ProjectK\src\NDP\clr\src\mscorlib\src\System\Runtime\CompilerServices\TaskAwaiter.cs:line 152 at System.ServiceModel.Channels.CommunicationObject.d__71.MoveNext() in E:\ProjectK\src\ndp\fxcore\WcfOpen\src\System.Private.ServiceModel\src\System\ServiceModel\Channels\CommunicationObject.cs:line 348 --- End of stack trace from previous location where exception was thrown --- at System.Runtime.CompilerServices.TaskAwaiter.ThrowForNonSuccess(Task task) in E:\ProjectK\src\NDP\clr\src\mscorlib\src\System\Runtime\CompilerServices\TaskAwaiter.cs:line 172 at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task) in E:\ProjectK\src\NDP\clr\src\mscorlib\src\System\Runtime\CompilerServices\TaskAwaiter.cs:line 152 at System.ServiceModel.Channels.CommunicationObject.d__70.MoveNext() in E:\ProjectK\src\ndp\fxcore\WcfOpen\src\System.Private.ServiceModel\src\System\ServiceModel\Channels\CommunicationObject.cs:line 323 --- End of stack trace from previous location where exception was thrown --- at System.Runtime.CompilerServices.TaskAwaiter.ThrowForNonSuccess(Task task) in E:\ProjectK\src\NDP\clr\src\mscorlib\src\System\Runtime\CompilerServices\TaskAwaiter.cs:line 172 at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task) in E:\ProjectK\src\NDP\clr\src\mscorlib\src\System\Runtime\CompilerServices\TaskAwaiter.cs:line 152 at System.ServiceModel.Channels.ServiceChannel.CallOpenOnce.System.ServiceModel.Channels.ServiceChannel.ICallOnce.Call(ServiceChannel channel, TimeSpan timeout) in E:\ProjectK\src\ndp\fxcore\WcfOpen\src\System.Private.ServiceModel\src\System\ServiceModel\Channels\ServiceChannel.cs:line 1976 at System.ServiceModel.Channels.ServiceChannel.CallOnceManager.CallOnce(TimeSpan timeout, CallOnceManager cascade) in E:\ProjectK\src\ndp\fxcore\WcfOpen\src\System.Private.ServiceModel\src\System\ServiceModel\Channels\ServiceChannel.cs:line 2043 at 
System.ServiceModel.Channels.ServiceChannel.EnsureOpened(TimeSpan timeout) in E:\ProjectK\src\ndp\fxcore\WcfOpen\src\System.Private.ServiceModel\src\System\ServiceModel\Channels\ServiceChannel.cs:line 502 at System.ServiceModel.Channels.ServiceChannel.Call(String action, Boolean oneway, ProxyOperationRuntime operation, Object[] ins, Object[] outs, TimeSpan timeout) in E:\ProjectK\src\ndp\fxcore\WcfOpen\src\System.Private.ServiceModel\src\System\ServiceModel\Channels\ServiceChannel.cs:line 747 at System.ServiceModel.Channels.ServiceChannelProxy.InvokeService(MethodCall methodCall, ProxyOperationRuntime operation) in E:\ProjectK\src\ndp\fxcore\WcfOpen\src\System.Private.ServiceModel\src\System\ServiceModel\Channels\ServiceChannelProxy.cs:line 371 at System.ServiceModel.Channels.ServiceChannelProxy.Invoke(MethodInfo targetMethod, Object[] args) in E:\ProjectK\src\ndp\fxcore\WcfOpen\src\System.Private.ServiceModel\src\System\ServiceModel\Channels\ServiceChannelProxy.cs:line 136 --- End of stack trace from previous location where exception was thrown --- at System.Reflection.DispatchProxyGenerator.Invoke(Object[] args) in E:\ProjectK\src\ndp\fxcore\Open\src\System.Reflection.DispatchProxy\src\System\Reflection\DispatchProxyGenerator.cs:line 161 at generatedProxy_2.Echo(String ) at IdentityTests.VerifyServiceIdentityMatchDnsEndpointIdentity() in e:\ProjectK\src\NDP\FxCore\WcfOpen\src\System.Private.ServiceModel\tests\Scenarios\Security\TransportSecurity\Tcp\IdentityTests.cs:line 28 at EntryPointMain.b__13() in e:\ProjectK\src\QA\ToF\PK\x86\rel\IL\FX\Conformance\System.Private.ServiceModel\4.0.0.0\WcfOpen\Scenarios\Security\TransportSecurity\Security.TransportSecurity.Tests\Security.TransportSecurity.Tests.main.cs:line 32 at CoreFXTestLibrary.Internal.Runner.RunTestMethod(TestInfo t) in e:\ProjectK\src\QA\ToF\tests\FX\Conformance\Common\Xunit\Internal\Runner.cs:line 170 at CoreFXTestLibrary.Internal.Runner.RunTest(TestInfo t) in 
e:\ProjectK\src\QA\ToF\tests\FX\Conformance\Common\Xunit\Internal\Runner.cs:line 95 ---- Test FAILED ---------------
non_process
verifyserviceidentitymatchdnsendpointidentity test fails coreclr tof running test identitytests verifyserviceidentitymatchdnsendpointidentity caught unexpected exception system notsupportedexception the requested security protocol is not supported at system net securityprotocol throwonnotallowed sslprotocols protocols boolean allownone in e projectk src ndp fxcore open src common src system net securityprotocol cs line at system net security sslstream authenticateasclientasync string targethost clientcertificates sslprotocols enabledsslprotocols boolean checkcertificaterevocation in e projectk src ndp fxcore open src system net security src system net secureprotocols sslstream cs line at system servicemodel channels sslstreamsecurityupgradeinitiator d movenext in e projectk src ndp fxcore wcfopen src system private servicemodel src system servicemodel channels sslstreamsecurityupgradeprovider cs line end of stack trace from previous location where exception was thrown at system runtime compilerservices taskawaiter throwfornonsuccess task task in e projectk src ndp clr src mscorlib src system runtime compilerservices taskawaiter cs line at system runtime compilerservices taskawaiter handlenonsuccessanddebuggernotification task task in e projectk src ndp clr src mscorlib src system runtime compilerservices taskawaiter cs line at system servicemodel channels streamsecurityupgradeinitiatorbase d movenext in e projectk src ndp fxcore wcfopen src system private servicemodel src system servicemodel channels streamsecurityupgradeinitiatorbase cs line end of stack trace from previous location where exception was thrown at system runtime compilerservices taskawaiter throwfornonsuccess task task in e projectk src ndp clr src mscorlib src system runtime compilerservices taskawaiter cs line at system servicemodel channels connectionupgradehelper d movenext in e projectk src ndp fxcore wcfopen src system private servicemodel src system servicemodel channels framingchannels cs 
line end of stack trace from previous location where exception was thrown at system runtime compilerservices taskawaiter throwfornonsuccess task task in e projectk src ndp clr src mscorlib src system runtime compilerservices taskawaiter cs line at system runtime compilerservices taskawaiter handlenonsuccessanddebuggernotification task task in e projectk src ndp clr src mscorlib src system runtime compilerservices taskawaiter cs line at system runtime compilerservices taskawaiter validateend task task in e projectk src ndp clr src mscorlib src system runtime compilerservices taskawaiter cs line at system servicemodel channels clientframingduplexsessionchannel d movenext in e projectk src ndp fxcore wcfopen src system private servicemodel src system servicemodel channels framingchannels cs line end of stack trace from previous location where exception was thrown at system runtime compilerservices taskawaiter throwfornonsuccess task task in e projectk src ndp clr src mscorlib src system runtime compilerservices taskawaiter cs line at system runtime compilerservices taskawaiter handlenonsuccessanddebuggernotification task task in e projectk src ndp clr src mscorlib src system runtime compilerservices taskawaiter cs line at system servicemodel channels connectionpoolhelper d movenext in e projectk src ndp fxcore wcfopen src system private servicemodel src system servicemodel channels connectionpoolhelper cs line end of stack trace from previous location where exception was thrown at system runtime compilerservices taskawaiter throwfornonsuccess task task in e projectk src ndp clr src mscorlib src system runtime compilerservices taskawaiter cs line at system runtime compilerservices taskawaiter handlenonsuccessanddebuggernotification task task in e projectk src ndp clr src mscorlib src system runtime compilerservices taskawaiter cs line at system servicemodel channels clientframingduplexsessionchannel d movenext in e projectk src ndp fxcore wcfopen src system private 
servicemodel src system servicemodel channels framingchannels cs line end of stack trace from previous location where exception was thrown at system runtime compilerservices taskawaiter throwfornonsuccess task task in e projectk src ndp clr src mscorlib src system runtime compilerservices taskawaiter cs line at system runtime compilerservices taskawaiter handlenonsuccessanddebuggernotification task task in e projectk src ndp clr src mscorlib src system runtime compilerservices taskawaiter cs line at system servicemodel channels communicationobject d movenext in e projectk src ndp fxcore wcfopen src system private servicemodel src system servicemodel channels communicationobject cs line end of stack trace from previous location where exception was thrown at system runtime compilerservices taskawaiter throwfornonsuccess task task in e projectk src ndp clr src mscorlib src system runtime compilerservices taskawaiter cs line at system runtime compilerservices taskawaiter handlenonsuccessanddebuggernotification task task in e projectk src ndp clr src mscorlib src system runtime compilerservices taskawaiter cs line at system servicemodel channels servicechannel d movenext in e projectk src ndp fxcore wcfopen src system private servicemodel src system servicemodel channels servicechannel cs line end of stack trace from previous location where exception was thrown at system runtime compilerservices taskawaiter throwfornonsuccess task task in e projectk src ndp clr src mscorlib src system runtime compilerservices taskawaiter cs line at system runtime compilerservices taskawaiter handlenonsuccessanddebuggernotification task task in e projectk src ndp clr src mscorlib src system runtime compilerservices taskawaiter cs line at system servicemodel channels communicationobject d movenext in e projectk src ndp fxcore wcfopen src system private servicemodel src system servicemodel channels communicationobject cs line end of stack trace from previous location where exception was 
thrown at system runtime compilerservices taskawaiter throwfornonsuccess task task in e projectk src ndp clr src mscorlib src system runtime compilerservices taskawaiter cs line at system runtime compilerservices taskawaiter handlenonsuccessanddebuggernotification task task in e projectk src ndp clr src mscorlib src system runtime compilerservices taskawaiter cs line at system servicemodel channels communicationobject d movenext in e projectk src ndp fxcore wcfopen src system private servicemodel src system servicemodel channels communicationobject cs line end of stack trace from previous location where exception was thrown at system runtime compilerservices taskawaiter throwfornonsuccess task task in e projectk src ndp clr src mscorlib src system runtime compilerservices taskawaiter cs line at system runtime compilerservices taskawaiter handlenonsuccessanddebuggernotification task task in e projectk src ndp clr src mscorlib src system runtime compilerservices taskawaiter cs line at system servicemodel channels servicechannel callopenonce system servicemodel channels servicechannel icallonce call servicechannel channel timespan timeout in e projectk src ndp fxcore wcfopen src system private servicemodel src system servicemodel channels servicechannel cs line at system servicemodel channels servicechannel calloncemanager callonce timespan timeout calloncemanager cascade in e projectk src ndp fxcore wcfopen src system private servicemodel src system servicemodel channels servicechannel cs line at system servicemodel channels servicechannel ensureopened timespan timeout in e projectk src ndp fxcore wcfopen src system private servicemodel src system servicemodel channels servicechannel cs line at system servicemodel channels servicechannel call string action boolean oneway proxyoperationruntime operation object ins object outs timespan timeout in e projectk src ndp fxcore wcfopen src system private servicemodel src system servicemodel channels servicechannel cs line at 
system servicemodel channels servicechannelproxy invokeservice methodcall methodcall proxyoperationruntime operation in e projectk src ndp fxcore wcfopen src system private servicemodel src system servicemodel channels servicechannelproxy cs line at system servicemodel channels servicechannelproxy invoke methodinfo targetmethod object args in e projectk src ndp fxcore wcfopen src system private servicemodel src system servicemodel channels servicechannelproxy cs line end of stack trace from previous location where exception was thrown at system reflection dispatchproxygenerator invoke object args in e projectk src ndp fxcore open src system reflection dispatchproxy src system reflection dispatchproxygenerator cs line at generatedproxy echo string at identitytests verifyserviceidentitymatchdnsendpointidentity in e projectk src ndp fxcore wcfopen src system private servicemodel tests scenarios security transportsecurity tcp identitytests cs line at entrypointmain b in e projectk src qa tof pk rel il fx conformance system private servicemodel wcfopen scenarios security transportsecurity security transportsecurity tests security transportsecurity tests main cs line at corefxtestlibrary internal runner runtestmethod testinfo t in e projectk src qa tof tests fx conformance common xunit internal runner cs line at corefxtestlibrary internal runner runtest testinfo t in e projectk src qa tof tests fx conformance common xunit internal runner cs line test failed
0
28,150
8,099,501,793
IssuesEvent
2018-08-11 09:32:27
ElektraInitiative/libelektra
https://api.github.com/repos/ElektraInitiative/libelektra
closed
pluginprocess: homepage build job fails
build low priority
https://build.libelektra.org/jenkins/job/elektra-homepage/lastFailedBuild/console ```sh Start 18: testlib_pluginprocess 17/111 Test #18: testlib_pluginprocess ............***Failed 0.19 sec test communication /home/jenkins/workspace/elektra-homepage/src/libs/pluginprocess/tests/testlib_pluginprocess.c:121: error in test_communication: call to kdbOpen was not successful /home/jenkins/workspace/elektra-homepage/src/libs/pluginprocess/tests/testlib_pluginprocess.c:123: error in test_communication: didn't store the pluginprocess struct in the plugin's data There are 1 warnings buffer is: warnings/#00 number: 190 description: Failed to initialize the pluginprocess library for a plugin ingroup: plugin module: pluginprocess file: /home/jenkins/workspace/elektra-homepage/src/libs/pluginprocess/pluginprocess.c line: 489 reason: Failed to initialize the pipenames and the dump plugin reason: user/tests/pluginprocess reason: number: 190 description: : Failed to initialize the pluginprocess library for a plugin ingroup: : plugin module: : pluginprocess at: /home/jenkins/workspace/elektra-homepage/src/libs/pluginprocess/pluginprocess.c:406 reason: : Failed to generate a temporary directory, mkdtemp returned errno 22 mountpoint: : user/tests/pluginprocess configfile: : test emptyKeySet /home/jenkins/workspace/elektra-homepage/src/libs/pluginprocess/tests/testlib_pluginprocess.c:175: error in test_emptyKeySet: call to kdbOpen was not successful There are 1 warnings buffer is: warnings/#00 number: 190 description: Failed to initialize the pluginprocess library for a plugin ingroup: plugin module: pluginprocess file: /home/jenkins/workspace/elektra-homepage/src/libs/pluginprocess/pluginprocess.c line: 489 reason: Failed to initialize the pipenames and the dump plugin reason: user/tests/pluginprocess reason: number: 190 description: : Failed to initialize the pluginprocess library for a plugin ingroup: : plugin module: : pluginprocess at: 
/home/jenkins/workspace/elektra-homepage/src/libs/pluginprocess/pluginprocess.c:406 reason: : Failed to generate a temporary directory, mkdtemp returned errno 22 mountpoint: : user/tests/pluginprocess configfile: : test reservedParentKeyName /home/jenkins/workspace/elektra-homepage/src/libs/pluginprocess/tests/testlib_pluginprocess.c:213: error in test_reservedParentKeyName: call to kdbOpen was not successful There are 1 warnings buffer is: warnings/#00 number: 190 description: Failed to initialize the pluginprocess library for a plugin ingroup: plugin module: pluginprocess file: /home/jenkins/workspace/elektra-homepage/src/libs/pluginprocess/pluginprocess.c line: 489 reason: Failed to initialize the pipenames and the dump plugin reason: /pluginprocess/parent/name reason: invalid number: 190 description: : Failed to initialize the pluginprocess library for a plugin ingroup: : plugin module: : pluginprocess at: /home/jenkins/workspace/elektra-homepage/src/libs/pluginprocess/pluginprocess.c:406 reason: : Failed to generate a temporary directory, mkdtemp returned errno 22 mountpoint: : /pluginprocess/parent/name configfile: : invalid test keysetContainingParentKey /home/jenkins/workspace/elektra-homepage/src/libs/pluginprocess/tests/testlib_pluginprocess.c:242: error in test_keysetContainingParentKey: call to kdbOpen was not successful There are 1 warnings buffer is: warnings/#00 number: 190 description: Failed to initialize the pluginprocess library for a plugin ingroup: plugin module: pluginprocess file: /home/jenkins/workspace/elektra-homepage/src/libs/pluginprocess/pluginprocess.c line: 489 reason: Failed to initialize the pipenames and the dump plugin reason: user/tests/pluginprocess reason: number: 190 description: : Failed to initialize the pluginprocess library for a plugin ingroup: : plugin module: : pluginprocess at: /home/jenkins/workspace/elektra-homepage/src/libs/pluginprocess/pluginprocess.c:406 reason: : Failed to generate a temporary directory, mkdtemp 
returned errno 22 mountpoint: : user/tests/pluginprocess configfile: : test closeWithoutOpen /home/jenkins/workspace/elektra-homepage/src/libs/pluginprocess/tests/testlib_pluginprocess.c:369: error in test_closeWithoutOpen: call to kdbOpen was not successful There are 1 warnings buffer is: warnings/#00 number: 190 description: Failed to initialize the pluginprocess library for a plugin ingroup: plugin module: pluginprocess file: /home/jenkins/workspace/elektra-homepage/src/libs/pluginprocess/pluginprocess.c line: 489 reason: Failed to initialize the pipenames and the dump plugin reason: user/tests/pluginprocess reason: number: 190 description: : Failed to initialize the pluginprocess library for a plugin ingroup: : plugin module: : pluginprocess at: /home/jenkins/workspace/elektra-homepage/src/libs/pluginprocess/pluginprocess.c:406 reason: : Failed to generate a temporary directory, mkdtemp returned errno 22 mountpoint: : user/tests/pluginprocess configfile: : test childAddingParentKey /home/jenkins/workspace/elektra-homepage/src/libs/pluginprocess/tests/testlib_pluginprocess.c:300: error in test_childAddingParentKey: call to kdbOpen was not successful /home/jenkins/workspace/elektra-homepage/src/libs/pluginprocess/tests/testlib_pluginprocess.c:324: error in test_childAddingParentKey: parent key got removed from the keyset by pluginprocess There are 1 warnings buffer is: warnings/#00 number: 190 description: Failed to initialize the pluginprocess library for a plugin ingroup: plugin module: pluginprocess file: /home/jenkins/workspace/elektra-homepage/src/libs/pluginprocess/pluginprocess.c line: 489 reason: Failed to initialize the pipenames and the dump plugin reason: user/tests/pluginprocess reason: number: 190 description: : Failed to initialize the pluginprocess library for a plugin ingroup: : plugin module: : pluginprocess at: /home/jenkins/workspace/elektra-homepage/src/libs/pluginprocess/pluginprocess.c:406 reason: : Failed to generate a temporary directory, 
mkdtemp returned errno 22 mountpoint: : user/tests/pluginprocess configfile: : pluginprocess Results: 21 Tests done — 8 errors. ``` ## System Information - Elektra Version: master
1.0
pluginprocess: homepage build job fails - https://build.libelektra.org/jenkins/job/elektra-homepage/lastFailedBuild/console ```sh Start 18: testlib_pluginprocess 17/111 Test #18: testlib_pluginprocess ............***Failed 0.19 sec test communication /home/jenkins/workspace/elektra-homepage/src/libs/pluginprocess/tests/testlib_pluginprocess.c:121: error in test_communication: call to kdbOpen was not successful /home/jenkins/workspace/elektra-homepage/src/libs/pluginprocess/tests/testlib_pluginprocess.c:123: error in test_communication: didn't store the pluginprocess struct in the plugin's data There are 1 warnings buffer is: warnings/#00 number: 190 description: Failed to initialize the pluginprocess library for a plugin ingroup: plugin module: pluginprocess file: /home/jenkins/workspace/elektra-homepage/src/libs/pluginprocess/pluginprocess.c line: 489 reason: Failed to initialize the pipenames and the dump plugin reason: user/tests/pluginprocess reason: number: 190 description: : Failed to initialize the pluginprocess library for a plugin ingroup: : plugin module: : pluginprocess at: /home/jenkins/workspace/elektra-homepage/src/libs/pluginprocess/pluginprocess.c:406 reason: : Failed to generate a temporary directory, mkdtemp returned errno 22 mountpoint: : user/tests/pluginprocess configfile: : test emptyKeySet /home/jenkins/workspace/elektra-homepage/src/libs/pluginprocess/tests/testlib_pluginprocess.c:175: error in test_emptyKeySet: call to kdbOpen was not successful There are 1 warnings buffer is: warnings/#00 number: 190 description: Failed to initialize the pluginprocess library for a plugin ingroup: plugin module: pluginprocess file: /home/jenkins/workspace/elektra-homepage/src/libs/pluginprocess/pluginprocess.c line: 489 reason: Failed to initialize the pipenames and the dump plugin reason: user/tests/pluginprocess reason: number: 190 description: : Failed to initialize the pluginprocess library for a plugin ingroup: : plugin module: : pluginprocess at: 
/home/jenkins/workspace/elektra-homepage/src/libs/pluginprocess/pluginprocess.c:406 reason: : Failed to generate a temporary directory, mkdtemp returned errno 22 mountpoint: : user/tests/pluginprocess configfile: : test reservedParentKeyName /home/jenkins/workspace/elektra-homepage/src/libs/pluginprocess/tests/testlib_pluginprocess.c:213: error in test_reservedParentKeyName: call to kdbOpen was not successful There are 1 warnings buffer is: warnings/#00 number: 190 description: Failed to initialize the pluginprocess library for a plugin ingroup: plugin module: pluginprocess file: /home/jenkins/workspace/elektra-homepage/src/libs/pluginprocess/pluginprocess.c line: 489 reason: Failed to initialize the pipenames and the dump plugin reason: /pluginprocess/parent/name reason: invalid number: 190 description: : Failed to initialize the pluginprocess library for a plugin ingroup: : plugin module: : pluginprocess at: /home/jenkins/workspace/elektra-homepage/src/libs/pluginprocess/pluginprocess.c:406 reason: : Failed to generate a temporary directory, mkdtemp returned errno 22 mountpoint: : /pluginprocess/parent/name configfile: : invalid test keysetContainingParentKey /home/jenkins/workspace/elektra-homepage/src/libs/pluginprocess/tests/testlib_pluginprocess.c:242: error in test_keysetContainingParentKey: call to kdbOpen was not successful There are 1 warnings buffer is: warnings/#00 number: 190 description: Failed to initialize the pluginprocess library for a plugin ingroup: plugin module: pluginprocess file: /home/jenkins/workspace/elektra-homepage/src/libs/pluginprocess/pluginprocess.c line: 489 reason: Failed to initialize the pipenames and the dump plugin reason: user/tests/pluginprocess reason: number: 190 description: : Failed to initialize the pluginprocess library for a plugin ingroup: : plugin module: : pluginprocess at: /home/jenkins/workspace/elektra-homepage/src/libs/pluginprocess/pluginprocess.c:406 reason: : Failed to generate a temporary directory, mkdtemp 
returned errno 22 mountpoint: : user/tests/pluginprocess configfile: : test closeWithoutOpen /home/jenkins/workspace/elektra-homepage/src/libs/pluginprocess/tests/testlib_pluginprocess.c:369: error in test_closeWithoutOpen: call to kdbOpen was not successful There are 1 warnings buffer is: warnings/#00 number: 190 description: Failed to initialize the pluginprocess library for a plugin ingroup: plugin module: pluginprocess file: /home/jenkins/workspace/elektra-homepage/src/libs/pluginprocess/pluginprocess.c line: 489 reason: Failed to initialize the pipenames and the dump plugin reason: user/tests/pluginprocess reason: number: 190 description: : Failed to initialize the pluginprocess library for a plugin ingroup: : plugin module: : pluginprocess at: /home/jenkins/workspace/elektra-homepage/src/libs/pluginprocess/pluginprocess.c:406 reason: : Failed to generate a temporary directory, mkdtemp returned errno 22 mountpoint: : user/tests/pluginprocess configfile: : test childAddingParentKey /home/jenkins/workspace/elektra-homepage/src/libs/pluginprocess/tests/testlib_pluginprocess.c:300: error in test_childAddingParentKey: call to kdbOpen was not successful /home/jenkins/workspace/elektra-homepage/src/libs/pluginprocess/tests/testlib_pluginprocess.c:324: error in test_childAddingParentKey: parent key got removed from the keyset by pluginprocess There are 1 warnings buffer is: warnings/#00 number: 190 description: Failed to initialize the pluginprocess library for a plugin ingroup: plugin module: pluginprocess file: /home/jenkins/workspace/elektra-homepage/src/libs/pluginprocess/pluginprocess.c line: 489 reason: Failed to initialize the pipenames and the dump plugin reason: user/tests/pluginprocess reason: number: 190 description: : Failed to initialize the pluginprocess library for a plugin ingroup: : plugin module: : pluginprocess at: /home/jenkins/workspace/elektra-homepage/src/libs/pluginprocess/pluginprocess.c:406 reason: : Failed to generate a temporary directory, 
mkdtemp returned errno 22 mountpoint: : user/tests/pluginprocess configfile: : pluginprocess Results: 21 Tests done — 8 errors. ``` ## System Information - Elektra Version: master
non_process
pluginprocess homepage build job fails sh start testlib pluginprocess test testlib pluginprocess failed sec test communication home jenkins workspace elektra homepage src libs pluginprocess tests testlib pluginprocess c error in test communication call to kdbopen was not successful home jenkins workspace elektra homepage src libs pluginprocess tests testlib pluginprocess c error in test communication didn t store the pluginprocess struct in the plugin s data there are warnings buffer is warnings number description failed to initialize the pluginprocess library for a plugin ingroup plugin module pluginprocess file home jenkins workspace elektra homepage src libs pluginprocess pluginprocess c line reason failed to initialize the pipenames and the dump plugin reason user tests pluginprocess reason number description failed to initialize the pluginprocess library for a plugin ingroup plugin module pluginprocess at home jenkins workspace elektra homepage src libs pluginprocess pluginprocess c reason failed to generate a temporary directory mkdtemp returned errno mountpoint user tests pluginprocess configfile test emptykeyset home jenkins workspace elektra homepage src libs pluginprocess tests testlib pluginprocess c error in test emptykeyset call to kdbopen was not successful there are warnings buffer is warnings number description failed to initialize the pluginprocess library for a plugin ingroup plugin module pluginprocess file home jenkins workspace elektra homepage src libs pluginprocess pluginprocess c line reason failed to initialize the pipenames and the dump plugin reason user tests pluginprocess reason number description failed to initialize the pluginprocess library for a plugin ingroup plugin module pluginprocess at home jenkins workspace elektra homepage src libs pluginprocess pluginprocess c reason failed to generate a temporary directory mkdtemp returned errno mountpoint user tests pluginprocess configfile test reservedparentkeyname home jenkins workspace 
elektra homepage src libs pluginprocess tests testlib pluginprocess c error in test reservedparentkeyname call to kdbopen was not successful there are warnings buffer is warnings number description failed to initialize the pluginprocess library for a plugin ingroup plugin module pluginprocess file home jenkins workspace elektra homepage src libs pluginprocess pluginprocess c line reason failed to initialize the pipenames and the dump plugin reason pluginprocess parent name reason invalid number description failed to initialize the pluginprocess library for a plugin ingroup plugin module pluginprocess at home jenkins workspace elektra homepage src libs pluginprocess pluginprocess c reason failed to generate a temporary directory mkdtemp returned errno mountpoint pluginprocess parent name configfile invalid test keysetcontainingparentkey home jenkins workspace elektra homepage src libs pluginprocess tests testlib pluginprocess c error in test keysetcontainingparentkey call to kdbopen was not successful there are warnings buffer is warnings number description failed to initialize the pluginprocess library for a plugin ingroup plugin module pluginprocess file home jenkins workspace elektra homepage src libs pluginprocess pluginprocess c line reason failed to initialize the pipenames and the dump plugin reason user tests pluginprocess reason number description failed to initialize the pluginprocess library for a plugin ingroup plugin module pluginprocess at home jenkins workspace elektra homepage src libs pluginprocess pluginprocess c reason failed to generate a temporary directory mkdtemp returned errno mountpoint user tests pluginprocess configfile test closewithoutopen home jenkins workspace elektra homepage src libs pluginprocess tests testlib pluginprocess c error in test closewithoutopen call to kdbopen was not successful there are warnings buffer is warnings number description failed to initialize the pluginprocess library for a plugin ingroup plugin module 
pluginprocess file home jenkins workspace elektra homepage src libs pluginprocess pluginprocess c line reason failed to initialize the pipenames and the dump plugin reason user tests pluginprocess reason number description failed to initialize the pluginprocess library for a plugin ingroup plugin module pluginprocess at home jenkins workspace elektra homepage src libs pluginprocess pluginprocess c reason failed to generate a temporary directory mkdtemp returned errno mountpoint user tests pluginprocess configfile test childaddingparentkey home jenkins workspace elektra homepage src libs pluginprocess tests testlib pluginprocess c error in test childaddingparentkey call to kdbopen was not successful home jenkins workspace elektra homepage src libs pluginprocess tests testlib pluginprocess c error in test childaddingparentkey parent key got removed from the keyset by pluginprocess there are warnings buffer is warnings number description failed to initialize the pluginprocess library for a plugin ingroup plugin module pluginprocess file home jenkins workspace elektra homepage src libs pluginprocess pluginprocess c line reason failed to initialize the pipenames and the dump plugin reason user tests pluginprocess reason number description failed to initialize the pluginprocess library for a plugin ingroup plugin module pluginprocess at home jenkins workspace elektra homepage src libs pluginprocess pluginprocess c reason failed to generate a temporary directory mkdtemp returned errno mountpoint user tests pluginprocess configfile pluginprocess results tests done — errors system information elektra version master
0
50,898
12,601,913,215
IssuesEvent
2020-06-11 10:43:15
googleapis/nodejs-spanner
https://api.github.com/repos/googleapis/nodejs-spanner
closed
Mocha Tests: should query an example table with a non-null timestamp column and return matching rows failed
api: spanner buildcop: issue priority: p1 type: bug
Note: #893 was also for this test, but it was closed more than 10 days ago. So, I didn't mark it flaky. ---- commit: 8973cbcc9158b37650af2edb6015c575da1cc3ec buildURL: [Build Status](https://source.cloud.google.com/results/invocations/8ddd9032-02b7-43d0-9fd1-2d1e7b9a7841), [Sponge](http://sponge2/8ddd9032-02b7-43d0-9fd1-2d1e7b9a7841) status: failed <details><summary>Test output</summary><br><pre>expected '' to match /SingerId: 1, VenueId: 4, EventDate:/ AssertionError: expected '' to match /SingerId: 1, VenueId: 4, EventDate:/ at Context.it (system-test/spanner.test.js:473:12)</pre></details>
1.0
Mocha Tests: should query an example table with a non-null timestamp column and return matching rows failed - Note: #893 was also for this test, but it was closed more than 10 days ago. So, I didn't mark it flaky. ---- commit: 8973cbcc9158b37650af2edb6015c575da1cc3ec buildURL: [Build Status](https://source.cloud.google.com/results/invocations/8ddd9032-02b7-43d0-9fd1-2d1e7b9a7841), [Sponge](http://sponge2/8ddd9032-02b7-43d0-9fd1-2d1e7b9a7841) status: failed <details><summary>Test output</summary><br><pre>expected '' to match /SingerId: 1, VenueId: 4, EventDate:/ AssertionError: expected '' to match /SingerId: 1, VenueId: 4, EventDate:/ at Context.it (system-test/spanner.test.js:473:12)</pre></details>
non_process
mocha tests should query an example table with a non null timestamp column and return matching rows failed note was also for this test but it was closed more than days ago so i didn t mark it flaky commit buildurl status failed test output expected to match singerid venueid eventdate assertionerror expected to match singerid venueid eventdate at context it system test spanner test js
0
1,340
3,900,504,780
IssuesEvent
2016-04-18 06:30:49
DynareTeam/dynare
https://api.github.com/repos/DynareTeam/dynare
closed
auxiliary variables in steady state and static model
bug preprocessor
Currently, in static model and set_auxiliary_variables.m, auxiliary variables may appear in the RHS of equations. This is problematic for two reasons: 1. assignments may not be recursive in set_auxiliary_variables.m, which leads to errors 2. <fname>_static.m may not be solved independently from set_auxiliary_variables.m In the static deterministic model, any auxiliary variable, except Lagrange multipliers for Ramsey policy, can be replaced with an expression using only original variables. In the preprocessor, I want to rewrite <fname>_static.m and <fname>_set_auxiliary_variables.m along these lines. I expect these changes to solve #1119 and #633
1.0
auxiliary variables in steady state and static model - Currently, in static model and set_auxiliary_variables.m, auxiliary variables may appear in the RHS of equations. This is problematic for two reasons: 1. assignments may not be recurice in set_auxiliary_variables.m which leads to errors 2. <fname>_static.m may not be solved independently from set_auxiliary_variables.m In the static deterministic model, any auxiliary variable, except Lagrange multipliers for Ramsey policy, can be replaced with an expression using only original variables. In the preprocessor, I want to rewrite <fname>_static.m and <fname>_set_auxiliary_variables.m along these lines. I expect this changes to solve #1119 and #633
process
auxiliary variables in steady state and static model currently in static model and set auxiliary variables m auxiliary variables may appear in the rhs of equations this is problematic for two reasons assignments may not be recurice in set auxiliary variables m which leads to errors static m may not be solved independently from set auxiliary variables m in the static deterministic model any auxiliary variable except lagrange multipliers for ramsey policy can be replaced with an expression using only original variables in the preprocessor i want to rewrite static m and set auxiliary variables m along these lines i expect this changes to solve and
1
146,558
13,185,225,199
IssuesEvent
2020-08-12 20:58:28
icecube-trac/tix3
https://api.github.com/repos/icecube-trac/tix3
opened
doc cleanups (Trac #723)
Incomplete Migration Migrated from Trac cleanup documentation
<details> <summary><em>Migrated from https://code.icecube.wisc.edu/ticket/723 , reported by david.schultz and owned by </em></summary> <p> ```json { "status": "closed", "changetime": "2015-02-11T21:29:02", "description": "people have noticed that the platform documentation is getting out of date. maybe clean this up for v5?", "reporter": "david.schultz", "cc": "", "resolution": "fixed", "_ts": "1423690142510740", "component": "documentation", "summary": "doc cleanups", "priority": "minor", "keywords": "", "time": "2014-04-11T15:28:32", "milestone": "", "owner": "", "type": "cleanup" } ``` </p> </details>
1.0
doc cleanups (Trac #723) - <details> <summary><em>Migrated from https://code.icecube.wisc.edu/ticket/723 , reported by david.schultz and owned by </em></summary> <p> ```json { "status": "closed", "changetime": "2015-02-11T21:29:02", "description": "people have noticed that the platform documentation is getting out of date. maybe clean this up for v5?", "reporter": "david.schultz", "cc": "", "resolution": "fixed", "_ts": "1423690142510740", "component": "documentation", "summary": "doc cleanups", "priority": "minor", "keywords": "", "time": "2014-04-11T15:28:32", "milestone": "", "owner": "", "type": "cleanup" } ``` </p> </details>
non_process
doc cleanups trac migrated from reported by david schultz and owned by json status closed changetime description people have noticed that the platform documentation is getting out of date maybe clean this up for reporter david schultz cc resolution fixed ts component documentation summary doc cleanups priority minor keywords time milestone owner type cleanup
0
238,284
18,237,639,984
IssuesEvent
2021-10-01 08:59:28
priyanshbalyan/discord-file-storage
https://api.github.com/repos/priyanshbalyan/discord-file-storage
opened
Add meaningful documentation
documentation good first issue Hacktoberfest
Add comment `#Add your discord bot token here` just above the `getSizeFormat` function in file `fs.py` Claim this issue first by commenting here to prevent multiple people working on the same issue. Raise a PR with the description `Fixes #3` Step for completing the task: 1. Fork the repository 2. Make you necessary changes in `fs.py` 3. Submit a PR Leave a star if you like the project!
1.0
Add meaningful documentation - Add comment `#Add your discord bot token here` just above the `getSizeFormat` function in file `fs.py` Claim this issue first by commenting here to prevent multiple people working on the same issue. Raise a PR with the description `Fixes #3` Step for completing the task: 1. Fork the repository 2. Make you necessary changes in `fs.py` 3. Submit a PR Leave a star if you like the project!
non_process
add meaningful documentation add comment add your discord bot token here just above the getsizeformat function in file fs py claim this issue first by commenting here to prevent multiple people working on the same issue raise a pr with the description fixes step for completing the task fork the repository make you necessary changes in fs py submit a pr leave a star if you like the project
0
764,794
26,817,618,545
IssuesEvent
2023-02-02 06:43:17
ArcanePlugins/Treasury
https://api.github.com/repos/ArcanePlugins/Treasury
closed
Expanding #246
type: improvement priority: normal status: confirmed core team task
I realised I missed a few things in #247 to resolve #246. - The reason we had `getStartingBalance` implemented inside `Currency` is because we want whoever is registering the `Currency` to provide the logic for giving the starting balance for a particular `Account`. - However, the change we made for making `getStartingBalance` accept an `Account` rather than a player `UUID` is still good. - So all we need to do is move the new `getStartingBalance` method back into `Currency`. - And we forgot to do something: make the `decimal` character mapped to a `Locale`.
1.0
Expanding #246 - I realised I missed a few things in #247 to resolve #246. - The reason we had `getStartingBalance` implemented inside `Currency` is because we want whoever is registering the `Currency` to provide the logic for giving the starting balance for a particular `Account`. - However, the change we made for making `getStartingBalance` accept an `Account` rather than a player `UUID` is still good. - So all we need to do is move the new `getStartingBalance` method back into `Currency`. - And we forgot to do something: make the `decimal` character mapped to a `Locale`.
non_process
expanding i realised i missed a few things in to resolve the reason we had getstartingbalance implemented inside currency is because we want whoever is registering the currency to provide the logic for giving the starting balance for a particular account however the change we made for making getstartingbalance accept an account rather than a player uuid is still good so all we need to do is move the new getstartingbalance method back into currency and we forgot to do something make the decimal character mapped to a locale
0
31,185
7,328,074,347
IssuesEvent
2018-03-04 17:10:38
jpromerob/HL2019_Project
https://api.github.com/repos/jpromerob/HL2019_Project
closed
Create Appendix for MLEM results
Code
Based on Appendix A. We should have Appendix B showing how synthetic sinograms can be reconstructed by using MLEM. For that: - automatic Figure generation must be implemented for MLEM (using saveFigure.m). - the *.tex file must be updated **(already done)** The idea is to have something like this: ![image](https://user-images.githubusercontent.com/34107797/36936320-2d1b3d9a-1f04-11e8-9ba4-05981af87629.png)
1.0
Create Appendix for MLEM results - Based on Appendix A. We should have Appendix B showing how synthetic sinograms can be reconstructed by using MLEM. For that: - automatic Figure generation must be implemented for MLEM (using saveFigure.m). - the *.tex file must be updated **(already done)** The idea is to have something like this: ![image](https://user-images.githubusercontent.com/34107797/36936320-2d1b3d9a-1f04-11e8-9ba4-05981af87629.png)
non_process
create appendix for mlem results based on appendix a we should have appendix b showing how synthetic sinograms can be reconstructed by using mlem for that automatic figure generation must be implemented for mlem using savefigure m the tex file must be updated already done the idea is to have something like this
0
22,225
30,773,305,404
IssuesEvent
2023-07-31 03:03:47
evan-bradley/opentelemetry-collector-contrib
https://api.github.com/repos/evan-bradley/opentelemetry-collector-contrib
closed
[exporter/doesntexist] test issue
pkg/doesntexist exporter/doesntexist needs triage processor/doesntexist
### Component(s) pkg/doesntexist, processor/doesntexist ### Describe the issue you're reporting test
1.0
[exporter/doesntexist] test issue - ### Component(s) pkg/doesntexist, processor/doesntexist ### Describe the issue you're reporting test
process
test issue component s pkg doesntexist processor doesntexist describe the issue you re reporting test
1
705,413
24,233,614,878
IssuesEvent
2022-09-26 20:38:04
status-im/status-desktop
https://api.github.com/repos/status-im/status-desktop
closed
[User Profile] ens name issues
ui priority 1: high
### Description User profile, when it has ENS it should be `@NAME.eth` - Should also not render identicon ring
1.0
[User Profile] ens name issues - ### Description User profile, when it has ENS it should be `@NAME.eth` - Should also not render identicon ring
non_process
ens name issues description user profile when it has ens it should be name eth should also not render identicon ring
0
434
2,866,160,313
IssuesEvent
2015-06-05 04:19:37
mobeets/medialog
https://api.github.com/repos/mobeets/medialog
closed
comments with 'tags' or 'reviews' are the only visible ones
preprocessing
That way, if I do decide to write a sort of "final" review for a book, I can mark that in some way, while still letting previous posts have the sort of half-assed reaction-comments, if there isn't a final polished review.
1.0
comments with 'tags' or 'reviews' are the only visible ones - That way, if I do decide to write a sort of "final" review for a book, I can mark that in some way, while still letting previous posts have the sort of half-assed reaction-comments, if there isn't a final polished review.
process
comments with tags or reviews are the only visible ones that way if i do decide to write a sort of final review for a book i can mark that in some way while still letting previous posts have the sort of half assed reaction comments if there isn t a final polished review
1
18,264
24,344,978,239
IssuesEvent
2022-10-02 07:18:43
MicrosoftDocs/azure-docs
https://api.github.com/repos/MicrosoftDocs/azure-docs
closed
Grammar in Title
automation/svc triaged cxp doc-enhancement process-automation/subsvc Pri2
The title "Send an email from **am** Automation runbook" should be changed to "Send an email from **an** Automation runbook" --- #### Document Details ⚠ *Do not edit this section. It is required for docs.microsoft.com ➟ GitHub issue linking.* * ID: f3d18144-dd03-c659-f0a4-c0ab13d3d343 * Version Independent ID: ecdcbda2-259f-4992-57d8-c448a3934011 * Content: [Send an email from an Azure Automation runbook](https://docs.microsoft.com/en-us/azure/automation/automation-send-email?source=docs) * Content Source: [articles/automation/automation-send-email.md](https://github.com/MicrosoftDocs/azure-docs/blob/main/articles/automation/automation-send-email.md) * Service: **automation** * Sub-service: **process-automation** * GitHub Login: @SnehaSudhirG * Microsoft Alias: **sudhirsneha**
1.0
Grammar in Title - The title "Send an email from **am** Automation runbook" should be changed to "Send an email from **an** Automation runbook" --- #### Document Details ⚠ *Do not edit this section. It is required for docs.microsoft.com ➟ GitHub issue linking.* * ID: f3d18144-dd03-c659-f0a4-c0ab13d3d343 * Version Independent ID: ecdcbda2-259f-4992-57d8-c448a3934011 * Content: [Send an email from an Azure Automation runbook](https://docs.microsoft.com/en-us/azure/automation/automation-send-email?source=docs) * Content Source: [articles/automation/automation-send-email.md](https://github.com/MicrosoftDocs/azure-docs/blob/main/articles/automation/automation-send-email.md) * Service: **automation** * Sub-service: **process-automation** * GitHub Login: @SnehaSudhirG * Microsoft Alias: **sudhirsneha**
process
grammar in title the title send an email from am automation runbook should be changed to send an email from an automation runbook document details ⚠ do not edit this section it is required for docs microsoft com ➟ github issue linking id version independent id content content source service automation sub service process automation github login snehasudhirg microsoft alias sudhirsneha
1
17,472
5,424,653,560
IssuesEvent
2017-03-03 02:04:45
garlicbulb-puzhuo/ggo
https://api.github.com/repos/garlicbulb-puzhuo/ggo
closed
Remove old branches
code enhancement
As soon as possible... @garlicbulb-puzhuo @garlicbulbxian @oakcreek We're all assignees for this one. Let's do this soon.
1.0
Remove old branches - As soon as possible... @garlicbulb-puzhuo @garlicbulbxian @oakcreek We're all assignees for this one. Let's do this soon.
non_process
remove old branches as soon as possible garlicbulb puzhuo garlicbulbxian oakcreek we re all assignees for this one let s do this soon
0
279
2,715,297,769
IssuesEvent
2015-04-10 12:06:51
cfpb/hmda-viz-prototype
https://api.github.com/repos/cfpb/hmda-viz-prototype
opened
8's - Race instead of Races
Processing
Small one - `"characteristic": "Races", ` should be `"characteristic": "Race", ` (no 's').
1.0
8's - Race instead of Races - Small one - `"characteristic": "Races", ` should be `"characteristic": "Race", ` (no 's').
process
s race instead of races small one characteristic races should be characteristic race no s
1
6,101
2,610,221,184
IssuesEvent
2015-02-26 19:10:11
chrsmith/somefinders
https://api.github.com/repos/chrsmith/somefinders
opened
мамина песенка михаил пляцковский mp3
auto-migrated Priority-Medium Type-Defect
``` '''Бертольд Пономарёв''' Привет всем не подскажите где можно найти .мамина песенка михаил пляцковский mp3. как то выкладывали уже '''Винцент Кабанов''' Качай тут http://bit.ly/1dUbk9J '''Адонис Русаков''' Просит ввести номер мобилы!Не опасно ли это? '''Герасим Смирнов''' Не это не влияет на баланс '''Вильгельм Шубин''' Не это не влияет на баланс Информация о файле: мамина песенка михаил пляцковский mp3 Загружен: В этом месяце Скачан раз: 1133 Рейтинг: 1300 Средняя скорость скачивания: 149 Похожих файлов: 38 ``` ----- Original issue reported on code.google.com by `kondense...@gmail.com` on 17 Dec 2013 at 9:46
1.0
мамина песенка михаил пляцковский mp3 - ``` '''Бертольд Пономарёв''' Привет всем не подскажите где можно найти .мамина песенка михаил пляцковский mp3. как то выкладывали уже '''Винцент Кабанов''' Качай тут http://bit.ly/1dUbk9J '''Адонис Русаков''' Просит ввести номер мобилы!Не опасно ли это? '''Герасим Смирнов''' Не это не влияет на баланс '''Вильгельм Шубин''' Не это не влияет на баланс Информация о файле: мамина песенка михаил пляцковский mp3 Загружен: В этом месяце Скачан раз: 1133 Рейтинг: 1300 Средняя скорость скачивания: 149 Похожих файлов: 38 ``` ----- Original issue reported on code.google.com by `kondense...@gmail.com` on 17 Dec 2013 at 9:46
non_process
мамина песенка михаил пляцковский бертольд пономарёв привет всем не подскажите где можно найти мамина песенка михаил пляцковский как то выкладывали уже винцент кабанов качай тут адонис русаков просит ввести номер мобилы не опасно ли это герасим смирнов не это не влияет на баланс вильгельм шубин не это не влияет на баланс информация о файле мамина песенка михаил пляцковский загружен в этом месяце скачан раз рейтинг средняя скорость скачивания похожих файлов original issue reported on code google com by kondense gmail com on dec at
0
168,147
20,742,252,497
IssuesEvent
2022-03-14 18:52:58
snowflakedb/snowflake-jdbc
https://api.github.com/repos/snowflakedb/snowflake-jdbc
closed
SNOW-558919: CVE-2020-36183 (High) detected in jackson-databind-2.9.8.jar - autoclosed
security vulnerability
## CVE-2020-36183 - High Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>jackson-databind-2.9.8.jar</b></p></summary> <p>General data-binding functionality for Jackson: works on core streaming API</p> <p>Library home page: <a href="http://github.com/FasterXML/jackson">http://github.com/FasterXML/jackson</a></p> <p>Path to dependency file: /tmp/ws-ua_20220312003410_KQQKCS/archiveExtraction_BELGGC/FUIDAN/20220312003410/snowflake-jdbc_depth_0/dependencies/arrow-vector-0.15.1/META-INF/maven/org.apache.arrow/arrow-vector/pom.xml</p> <p>Path to vulnerable library: /sitory/com/fasterxml/jackson/core/jackson-databind/2.9.8/jackson-databind-2.9.8.jar</p> <p> Dependency Hierarchy: - :x: **jackson-databind-2.9.8.jar** (Vulnerable Library) <p>Found in base branch: <b>master</b></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary> <p> FasterXML jackson-databind 2.x before 2.9.10.8 mishandles the interaction between serialization gadgets and typing, related to org.docx4j.org.apache.xalan.lib.sql.JNDIConnectionPool. <p>Publish Date: 2021-01-07 <p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-36183>CVE-2020-36183</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>8.1</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: High - Privileges Required: None - User Interaction: None - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: High - Integrity Impact: High - Availability Impact: High </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://github.com/FasterXML/jackson-databind/issues/3003">https://github.com/FasterXML/jackson-databind/issues/3003</a></p> <p>Release Date: 2021-01-07</p> <p>Fix Resolution: com.fasterxml.jackson.core:jackson-databind:2.9.10.8</p> </p> </details> <p></p> *** :rescue_worker_helmet: Automatic Remediation is available for this issue <!-- <REMEDIATE>{"isOpenPROnVulnerability":true,"isPackageBased":true,"isDefaultBranch":true,"packages":[{"packageType":"Java","groupId":"com.fasterxml.jackson.core","packageName":"jackson-databind","packageVersion":"2.9.8","packageFilePaths":["/tmp/ws-ua_20220312003410_KQQKCS/archiveExtraction_BELGGC/FUIDAN/20220312003410/snowflake-jdbc_depth_0/dependencies/arrow-vector-0.15.1/META-INF/maven/org.apache.arrow/arrow-vector/pom.xml"],"isTransitiveDependency":false,"dependencyTree":"com.fasterxml.jackson.core:jackson-databind:2.9.8","isMinimumFixVersionAvailable":true,"minimumFixVersion":"com.fasterxml.jackson.core:jackson-databind:2.9.10.8","isBinary":false}],"baseBranches":["master"],"vulnerabilityIdentifier":"CVE-2020-36183","vulnerabilityDetails":"FasterXML jackson-databind 2.x before 2.9.10.8 mishandles the interaction between serialization gadgets and typing, related to org.docx4j.org.apache.xalan.lib.sql.JNDIConnectionPool.","vulnerabilityUrl":"https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-36183","cvss3Severity":"high","cvss3Score":"8.1","cvss3Metrics":{"A":"High","AC":"High","PR":"None","S":"Unchanged","C":"High","UI":"None","AV":"Network","I":"High"},"extraData":{}}</REMEDIATE> -->
True
SNOW-558919: CVE-2020-36183 (High) detected in jackson-databind-2.9.8.jar - autoclosed - ## CVE-2020-36183 - High Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>jackson-databind-2.9.8.jar</b></p></summary> <p>General data-binding functionality for Jackson: works on core streaming API</p> <p>Library home page: <a href="http://github.com/FasterXML/jackson">http://github.com/FasterXML/jackson</a></p> <p>Path to dependency file: /tmp/ws-ua_20220312003410_KQQKCS/archiveExtraction_BELGGC/FUIDAN/20220312003410/snowflake-jdbc_depth_0/dependencies/arrow-vector-0.15.1/META-INF/maven/org.apache.arrow/arrow-vector/pom.xml</p> <p>Path to vulnerable library: /sitory/com/fasterxml/jackson/core/jackson-databind/2.9.8/jackson-databind-2.9.8.jar</p> <p> Dependency Hierarchy: - :x: **jackson-databind-2.9.8.jar** (Vulnerable Library) <p>Found in base branch: <b>master</b></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary> <p> FasterXML jackson-databind 2.x before 2.9.10.8 mishandles the interaction between serialization gadgets and typing, related to org.docx4j.org.apache.xalan.lib.sql.JNDIConnectionPool. <p>Publish Date: 2021-01-07 <p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-36183>CVE-2020-36183</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>8.1</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: High - Privileges Required: None - User Interaction: None - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: High - Integrity Impact: High - Availability Impact: High </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://github.com/FasterXML/jackson-databind/issues/3003">https://github.com/FasterXML/jackson-databind/issues/3003</a></p> <p>Release Date: 2021-01-07</p> <p>Fix Resolution: com.fasterxml.jackson.core:jackson-databind:2.9.10.8</p> </p> </details> <p></p> *** :rescue_worker_helmet: Automatic Remediation is available for this issue <!-- <REMEDIATE>{"isOpenPROnVulnerability":true,"isPackageBased":true,"isDefaultBranch":true,"packages":[{"packageType":"Java","groupId":"com.fasterxml.jackson.core","packageName":"jackson-databind","packageVersion":"2.9.8","packageFilePaths":["/tmp/ws-ua_20220312003410_KQQKCS/archiveExtraction_BELGGC/FUIDAN/20220312003410/snowflake-jdbc_depth_0/dependencies/arrow-vector-0.15.1/META-INF/maven/org.apache.arrow/arrow-vector/pom.xml"],"isTransitiveDependency":false,"dependencyTree":"com.fasterxml.jackson.core:jackson-databind:2.9.8","isMinimumFixVersionAvailable":true,"minimumFixVersion":"com.fasterxml.jackson.core:jackson-databind:2.9.10.8","isBinary":false}],"baseBranches":["master"],"vulnerabilityIdentifier":"CVE-2020-36183","vulnerabilityDetails":"FasterXML jackson-databind 2.x before 2.9.10.8 mishandles the interaction between serialization gadgets and typing, related to org.docx4j.org.apache.xalan.lib.sql.JNDIConnectionPool.","vulnerabilityUrl":"https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-36183","cvss3Severity":"high","cvss3Score":"8.1","cvss3Metrics":{"A":"High","AC":"High","PR":"None","S":"Unchanged","C":"High","UI":"None","AV":"Network","I":"High"},"extraData":{}}</REMEDIATE> -->
non_process
snow cve high detected in jackson databind jar autoclosed cve high severity vulnerability vulnerable library jackson databind jar general data binding functionality for jackson works on core streaming api library home page a href path to dependency file tmp ws ua kqqkcs archiveextraction belggc fuidan snowflake jdbc depth dependencies arrow vector meta inf maven org apache arrow arrow vector pom xml path to vulnerable library sitory com fasterxml jackson core jackson databind jackson databind jar dependency hierarchy x jackson databind jar vulnerable library found in base branch master vulnerability details fasterxml jackson databind x before mishandles the interaction between serialization gadgets and typing related to org org apache xalan lib sql jndiconnectionpool publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity high privileges required none user interaction none scope unchanged impact metrics confidentiality impact high integrity impact high availability impact high for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution com fasterxml jackson core jackson databind rescue worker helmet automatic remediation is available for this issue isopenpronvulnerability true ispackagebased true isdefaultbranch true packages istransitivedependency false dependencytree com fasterxml jackson core jackson databind isminimumfixversionavailable true minimumfixversion com fasterxml jackson core jackson databind isbinary false basebranches vulnerabilityidentifier cve vulnerabilitydetails fasterxml jackson databind x before mishandles the interaction between serialization gadgets and typing related to org org apache xalan lib sql jndiconnectionpool vulnerabilityurl
0
8,782
11,902,275,144
IssuesEvent
2020-03-30 13:44:41
MHRA/products
https://api.github.com/repos/MHRA/products
closed
AUTO BATCH PROCESS - Indexing Strategy
EPIC - Auto Batch Process :oncoming_automobile: HIGH PRIORITY :arrow_double_up: TASK :rescue_worker_helmet:
### User want As a user I want to see up to date documents on the products website So I can make informed decisions **Customer acceptance criteria** **Technical acceptance criteria** An understanding and plan of how the search index will be kept up to date with changes. If both new files and deleted files get automatically updated in the index successfully as part of those tickets then this is probably not needed. Or the indexer may be set to run once a week in order to synchronize what's actually in the container in case they get out of sync. **Data acceptance criteria** **Testing acceptance criteria** **Size** S **Value** **Effort** ### Exit Criteria met - [ ] Backlog - [ ] Discovery - [ ] DUXD - [ ] Development - [ ] Quality Assurance - [ ] Release and Validate
1.0
AUTO BATCH PROCESS - Indexing Strategy - ### User want As a user I want to see up to date documents on the products website So I can make informed decisions **Customer acceptance criteria** **Technical acceptance criteria** An understanding and plan of how the search index will be kept up to date with changes. If both new files and deleted files get automatically updated in the index successfully as part of those tickets then this is probably not needed. Or the indexer may be set to run once a week in order to synchronize what's actually in the container in case they get out of sync. **Data acceptance criteria** **Testing acceptance criteria** **Size** S **Value** **Effort** ### Exit Criteria met - [ ] Backlog - [ ] Discovery - [ ] DUXD - [ ] Development - [ ] Quality Assurance - [ ] Release and Validate
process
auto batch process indexing strategy user want as a user i want to see up to date documents on the products website so i can make informed decisions customer acceptance criteria technical acceptance criteria an understanding and plan of how the search index will be kept up to date with changes if both new files and deleted files get automatically updated in the index successfully as part of those tickets then this is probably not needed or the indexer may be set to run once a week in order to synchronize what s actually in the container in case they get out of sync data acceptance criteria testing acceptance criteria size s value effort exit criteria met backlog discovery duxd development quality assurance release and validate
1
46,225
13,152,203,798
IssuesEvent
2020-08-09 20:53:21
Jacksole/Learning-JavaScript
https://api.github.com/repos/Jacksole/Learning-JavaScript
closed
CVE-2012-6708 (Medium) detected in jquery-1.4.4.min.js, jquery-1.7.1.min.js
security vulnerability
## CVE-2012-6708 - Medium Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Libraries - <b>jquery-1.4.4.min.js</b>, <b>jquery-1.7.1.min.js</b></p></summary> <p> <details><summary><b>jquery-1.4.4.min.js</b></p></summary> <p>JavaScript library for DOM operations</p> <p>Library home page: <a href="https://cdnjs.cloudflare.com/ajax/libs/jquery/1.4.4/jquery.min.js">https://cdnjs.cloudflare.com/ajax/libs/jquery/1.4.4/jquery.min.js</a></p> <p>Path to dependency file: /tmp/ws-scm/Learning-JavaScript/AngularJS/storfront/node_modules/selenium-webdriver/lib/test/data/mousePositionTracker.html</p> <p>Path to vulnerable library: /Learning-JavaScript/AngularJS/storfront/node_modules/selenium-webdriver/lib/test/data/js/jquery-1.4.4.min.js</p> <p> Dependency Hierarchy: - :x: **jquery-1.4.4.min.js** (Vulnerable Library) </details> <details><summary><b>jquery-1.7.1.min.js</b></p></summary> <p>JavaScript library for DOM operations</p> <p>Library home page: <a href="https://cdnjs.cloudflare.com/ajax/libs/jquery/1.7.1/jquery.min.js">https://cdnjs.cloudflare.com/ajax/libs/jquery/1.7.1/jquery.min.js</a></p> <p>Path to dependency file: /tmp/ws-scm/Learning-JavaScript/React/mern-todo-app/node_modules/sockjs/examples/multiplex/index.html</p> <p>Path to vulnerable library: /Learning-JavaScript/React/mern-todo-app/node_modules/sockjs/examples/multiplex/index.html,/Learning-JavaScript/React/fullstack_app/client/node_modules/sockjs/examples/multiplex/index.html,/Learning-JavaScript/AngularJS/storfront/node_modules/sockjs/examples/express/index.html,/Learning-JavaScript/React/mern-todo-app/node_modules/sockjs/examples/express-3.x/index.html,/Learning-JavaScript/React/fullstack_app/client/node_modules/sockjs/examples/express-3.x/index.html,/Learning-JavaScript/React/mern-todo-app/node_modules/sockjs/examples/echo/index.html,/Learning-JavaScript/React/react-form-validation-demo/node_modules/sockjs/examples/echo/index.html,/Learning-JavaScript/React/mern-todo-app/node_modules/sockjs/examples/hapi/html/index.html,/Learning-JavaScript/React/fullstack_app/client/node_modules/sockjs/examples/hapi/html/index.html,/Learning-JavaScript/React/fullstack_app/client/node_modules/sockjs/examples/echo/index.html,/Learning-JavaScript/AngularJS/storfront/node_modules/sockjs/examples/hapi/html/index.html,/Learning-JavaScript/React/react-form-validation-demo/node_modules/sockjs/examples/express/index.html,/Learning-JavaScript/React/react-form-validation-demo/node_modules/sockjs/examples/multiplex/index.html,/Learning-JavaScript/AngularJS/storfront/node_modules/sockjs/examples/multiplex/index.html,/Learning-JavaScript/AngularJS/storfront/node_modules/sockjs/examples/echo/index.html,/Learning-JavaScript/AngularJS/storfront/node_modules/sockjs/examples/express-3.x/index.html,/Learning-JavaScript/React/react-form-validation-demo/node_modules/sockjs/examples/hapi/html/index.html,/Learning-JavaScript/React/react-form-validation-demo/node_modules/sockjs/examples/express-3.x/index.html</p> <p> Dependency Hierarchy: - :x: **jquery-1.7.1.min.js** (Vulnerable Library) </details> <p>Found in HEAD commit: <a href="https://github.com/Jacksole/Learning-JavaScript/commit/c9ca295725f33eb0d8e03f930a5da88ebb01cedf">c9ca295725f33eb0d8e03f930a5da88ebb01cedf</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary> <p> jQuery before 1.9.0 is vulnerable to Cross-site Scripting (XSS) attacks. The jQuery(strInput) function does not differentiate selectors from HTML in a reliable fashion. In vulnerable versions, jQuery determined whether the input was HTML by looking for the '<' character anywhere in the string, giving attackers more flexibility when attempting to construct a malicious payload. In fixed versions, jQuery only deems the input to be HTML if it explicitly starts with the '<' character, limiting exploitability only to attackers who can control the beginning of a string, which is far less common. <p>Publish Date: 2018-01-18 <p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2012-6708>CVE-2012-6708</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>6.1</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: Low - Privileges Required: None - User Interaction: Required - Scope: Changed - Impact Metrics: - Confidentiality Impact: Low - Integrity Impact: Low - Availability Impact: None </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://nvd.nist.gov/vuln/detail/CVE-2012-6708">https://nvd.nist.gov/vuln/detail/CVE-2012-6708</a></p> <p>Release Date: 2018-01-18</p> <p>Fix Resolution: jQuery - v1.9.0</p> </p> </details> <p></p> *** Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
index: True
CVE-2012-6708 (Medium) detected in jquery-1.4.4.min.js, jquery-1.7.1.min.js - ## CVE-2012-6708 - Medium Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Libraries - <b>jquery-1.4.4.min.js</b>, <b>jquery-1.7.1.min.js</b></p></summary> <p> <details><summary><b>jquery-1.4.4.min.js</b></p></summary> <p>JavaScript library for DOM operations</p> <p>Library home page: <a href="https://cdnjs.cloudflare.com/ajax/libs/jquery/1.4.4/jquery.min.js">https://cdnjs.cloudflare.com/ajax/libs/jquery/1.4.4/jquery.min.js</a></p> <p>Path to dependency file: /tmp/ws-scm/Learning-JavaScript/AngularJS/storfront/node_modules/selenium-webdriver/lib/test/data/mousePositionTracker.html</p> <p>Path to vulnerable library: /Learning-JavaScript/AngularJS/storfront/node_modules/selenium-webdriver/lib/test/data/js/jquery-1.4.4.min.js</p> <p> Dependency Hierarchy: - :x: **jquery-1.4.4.min.js** (Vulnerable Library) </details> <details><summary><b>jquery-1.7.1.min.js</b></p></summary> <p>JavaScript library for DOM operations</p> <p>Library home page: <a href="https://cdnjs.cloudflare.com/ajax/libs/jquery/1.7.1/jquery.min.js">https://cdnjs.cloudflare.com/ajax/libs/jquery/1.7.1/jquery.min.js</a></p> <p>Path to dependency file: /tmp/ws-scm/Learning-JavaScript/React/mern-todo-app/node_modules/sockjs/examples/multiplex/index.html</p> <p>Path to vulnerable library: 
/Learning-JavaScript/React/mern-todo-app/node_modules/sockjs/examples/multiplex/index.html,/Learning-JavaScript/React/fullstack_app/client/node_modules/sockjs/examples/multiplex/index.html,/Learning-JavaScript/AngularJS/storfront/node_modules/sockjs/examples/express/index.html,/Learning-JavaScript/React/mern-todo-app/node_modules/sockjs/examples/express-3.x/index.html,/Learning-JavaScript/React/fullstack_app/client/node_modules/sockjs/examples/express-3.x/index.html,/Learning-JavaScript/React/mern-todo-app/node_modules/sockjs/examples/echo/index.html,/Learning-JavaScript/React/react-form-validation-demo/node_modules/sockjs/examples/echo/index.html,/Learning-JavaScript/React/mern-todo-app/node_modules/sockjs/examples/hapi/html/index.html,/Learning-JavaScript/React/fullstack_app/client/node_modules/sockjs/examples/hapi/html/index.html,/Learning-JavaScript/React/fullstack_app/client/node_modules/sockjs/examples/echo/index.html,/Learning-JavaScript/AngularJS/storfront/node_modules/sockjs/examples/hapi/html/index.html,/Learning-JavaScript/React/react-form-validation-demo/node_modules/sockjs/examples/express/index.html,/Learning-JavaScript/React/react-form-validation-demo/node_modules/sockjs/examples/multiplex/index.html,/Learning-JavaScript/AngularJS/storfront/node_modules/sockjs/examples/multiplex/index.html,/Learning-JavaScript/AngularJS/storfront/node_modules/sockjs/examples/echo/index.html,/Learning-JavaScript/AngularJS/storfront/node_modules/sockjs/examples/express-3.x/index.html,/Learning-JavaScript/React/react-form-validation-demo/node_modules/sockjs/examples/hapi/html/index.html,/Learning-JavaScript/React/react-form-validation-demo/node_modules/sockjs/examples/express-3.x/index.html</p> <p> Dependency Hierarchy: - :x: **jquery-1.7.1.min.js** (Vulnerable Library) </details> <p>Found in HEAD commit: <a href="https://github.com/Jacksole/Learning-JavaScript/commit/c9ca295725f33eb0d8e03f930a5da88ebb01cedf">c9ca295725f33eb0d8e03f930a5da88ebb01cedf</a></p> </p> 
</details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary> <p> jQuery before 1.9.0 is vulnerable to Cross-site Scripting (XSS) attacks. The jQuery(strInput) function does not differentiate selectors from HTML in a reliable fashion. In vulnerable versions, jQuery determined whether the input was HTML by looking for the '<' character anywhere in the string, giving attackers more flexibility when attempting to construct a malicious payload. In fixed versions, jQuery only deems the input to be HTML if it explicitly starts with the '<' character, limiting exploitability only to attackers who can control the beginning of a string, which is far less common. <p>Publish Date: 2018-01-18 <p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2012-6708>CVE-2012-6708</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>6.1</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: Low - Privileges Required: None - User Interaction: Required - Scope: Changed - Impact Metrics: - Confidentiality Impact: Low - Integrity Impact: Low - Availability Impact: None </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. 
</p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://nvd.nist.gov/vuln/detail/CVE-2012-6708">https://nvd.nist.gov/vuln/detail/CVE-2012-6708</a></p> <p>Release Date: 2018-01-18</p> <p>Fix Resolution: jQuery - v1.9.0</p> </p> </details> <p></p> *** Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
label: non_process
cve medium detected in jquery min js jquery min js cve medium severity vulnerability vulnerable libraries jquery min js jquery min js jquery min js javascript library for dom operations library home page a href path to dependency file tmp ws scm learning javascript angularjs storfront node modules selenium webdriver lib test data mousepositiontracker html path to vulnerable library learning javascript angularjs storfront node modules selenium webdriver lib test data js jquery min js dependency hierarchy x jquery min js vulnerable library jquery min js javascript library for dom operations library home page a href path to dependency file tmp ws scm learning javascript react mern todo app node modules sockjs examples multiplex index html path to vulnerable library learning javascript react mern todo app node modules sockjs examples multiplex index html learning javascript react fullstack app client node modules sockjs examples multiplex index html learning javascript angularjs storfront node modules sockjs examples express index html learning javascript react mern todo app node modules sockjs examples express x index html learning javascript react fullstack app client node modules sockjs examples express x index html learning javascript react mern todo app node modules sockjs examples echo index html learning javascript react react form validation demo node modules sockjs examples echo index html learning javascript react mern todo app node modules sockjs examples hapi html index html learning javascript react fullstack app client node modules sockjs examples hapi html index html learning javascript react fullstack app client node modules sockjs examples echo index html learning javascript angularjs storfront node modules sockjs examples hapi html index html learning javascript react react form validation demo node modules sockjs examples express index html learning javascript react react form validation demo node modules sockjs examples multiplex index html learning 
javascript angularjs storfront node modules sockjs examples multiplex index html learning javascript angularjs storfront node modules sockjs examples echo index html learning javascript angularjs storfront node modules sockjs examples express x index html learning javascript react react form validation demo node modules sockjs examples hapi html index html learning javascript react react form validation demo node modules sockjs examples express x index html dependency hierarchy x jquery min js vulnerable library found in head commit a href vulnerability details jquery before is vulnerable to cross site scripting xss attacks the jquery strinput function does not differentiate selectors from html in a reliable fashion in vulnerable versions jquery determined whether the input was html by looking for the character anywhere in the string giving attackers more flexibility when attempting to construct a malicious payload in fixed versions jquery only deems the input to be html if it explicitly starts with the character limiting exploitability only to attackers who can control the beginning of a string which is far less common publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction required scope changed impact metrics confidentiality impact low integrity impact low availability impact none for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution jquery step up your open source security game with whitesource
binary_label: 0
Unnamed: 0: 378,807
id: 26,339,054,256
type: IssuesEvent
created_at: 2023-01-10 16:19:25
repo: palomachain/docs
repo_url: https://api.github.com/repos/palomachain/docs
action: closed
title: BUG: The Governance Page returns a 404 Error.
labels: bug documentation
The Governance page: https://docs.palomachain.com/guide/maintain/governance/governance.html returns a 404 error. Are we missing this content?
index: 1.0
BUG: The Governance Page returns a 404 Error. - The Governance page: https://docs.palomachain.com/guide/maintain/governance/governance.html returns a 404 error. Are we missing this content?
label: non_process
bug the governance page returns a error the governance page returns a error are we missing this content
binary_label: 0
Unnamed: 0: 3,248
id: 6,313,732,653
type: IssuesEvent
created_at: 2017-07-24 08:58:58
repo: itsyouonline/identityserver
repo_url: https://api.github.com/repos/itsyouonline/identityserver
action: closed
title: Contract signatures on blockchain
labels: process_duplicate type_feature
Migrate the signature storage for contracts from our mongodb to a blockchain; - Start by doing this in ethereum - Make sure we can easily migrate to our own rivine based solution in the future
index: 1.0
Contract signatures on blockchain - Migrate the signature storage for contracts from our mongodb to a blockchain; - Start by doing this in ethereum - Make sure we can easily migrate to our own rivine based solution in the future
label: process
contract signatures on blockchain migrate the signature storage for contracts from our mongodb to a blockchain start by doing this in ethereum make sure we can easily migrate to our own rivine based solution in the future
binary_label: 1
Unnamed: 0: 565,668
id: 16,766,824,617
type: IssuesEvent
created_at: 2021-06-14 09:52:18
repo: ballerina-platform/ballerina-lang
repo_url: https://api.github.com/repos/ballerina-platform/ballerina-lang
action: closed
title: No Such Field Error when using nested anonymous record arrays with element type of record having quoted identifier
labels: Crash Priority/Blocker Team/jBallerina Type/Bug
**Description:** The issue happens when the record type is defined in a separate module. This can happen record full qualified name having quoted identifiers as well. We can reproduce with the following code. Module 1 ``` public type Person\$ record {| string name = "default"; |}; ``` Module 2 ``` public function main() { var e = getAnonFromPerson(); } function getAnonFromPerson() returns record {| string name; |}[] { mod1:Person\$[] arr = []; return arr; } ``` Error ``` [2021-06-11 10:17:15,684] SEVERE {b7a.log.crash} - $typedesce$Person\$ java.lang.NoSuchFieldError: $typedesce$Person\$ at waruna.project1.0_1_0.main.getAnonFromPerson(main.bal:23) at waruna.project1.0_1_0.main.main(main.bal:4) at waruna.project1.0_1_0.$_init.$lambda$main$(project1) at io.ballerina.runtime.internal.scheduling.SchedulerItem.execute(Scheduler.java:597) at io.ballerina.runtime.internal.scheduling.Scheduler.run(Scheduler.java:327) at io.ballerina.runtime.internal.scheduling.Scheduler.runSafely(Scheduler.java:295) at java.base/java.lang.Thread.run(Thread.java:834) ```
index: 1.0
No Such Field Error when using nested anonymous record arrays with element type of record having quoted identifier - **Description:** The issue happens when the record type is defined in a separate module. This can happen record full qualified name having quoted identifiers as well. We can reproduce with the following code. Module 1 ``` public type Person\$ record {| string name = "default"; |}; ``` Module 2 ``` public function main() { var e = getAnonFromPerson(); } function getAnonFromPerson() returns record {| string name; |}[] { mod1:Person\$[] arr = []; return arr; } ``` Error ``` [2021-06-11 10:17:15,684] SEVERE {b7a.log.crash} - $typedesce$Person\$ java.lang.NoSuchFieldError: $typedesce$Person\$ at waruna.project1.0_1_0.main.getAnonFromPerson(main.bal:23) at waruna.project1.0_1_0.main.main(main.bal:4) at waruna.project1.0_1_0.$_init.$lambda$main$(project1) at io.ballerina.runtime.internal.scheduling.SchedulerItem.execute(Scheduler.java:597) at io.ballerina.runtime.internal.scheduling.Scheduler.run(Scheduler.java:327) at io.ballerina.runtime.internal.scheduling.Scheduler.runSafely(Scheduler.java:295) at java.base/java.lang.Thread.run(Thread.java:834) ```
label: non_process
no such field error when using nested anonymous record arrays with element type of record having quoted identifier description the issue happens when the record type is defined in a separate module this can happen record full qualified name having quoted identifiers as well we can reproduce with the following code module public type person record string name default module public function main var e getanonfromperson function getanonfromperson returns record string name person arr return arr error severe log crash typedesce person java lang nosuchfielderror typedesce person at waruna main getanonfromperson main bal at waruna main main main bal at waruna init lambda main at io ballerina runtime internal scheduling scheduleritem execute scheduler java at io ballerina runtime internal scheduling scheduler run scheduler java at io ballerina runtime internal scheduling scheduler runsafely scheduler java at java base java lang thread run thread java
binary_label: 0
Unnamed: 0: 671,566
id: 22,767,157,387
type: IssuesEvent
created_at: 2022-07-08 06:14:52
repo: bitfoundation/bitplatform
repo_url: https://api.github.com/repos/bitfoundation/bitplatform
action: closed
title: Incorrect direction of the months' cells in `Jalali BitDatePicker` component
labels: bug area / components high priority
As shown in the picture below, the direction of the months' cells in the `Jalali BitDatePicker` is `ltr`. This direction causes the months' names to be written from left to right therefore the beginning of their names is displayed with a dotted line. ![freesnippingtool com_capture_20220707202203](https://user-images.githubusercontent.com/48528003/177819689-d0f66dd1-77de-44ee-a9a8-0a271fc2c6d6.png) It is expected the months' names be written from right to left when the calendar is set to the `Jalali`.
index: 1.0
Incorrect direction of the months' cells in `Jalali BitDatePicker` component - As shown in the picture below, the direction of the months' cells in the `Jalali BitDatePicker` is `ltr`. This direction causes the months' names to be written from left to right therefore the beginning of their names is displayed with a dotted line. ![freesnippingtool com_capture_20220707202203](https://user-images.githubusercontent.com/48528003/177819689-d0f66dd1-77de-44ee-a9a8-0a271fc2c6d6.png) It is expected the months' names be written from right to left when the calendar is set to the `Jalali`.
label: non_process
incorrect direction of the months cells in jalali bitdatepicker component as shown in the picture below the direction of the months cells in the jalali bitdatepicker is ltr this direction causes the months names to be written from left to right therefore the beginning of their names is displayed with a dotted line it is expected the months names be written from right to left when the calendar is set to the jalali
binary_label: 0
Unnamed: 0: 692
id: 3,184,364,221
type: IssuesEvent
created_at: 2015-09-27 09:18:55
repo: sysown/proxysql-0.2
repo_url: https://api.github.com/repos/sysown/proxysql-0.2
action: opened
title: Add the ability to match queries against their digest
labels: ADMIN QUERY PROCESSOR
## Why? When processing query rules, ProxySQL is able to match queries using a regex on the query itself. If the query itself has a lot of parameters (long INSERT for example) this could be unnecessary expensive, and from regex against the query it is not possible to distinguish between parameters or not. ## What * [ ] add a new column in myql_query_rules : match_digest * [ ] support the new field adding a new engine * [ ] document it
index: 1.0
Add the ability to match queries against their digest - ## Why? When processing query rules, ProxySQL is able to match queries using a regex on the query itself. If the query itself has a lot of parameters (long INSERT for example) this could be unnecessary expensive, and from regex against the query it is not possible to distinguish between parameters or not. ## What * [ ] add a new column in myql_query_rules : match_digest * [ ] support the new field adding a new engine * [ ] document it
label: process
add the ability to match queries against their digest why when processing query rules proxysql is able to match queries using a regex on the query itself if the query itself has a lot of parameters long insert for example this could be unnecessary expensive and from regex against the query it is not possible to distinguish between parameters or not what add a new column in myql query rules match digest support the new field adding a new engine document it
binary_label: 1
Unnamed: 0: 275,682
id: 23,930,247,205
type: IssuesEvent
created_at: 2022-09-10 12:36:14
repo: kubernetes/kubernetes
repo_url: https://api.github.com/repos/kubernetes/kubernetes
action: closed
title: We need a [Disruptive] test that is capable of exposing deep write latency bugs for the underlying K8s APIServer Storage implementation.
labels: kind/bug sig/testing lifecycle/rotten needs-triage
### What happened? We missed some critical test coverage in 1.22 and verified etcd 3.5 which has several issues https://groups.google.com/a/kubernetes.io/g/dev/c/B7gJs88XtQc/m/rSgNOzV2BwAJ?utm_medium=email&utm_source=footer&pli=1 ### What did you expect to happen? We have a test in place that can push etcd to its limits and send SIGKILLs somehow. ### How can we reproduce it (as minimally and precisely as possible)? Were not 100% sure yet but this test should probably: - have a daemonset which continuosly updates annotations in an object - poll the object for changes - write the current visible state of the world back to this object with a key identifying the observing node's view of the object - confirm that over a period of 1000s of transactions, no node falls out of sync wrt the state of this object ### Anything else we need to know? For the motivation for this, see - https://github.com/etcd-io/etcd/issues/13913 - https://github.com/etcd-io/etcd/pull/13854 ### Kubernetes version 1.22 / etcd 3.5.1/2 ### Cloud provider any ### OS version any ### Install tools <details> </details> ### Container runtime (CRI) and version (if applicable) <details> </details> ### Related plugins (CNI, CSI, ...) and versions (if applicable) <details> </details>
index: 1.0
We need a [Disruptive] test that is capable of exposing deep write latency bugs for the underlying K8s APIServer Storage implementation. - ### What happened? We missed some critical test coverage in 1.22 and verified etcd 3.5 which has several issues https://groups.google.com/a/kubernetes.io/g/dev/c/B7gJs88XtQc/m/rSgNOzV2BwAJ?utm_medium=email&utm_source=footer&pli=1 ### What did you expect to happen? We have a test in place that can push etcd to its limits and send SIGKILLs somehow. ### How can we reproduce it (as minimally and precisely as possible)? Were not 100% sure yet but this test should probably: - have a daemonset which continuosly updates annotations in an object - poll the object for changes - write the current visible state of the world back to this object with a key identifying the observing node's view of the object - confirm that over a period of 1000s of transactions, no node falls out of sync wrt the state of this object ### Anything else we need to know? For the motivation for this, see - https://github.com/etcd-io/etcd/issues/13913 - https://github.com/etcd-io/etcd/pull/13854 ### Kubernetes version 1.22 / etcd 3.5.1/2 ### Cloud provider any ### OS version any ### Install tools <details> </details> ### Container runtime (CRI) and version (if applicable) <details> </details> ### Related plugins (CNI, CSI, ...) and versions (if applicable) <details> </details>
label: non_process
we need a test that is capable of exposing deep write latency bugs for the underlying apiserver storage implementation what happened we missed some critical test coverage in and verified etcd which has several issues what did you expect to happen we have a test in place that can push etcd to its limits and send sigkills somehow how can we reproduce it as minimally and precisely as possible were not sure yet but this test should probably have a daemonset which continuosly updates annotations in an object poll the object for changes write the current visible state of the world back to this object with a key identifying the observing node s view of the object confirm that over a period of of transactions no node falls out of sync wrt the state of this object anything else we need to know for the motivation for this see kubernetes version etcd cloud provider any os version any install tools container runtime cri and version if applicable related plugins cni csi and versions if applicable
binary_label: 0
Unnamed: 0: 33,897
id: 12,225,525,778
type: IssuesEvent
created_at: 2020-05-03 05:53:10
repo: ssobue/neo4j-demo
repo_url: https://api.github.com/repos/ssobue/neo4j-demo
action: closed
title: CVE-2017-18640 (High) detected in snakeyaml-1.25.jar
labels: security vulnerability
## CVE-2017-18640 - High Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>snakeyaml-1.25.jar</b></p></summary> <p>YAML 1.1 parser and emitter for Java</p> <p>Library home page: <a href="http://www.snakeyaml.org">http://www.snakeyaml.org</a></p> <p>Path to dependency file: /tmp/ws-scm/neo4j-demo/pom.xml</p> <p>Path to vulnerable library: /tmp/ws-ua_20200503054151_PNVNXP/downloadResource_ASHMDO/20200503054209/snakeyaml-1.25.jar</p> <p> Dependency Hierarchy: - spring-boot-starter-actuator-2.2.6.RELEASE.jar (Root Library) - spring-boot-starter-2.2.6.RELEASE.jar - :x: **snakeyaml-1.25.jar** (Vulnerable Library) <p>Found in HEAD commit: <a href="https://github.com/ssobue/neo4j-demo/commit/fc0e5435ff6be5d71b25116f20b98e464dcb3b3c">fc0e5435ff6be5d71b25116f20b98e464dcb3b3c</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary> <p> The Alias feature in SnakeYAML 1.18 allows entity expansion during a load operation, a related issue to CVE-2003-1564. <p>Publish Date: 2019-12-12 <p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2017-18640>CVE-2017-18640</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>7.5</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: Low - Privileges Required: None - User Interaction: None - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: None - Integrity Impact: None - Availability Impact: High </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. 
</p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://bitbucket.org/asomov/snakeyaml/commits/da11ddbd91c1f8392ea932b37fa48110fa54ed8c">https://bitbucket.org/asomov/snakeyaml/commits/da11ddbd91c1f8392ea932b37fa48110fa54ed8c</a></p> <p>Release Date: 2020-03-08</p> <p>Fix Resolution: 1.26</p> </p> </details> <p></p> *** Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
index: True
CVE-2017-18640 (High) detected in snakeyaml-1.25.jar - ## CVE-2017-18640 - High Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>snakeyaml-1.25.jar</b></p></summary> <p>YAML 1.1 parser and emitter for Java</p> <p>Library home page: <a href="http://www.snakeyaml.org">http://www.snakeyaml.org</a></p> <p>Path to dependency file: /tmp/ws-scm/neo4j-demo/pom.xml</p> <p>Path to vulnerable library: /tmp/ws-ua_20200503054151_PNVNXP/downloadResource_ASHMDO/20200503054209/snakeyaml-1.25.jar</p> <p> Dependency Hierarchy: - spring-boot-starter-actuator-2.2.6.RELEASE.jar (Root Library) - spring-boot-starter-2.2.6.RELEASE.jar - :x: **snakeyaml-1.25.jar** (Vulnerable Library) <p>Found in HEAD commit: <a href="https://github.com/ssobue/neo4j-demo/commit/fc0e5435ff6be5d71b25116f20b98e464dcb3b3c">fc0e5435ff6be5d71b25116f20b98e464dcb3b3c</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary> <p> The Alias feature in SnakeYAML 1.18 allows entity expansion during a load operation, a related issue to CVE-2003-1564. <p>Publish Date: 2019-12-12 <p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2017-18640>CVE-2017-18640</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>7.5</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: Low - Privileges Required: None - User Interaction: None - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: None - Integrity Impact: None - Availability Impact: High </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. 
</p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://bitbucket.org/asomov/snakeyaml/commits/da11ddbd91c1f8392ea932b37fa48110fa54ed8c">https://bitbucket.org/asomov/snakeyaml/commits/da11ddbd91c1f8392ea932b37fa48110fa54ed8c</a></p> <p>Release Date: 2020-03-08</p> <p>Fix Resolution: 1.26</p> </p> </details> <p></p> *** Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
label: non_process
cve high detected in snakeyaml jar cve high severity vulnerability vulnerable library snakeyaml jar yaml parser and emitter for java library home page a href path to dependency file tmp ws scm demo pom xml path to vulnerable library tmp ws ua pnvnxp downloadresource ashmdo snakeyaml jar dependency hierarchy spring boot starter actuator release jar root library spring boot starter release jar x snakeyaml jar vulnerable library found in head commit a href vulnerability details the alias feature in snakeyaml allows entity expansion during a load operation a related issue to cve publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction none scope unchanged impact metrics confidentiality impact none integrity impact none availability impact high for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution step up your open source security game with whitesource
binary_label: 0
Unnamed: 0: 12,838
id: 15,223,146,750
type: IssuesEvent
created_at: 2021-02-18 01:58:22
repo: qgis/QGIS
repo_url: https://api.github.com/repos/qgis/QGIS
action: closed
title: Problem to generate watershed with the Fill sinks (wang & liu) comand
labels: Bug Feedback Processing stale
Hello guys, a happy 2021 for everyone. I am trying to use the tool Fill sinks (wang & liu) to trace a watershed from an MDE.tif file.This MDE was generated from meter by meter curves at QGIS It turns out that the output files are not generated and the following error message appears: "2021-01-04T13:06:55 INFO Application state: QGIS_PREFIX_PATH env var: C:/PROGRA~1/QGIS 3.10/apps/qgis-ltr Prefix: C:/PROGRA~1/QGIS 3.10/apps/qgis-ltr Plugin Path: C:/PROGRA~1/QGIS 3.10/apps/qgis-ltr/plugins Package Data Path: C:/PROGRA~1/QGIS 3.10/apps/qgis-ltr/. Active Theme Name: default Active Theme Path: C:/PROGRA~1/QGIS 3.10/apps/qgis-ltr/./resources/themes\default\icons/ Default Theme Path: :/images/themes/default/ SVG Search Paths: C:/PROGRA~1/QGIS 3.10/apps/qgis-ltr/./svg/ C:/Users/jmlsi/AppData/Roaming/QGIS/QGIS3\profiles\default/svg/ User DB Path: C:/PROGRA~1/QGIS 3.10/apps/qgis-ltr/./resources/qgis.db Auth DB Path: C:/Users/jmlsi/AppData/Roaming/QGIS/QGIS3\profiles\default/qgis-auth.db 2021-01-04T13:07:05 WARNING Codec not found. Falling back to system locale 2021-01-04T13:07:05 WARNING Codec not found. Falling back to system locale" Could someone help me with this error? Before executing the command I indicate that I want the output files to be saved. I am using QGIS 3.10.4-A-Coruña with GDAL/OGR 3.0.4 in the windows 10-64 bits. The link for may mde is here https://drive.google.com/file/d/1BqtKKu_yr4di4oYSQVUPshdKQ2lsjghQ/view?usp=sharing or https://we.tl/t-YuBOp1tADo The link for LOG.TXT is here https://we.tl/t-70ZRXKxxAz -- -- Thanks João Marcelo Lopes
index: 1.0
Problem to generate watershed with the Fill sinks (wang & liu) comand - Hello guys, a happy 2021 for everyone. I am trying to use the tool Fill sinks (wang & liu) to trace a watershed from an MDE.tif file.This MDE was generated from meter by meter curves at QGIS It turns out that the output files are not generated and the following error message appears: "2021-01-04T13:06:55 INFO Application state: QGIS_PREFIX_PATH env var: C:/PROGRA~1/QGIS 3.10/apps/qgis-ltr Prefix: C:/PROGRA~1/QGIS 3.10/apps/qgis-ltr Plugin Path: C:/PROGRA~1/QGIS 3.10/apps/qgis-ltr/plugins Package Data Path: C:/PROGRA~1/QGIS 3.10/apps/qgis-ltr/. Active Theme Name: default Active Theme Path: C:/PROGRA~1/QGIS 3.10/apps/qgis-ltr/./resources/themes\default\icons/ Default Theme Path: :/images/themes/default/ SVG Search Paths: C:/PROGRA~1/QGIS 3.10/apps/qgis-ltr/./svg/ C:/Users/jmlsi/AppData/Roaming/QGIS/QGIS3\profiles\default/svg/ User DB Path: C:/PROGRA~1/QGIS 3.10/apps/qgis-ltr/./resources/qgis.db Auth DB Path: C:/Users/jmlsi/AppData/Roaming/QGIS/QGIS3\profiles\default/qgis-auth.db 2021-01-04T13:07:05 WARNING Codec not found. Falling back to system locale 2021-01-04T13:07:05 WARNING Codec not found. Falling back to system locale" Could someone help me with this error? Before executing the command I indicate that I want the output files to be saved. I am using QGIS 3.10.4-A-Coruña with GDAL/OGR 3.0.4 in the windows 10-64 bits. The link for may mde is here https://drive.google.com/file/d/1BqtKKu_yr4di4oYSQVUPshdKQ2lsjghQ/view?usp=sharing or https://we.tl/t-YuBOp1tADo The link for LOG.TXT is here https://we.tl/t-70ZRXKxxAz -- -- Thanks João Marcelo Lopes
label: process
problem to generate watershed with the fill sinks wang liu comand hello guys a happy for everyone i am trying to use the tool fill sinks wang liu to trace a watershed from an mde tif file this mde was generated from meter by meter curves at qgis it turns out that the output files are not generated and the following error message appears info application state qgis prefix path env var c progra qgis apps qgis ltr prefix c progra qgis apps qgis ltr plugin path c progra qgis apps qgis ltr plugins package data path c progra qgis apps qgis ltr active theme name default active theme path c progra qgis apps qgis ltr resources themes default icons default theme path images themes default svg search paths c progra qgis apps qgis ltr svg c users jmlsi appdata roaming qgis profiles default svg user db path c progra qgis apps qgis ltr resources qgis db auth db path c users jmlsi appdata roaming qgis profiles default qgis auth db warning codec not found falling back to system locale warning codec not found falling back to system locale could someone help me with this error before executing the command i indicate that i want the output files to be saved i am using qgis a coruña with gdal ogr in the windows bits the link for may mde is here or the link for log txt is here thanks joão marcelo lopes
1
751,351
26,241,142,530
IssuesEvent
2023-01-05 11:36:26
Simon-Initiative/oli-torus
https://api.github.com/repos/Simon-Initiative/oli-torus
closed
Sub-objectives must be able to exist under multiple different top level objectives
Enhancement Priority_2:_High Question_/_Need_Info
Erin and Hal noted that sub-objectives must be able to exist under different top level objectives. <img width="1200" alt="Screen Shot 2022-07-14 at 10 10 58 AM" src="https://user-images.githubusercontent.com/6248894/179002669-1df21235-8be5-43db-a63b-8f712e1095d5.png">
1.0
non_process
0
114,204
24,564,701,953
IssuesEvent
2022-10-13 01:04:18
gwhittemore-veracode/Veracode-GW-Training-demo
https://api.github.com/repos/gwhittemore-veracode/Veracode-GW-Training-demo
opened
CVE: 2017-3586 found in MySQL Connector/J - Version: 5.1.35 [JAVA]
Severity: Medium Veracode Dependency Scanning
Veracode Software Composition Analysis
===============================

Attribute | Details
--- | ---
Library | MySQL Connector/J
Description | JDBC Type 4 driver for MySQL
Language | JAVA
Vulnerability | Usable Expired Certificates
Vulnerability description | mysql-connector-java doesn't check the server's SSL certificate for an expiration date before it establishes the SSL connection. This would allow attackers to use an expired certificate to make requests to the server.
CVE | 2017-3586
CVSS score | 5.5
Vulnerability present in version/s | 5.1.21-5.1.41
Found library version/s | 5.1.35
Vulnerability fixed in version | 5.1.42
Library latest version | 8.0.30
Fix |

Links:
- https://sca.analysiscenter.veracode.com/vulnerability-database/libraries/1834?version=5.1.35
- https://sca.analysiscenter.veracode.com/vulnerability-database/vulnerabilities/3962
- Patch: https://github.com/mysql/mysql-connector-j/commit/aeba57264966b0fd329cdb8170ba772fd8fd4de2
1.0
non_process
0
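The vulnerability description in the record above says the driver established SSL connections without checking the server certificate's expiration date. As a rough illustration only, not the connector's actual Java code, here is a minimal Python sketch of the kind of expiry check that was skipped; `cert_expired` is a hypothetical helper, and the timestamps use the format Python's `ssl.getpeercert()` reports in its `notAfter` field.

```python
import ssl
import time

def cert_expired(not_after: str, now: float) -> bool:
    """Return True if a certificate's notAfter timestamp is in the past.

    not_after uses the format ssl.getpeercert() reports,
    e.g. "Jan  1 00:00:00 2017 GMT".
    """
    # ssl.cert_time_to_seconds converts the certificate timestamp
    # to seconds since the epoch (UTC).
    return ssl.cert_time_to_seconds(not_after) < now

# A client that performs this check would refuse the first certificate:
assert cert_expired("Jan  1 00:00:00 2017 GMT", time.time())
assert not cert_expired("Jan  1 00:00:00 2099 GMT", time.time())
```

In practice an application would not hand-roll this: enabling full certificate verification in the TLS library (or, per the record, upgrading the connector to 5.1.42 or later) performs the expiry check as part of chain validation.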
146,997
13,198,545,945
IssuesEvent
2020-08-14 02:50:49
brewcast-fm/brewcast
https://api.github.com/repos/brewcast-fm/brewcast
closed
Choose a License
documentation
An open-source license guarantees that others can use, copy, modify, and contribute back to your project without repercussions. It also protects you from sticky legal situations. We must include a license to launch brewcast as an open-source project.
1.0
non_process
0