Column summary (dtype and value/length range):

| column | dtype | min / classes | max |
|---|---|---|---|
| Unnamed: 0 | int64 | 0 | 832k |
| id | float64 | 2.49B | 32.1B |
| type | stringclasses | 1 value | |
| created_at | stringlengths | 19 | 19 |
| repo | stringlengths | 7 | 112 |
| repo_url | stringlengths | 36 | 141 |
| action | stringclasses | 3 values | |
| title | stringlengths | 1 | 744 |
| labels | stringlengths | 4 | 574 |
| body | stringlengths | 9 | 211k |
| index | stringclasses | 10 values | |
| text_combine | stringlengths | 96 | 211k |
| label | stringclasses | 2 values | |
| text | stringlengths | 96 | 188k |
| binary_label | int64 | 0 | 1 |
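Read together, the columns above describe one record per GitHub issue event; a minimal standard-library sketch (hypothetical rows, subset of columns) showing how the `index` class appears to relate to `binary_label`:

```python
from collections import Counter

# Two hypothetical rows covering a subset of the columns listed above
rows = [
    {"type": "IssuesEvent", "action": "closed", "index": "non_process", "binary_label": 0},
    {"type": "IssuesEvent", "action": "opened", "index": "process", "binary_label": 1},
]

# binary_label appears to be the 0/1 encoding of the index class
mapping = {"non_process": 0, "process": 1}
assert all(mapping[r["index"]] == r["binary_label"] for r in rows)

# Class balance is worth checking before training on a dump like this
print(Counter(r["index"] for r in rows))
```

The mapping itself is inferred from the sample rows below, not stated anywhere in the preview.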
Unnamed: 0: 16,134
id: 9,695,410,044
type: IssuesEvent
created_at: 2019-05-24 22:23:29
repo: NixOS/nixpkgs
repo_url: https://api.github.com/repos/NixOS/nixpkgs
action: closed
title: prosody stores passwords in plain text
labels: 1.severity: security 6.topic: nixos 9.needs: module (update)
## Issue description The prosody module does not have an option to set the `authentication` config. Prosody stores passwords in plain text by default. An option for the `authentication` config should be added with default value `"internal_hashed"`. To work around this, I added ``` extraConfig = '' authentication = "internal_hashed" ''; ``` to my config ### Steps to reproduce Set up prosody, register, check the user file in the data path. It will contain the password in plain text.
label: True
prosody stores passwords in plain text - ## Issue description The prosody module does not have an option to set the `authentication` config. Prosody stores passwords in plain text by default. An option for the `authentication` config should be added with default value `"internal_hashed"`. To work around this, I added ``` extraConfig = '' authentication = "internal_hashed" ''; ``` to my config ### Steps to reproduce Set up prosody, register, check the user file in the data path. It will contain the password in plain text.
index: non_process
prosody stores passwords in plain text issue description the prosody module does not have an option to set the authentication config prosody stores passwords in plain text by default an option for the authentication config should be added with default value internal hashed to work around this i added extraconfig authentication internal hashed to my config steps to reproduce set up prosody register check the user file in the data path it will contain the password in plain text
binary_label: 0
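Comparing `text_combine` with `text` in the row above suggests the `text` column was produced by lowercasing and stripping URLs, digits, and punctuation. A sketch of one plausible cleaning step (ASCII-only, an assumption, not the dataset's documented pipeline):

```python
import re

def clean_text(raw: str) -> str:
    """Lowercase and keep only ASCII letters, collapsing whitespace.

    This mirrors what the `text` column looks like in the sample rows;
    the real pipeline is not documented here, so treat it as an
    approximation (it would drop CJK text, which the dataset keeps).
    """
    text = re.sub(r"https?://\S+", " ", raw)   # drop URLs
    text = re.sub(r"[^a-zA-Z\s]", " ", text)   # drop digits and punctuation
    return re.sub(r"\s+", " ", text).lower().strip()

print(clean_text("prosody stores passwords in plain text - ## Issue description"))
# → prosody stores passwords in plain text issue description
```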
Unnamed: 0: 1,835
id: 27,016,945,987
type: IssuesEvent
created_at: 2023-02-10 20:21:55
repo: elastic/kibana
repo_url: https://api.github.com/repos/elastic/kibana
action: closed
title: [Portable Dashboards] Build API Examples
labels: Feature:Dashboard Team:Presentation loe:week impact:high Project:Portable Dashboard
### Why do we need examples? The Portable Dashboards implementation will soon be merged, and with that implementation comes one example of how to use the portable Dashboard, the Dashboards app. When looking at this from the POV of a potential consumer, using the Dashboard app as an example comes with the following limitations. 1. The Dashboard App uses **every** feature of the portable Dashboard. There are no features currently created for solution use-cases, and there are no examples of what a more pared down version of the Portable Dashboard could look like. 2. The Dashboard App and the Dashboard Container are in the same plugin. How do we consume this externally? 3. The API isn't quite documented at the moment. To answer these, as well as any unknown unknowns, we should build a Portable Dashboards Examples plugin. ### Examples of examples? This portable dashboards example plugin could potentially contain examples of: - A totally empty portable dashboard in edit mode. - A portable dashboard in view mode used to show two or more pre-configured Lens charts. (pass in a hardcoded time range here for the Lens panels to show) - A portable dashboard with a Unified Search integration which contains a test embeddable that prints out its input as JSON (credit to @nreese for this idea) - A portable dashboard with a controls integration, which uses the Controls factory (could be exposed publicly from the controls plugin, builder pattern described in https://github.com/elastic/kibana/issues/145429) to build out a hardcoded set of Controls. This could also contain a test embeddable. - _Optionally_ - Two portable dashboards side by side, both with controls implementations and Lens embeddables, demonstrating how each portable dashboard contains its own state instance. These are just some ideas, any other examples are welcome! In this process, it is likely that things will need to change on the Dashboard side, which is all fair game! 
### Documentation Additionally, the PR that closes this issue could write more detailed API documentation, and explanations above each example
label: True
[Portable Dashboards] Build API Examples - ### Why do we need examples? The Portable Dashboards implementation will soon be merged, and with that implementation comes one example of how to use the portable Dashboard, the Dashboards app. When looking at this from the POV of a potential consumer, using the Dashboard app as an example comes with the following limitations. 1. The Dashboard App uses **every** feature of the portable Dashboard. There are no features currently created for solution use-cases, and there are no examples of what a more pared down version of the Portable Dashboard could look like. 2. The Dashboard App and the Dashboard Container are in the same plugin. How do we consume this externally? 3. The API isn't quite documented at the moment. To answer these, as well as any unknown unknowns, we should build a Portable Dashboards Examples plugin. ### Examples of examples? This portable dashboards example plugin could potentially contain examples of: - A totally empty portable dashboard in edit mode. - A portable dashboard in view mode used to show two or more pre-configured Lens charts. (pass in a hardcoded time range here for the Lens panels to show) - A portable dashboard with a Unified Search integration which contains a test embeddable that prints out its input as JSON (credit to @nreese for this idea) - A portable dashboard with a controls integration, which uses the Controls factory (could be exposed publicly from the controls plugin, builder pattern described in https://github.com/elastic/kibana/issues/145429) to build out a hardcoded set of Controls. This could also contain a test embeddable. - _Optionally_ - Two portable dashboards side by side, both with controls implementations and Lens embeddables, demonstrating how each portable dashboard contains its own state instance. These are just some ideas, any other examples are welcome! In this process, it is likely that things will need to change on the Dashboard side, which is all fair game! 
### Documentation Additionally, the PR that closes this issue could write more detailed API documentation, and explanations above each example
index: non_process
build api examples why do we need examples the portable dashboards implementation will soon be merged and with that implementation comes one example of how to use the portable dashboard the dashboards app when looking at this from the pov of a potential consumer using the dashboard app as an example comes with the following limitations the dashboard app uses every feature of the portable dashboard there are no features currently created for solution use cases and there are no examples of what a more pared down version of the portable dashboard could look like the dashboard app and the dashboard container are in the same plugin how do we consume this externally the api isn t quite documented at the moment to answer these as well as any unknown unknowns we should build a portable dashboards examples plugin examples of examples this portable dashboards example plugin could potentially contain examples of a totally empty portable dashboard in edit mode a portable dashboard in view mode used to show two or more pre configured lens charts pass in a hardcoded time range here for the lens panels to show a portable dashboard with a unified search integration which contains a test embeddable that prints out its input as json credit to nreese for this idea a portable dashboard with a controls integration which uses the controls factory could be exposed publicly from the controls plugin builder pattern described in to build out a hardcoded set of controls this could also contain a test embeddable optionally two portable dashboards side by side both with controls implementations and lens embeddables demonstrating how each portable dashboard contains its own state instance these are just some ideas any other examples are welcome in this process it is likely that things will need to change on the dashboard side which is all fair game documentation additionally the pr that closes this issue could write more detailed api documentation and explanations above each example
binary_label: 0
Unnamed: 0: 33,027
id: 2,761,502,213
type: IssuesEvent
created_at: 2015-04-28 17:34:47
repo: metapolator/metapolator
repo_url: https://api.github.com/repos/metapolator/metapolator
action: closed
title: Specimens: Refactor the sample specimen texts
labels: enhancement Priority Medium UI
The sample specimen texts can be improved. They live here: https://github.com/metapolator/metapolator/blob/gh-pages/purple-pill/js/controllers/specimenController.js#L18-L45 - [ ] Note in src that it is an array of arrays, to make separators
label: 1.0
Specimens: Refactor the sample specimen texts - The sample specimen texts can be improved. They live here: https://github.com/metapolator/metapolator/blob/gh-pages/purple-pill/js/controllers/specimenController.js#L18-L45 - [ ] Note in src that it is an array of arrays, to make separators
index: non_process
specimens refactor the sample specimen texts the sample specimen texts can be improved they live here note in src that it is an array of arrays to make separators
binary_label: 0
Unnamed: 0: 40,641
id: 5,244,887,098
type: IssuesEvent
created_at: 2017-02-01 01:18:54
repo: chihaya/chihaya
repo_url: https://api.github.com/repos/chihaya/chihaya
action: closed
title: net.IP wrong usage
labels: component/frontend/http component/frontend/udp component/middleware kind/design
Hi I've noticed the code heavily rely on the length of `net.IP` variables to differentiate IPv4 and IPv6 protocols, and this is a mistake. example https://github.com/chihaya/chihaya/blob/master/middleware/hooks.go#L126 Per the doc : https://golang.org/pkg/net/#IP > Note that in this documentation, referring to an IP address as an IPv4 address or an IPv6 address is a semantic property of the address, not just the length of the byte slice: a 16-byte slice can still be an IPv4 address. Instead i suggest using ```go func IsIPv4(ip net.IP) bool { return ip.To4() != nil } ``` and ```go // According to the documentation, 16-bytes addresses can // be either protocol, but an IP returned by net.IP.To4() will // always be an IPv4, so if the result is nil we know it is an IPv6. func IsIPv6(ip net.IP) bool { return len(ip) == net.IPv6len && ip.To4() == nil } ```
label: 1.0
net.IP wrong usage - Hi I've noticed the code heavily rely on the length of `net.IP` variables to differentiate IPv4 and IPv6 protocols, and this is a mistake. example https://github.com/chihaya/chihaya/blob/master/middleware/hooks.go#L126 Per the doc : https://golang.org/pkg/net/#IP > Note that in this documentation, referring to an IP address as an IPv4 address or an IPv6 address is a semantic property of the address, not just the length of the byte slice: a 16-byte slice can still be an IPv4 address. Instead i suggest using ```go func IsIPv4(ip net.IP) bool { return ip.To4() != nil } ``` and ```go // According to the documentation, 16-bytes addresses can // be either protocol, but an IP returned by net.IP.To4() will // always be an IPv4, so if the result is nil we know it is an IPv6. func IsIPv6(ip net.IP) bool { return len(ip) == net.IPv6len && ip.To4() == nil } ```
index: non_process
net ip wrong usage hi i ve noticed the code heavily rely on the length of net ip variables to differentiate and protocols and this is a mistake example per the doc note that in this documentation referring to an ip address as an address or an address is a semantic property of the address not just the length of the byte slice a byte slice can still be an address instead i suggest using go func ip net ip bool return ip nil and go according to the documentation bytes addresses can be either protocol but an ip returned by net ip will always be an so if the result is nil we know it is an func ip net ip bool return len ip net ip nil
binary_label: 0
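The semantic-vs-length pitfall described in the net.IP row above exists outside Go too; a minimal Python sketch using the standard `ipaddress` module (not part of the original issue) makes the same point:

```python
import ipaddress

# A 16-byte, IPv6-formatted address can still be semantically IPv4.
addr = ipaddress.ip_address("::ffff:192.0.2.1")  # IPv4-mapped IPv6 notation
assert isinstance(addr, ipaddress.IPv6Address)   # by representation: IPv6
assert addr.ipv4_mapped == ipaddress.ip_address("192.0.2.1")  # by semantics: IPv4
```

As the issue argues, classify by semantics (`To4()` in Go, `ipv4_mapped` here), never by byte length alone.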
Unnamed: 0: 258,352
id: 8,169,611,658
type: IssuesEvent
created_at: 2018-08-27 02:39:53
repo: zephyrproject-rtos/zephyr
repo_url: https://api.github.com/repos/zephyrproject-rtos/zephyr
action: closed
title: fault during my timer testing
labels: bug priority: medium
hello I found a fault during my timer testing. zephyr version: 1.9.1 code: ``` _struct k_timer test_timer, test_timer2; static void test_timeout_event(os_timer *timer) { } static void test2_timeout_event(os_timer *timer) { k_timer_start(&test_timer, K_MSEC(10), K_MSEC(20)); } void test_timer(void) { k_timer_init(&test_timer, test_timeout_event, NULL); k_timer_init(&test_timer2, test2_timeout_event, NULL); k_timer_start(&test_timer, K_MSEC(10), K_MSEC(20)); while(1) { k_timer_start(&test_timer2, K_MSEC(100), 0); k_sleep(K_MSEC(1000)); } } ``` analysis: when timer1 & timer2 expired in the same tick, timer1 & timer2 will be dequeue from _timeout_q to expired. In timer2 callback function, k_timer_start(&test_timer, K_MSEC(10), K_MSEC(20)) will re-insert timer1 to _timeout_q. After timer2 callback function, the expired sys_dlist(in _handle_expired_timeouts()) has changed. The callback of timer linked in the _timeout_q will be called in order. when run last timeout(_timeout_q which actually is not a timer structure),run timeout->func will trigger a fault.
label: 1.0
fault during my timer testing - hello I found a fault during my timer testing. zephyr version: 1.9.1 code: ``` _struct k_timer test_timer, test_timer2; static void test_timeout_event(os_timer *timer) { } static void test2_timeout_event(os_timer *timer) { k_timer_start(&test_timer, K_MSEC(10), K_MSEC(20)); } void test_timer(void) { k_timer_init(&test_timer, test_timeout_event, NULL); k_timer_init(&test_timer2, test2_timeout_event, NULL); k_timer_start(&test_timer, K_MSEC(10), K_MSEC(20)); while(1) { k_timer_start(&test_timer2, K_MSEC(100), 0); k_sleep(K_MSEC(1000)); } } ``` analysis: when timer1 & timer2 expired in the same tick, timer1 & timer2 will be dequeue from _timeout_q to expired. In timer2 callback function, k_timer_start(&test_timer, K_MSEC(10), K_MSEC(20)) will re-insert timer1 to _timeout_q. After timer2 callback function, the expired sys_dlist(in _handle_expired_timeouts()) has changed. The callback of timer linked in the _timeout_q will be called in order. when run last timeout(_timeout_q which actually is not a timer structure),run timeout->func will trigger a fault.
index: non_process
fault during my timer testing hello i found a fault during my timer testing zephyr version code struct k timer test timer test static void test timeout event os timer timer static void timeout event os timer timer k timer start test timer k msec k msec void test timer void k timer init test timer test timeout event null k timer init test timeout event null k timer start test timer k msec k msec while k timer start test k msec k sleep k msec analysis when expired in the same tick will be dequeue from timeout q to expired in callback function k timer start test timer k msec k msec will re insert to timeout q after callback function the expired sys dlist in handle expired timeouts has changed the callback of timer linked in the timeout q will be called in order when run last timeout timeout q which actually is not a timer structure ,run timeout func will trigger a fault
binary_label: 0
Unnamed: 0: 60,790
id: 17,023,522,706
type: IssuesEvent
created_at: 2021-07-03 02:27:38
repo: tomhughes/trac-tickets
repo_url: https://api.github.com/repos/tomhughes/trac-tickets
action: closed
title: XML not well formed in response to a reverse request (city district node)
labels: Component: nominatim Priority: major Resolution: fixed Type: defect
**[Submitted to the original trac issue database at 10.52am, Friday, 4th December 2009]** When the City District node is included in the response XML object this node appears with the name *<city district>''. If you run a parser on this object you encounter an error because the parser understands that the word ''district'' is an attribute without a value. It should be ''<city_district>'' as in ''<country_code>'' or ''<citydistrict>* without a space between the two words.
label: 1.0
XML not well formed in response to a reverse request (city district node) - **[Submitted to the original trac issue database at 10.52am, Friday, 4th December 2009]** When the City District node is included in the response XML object this node appears with the name *<city district>''. If you run a parser on this object you encounter an error because the parser understands that the word ''district'' is an attribute without a value. It should be ''<city_district>'' as in ''<country_code>'' or ''<citydistrict>* without a space between the two words.
index: non_process
xml not well formed in response to a reverse request city district node when the city district node is included in the response xml object this node appears with the name if you run a parser on this object you encounter an error because the parser understands that the word district is an attribute without a value it should be as in or without a space between the two words
binary_label: 0
Unnamed: 0: 2,443
id: 5,220,657,532
type: IssuesEvent
created_at: 2017-01-26 22:30:12
repo: vuejs/vue-loader
repo_url: https://api.github.com/repos/vuejs/vue-loader
action: closed
title: vue files do not work correctly when using multiple loaders
labels: pre-processor
The following is a simple loader I wrote to replace L("aaaas") inside vue files with the L("aaaas","talk") style. It is configured in webpack like this: ` { test: /\.vue$/, loader: 'vue-loader!lang-loader' } ` This loader is used when handling multiple languages. I found that lang-loader runs normally, and console.log tracing shows the string replacement also happens, but the replacement is missing from the final output; it seems vue-loader only processes the original file, rather than the output of the previous loader. The loader code is as follows: ` //This loader specifically handles the Region problem in multilingual files // It injects the Region variable into the translation calls in a file // e.g. L("text to translate") // If the above string is defined in a translation domain named iptalk, // it is converted to L("text to translate","iptalk") // var langConfig = require("../src/language/language.config.json") var loaderUtils = require('loader-utils'); var path = require("path") // L("dfsdfds") var langRegExp = new RegExp(/\bL\(\"(.*?)\"\)/g) //Return the Region a file belongs to based on its file name function getfileRegion(srcFile){ var resultRegion="main" for(region in langConfig.regions || {}){ var regPath=path.resolve(__dirname,"../src",langConfig.regions[region]) var ref=path.relative(regPath,srcFile) if(!(ref==srcFile || ref.slice(0,2)=="..")){ resultRegion=region break; } } return resultRegion } module.exports = function(source) { var query = loaderUtils.parseQuery(this.query); var region=getfileRegion(this.resourcePath) if(region!="main"){ source=source.replace(langRegExp,'L("$1","' + region + '")') } return source }; `
label: 1.0
vue files do not work correctly when using multiple loaders - The following is a simple loader I wrote to replace L("aaaas") inside vue files with the L("aaaas","talk") style. It is configured in webpack like this: ` { test: /\.vue$/, loader: 'vue-loader!lang-loader' } ` This loader is used when handling multiple languages. I found that lang-loader runs normally, and console.log tracing shows the string replacement also happens, but the replacement is missing from the final output; it seems vue-loader only processes the original file, rather than the output of the previous loader. The loader code is as follows: ` //This loader specifically handles the Region problem in multilingual files // It injects the Region variable into the translation calls in a file // e.g. L("text to translate") // If the above string is defined in a translation domain named iptalk, // it is converted to L("text to translate","iptalk") // var langConfig = require("../src/language/language.config.json") var loaderUtils = require('loader-utils'); var path = require("path") // L("dfsdfds") var langRegExp = new RegExp(/\bL\(\"(.*?)\"\)/g) //Return the Region a file belongs to based on its file name function getfileRegion(srcFile){ var resultRegion="main" for(region in langConfig.regions || {}){ var regPath=path.resolve(__dirname,"../src",langConfig.regions[region]) var ref=path.relative(regPath,srcFile) if(!(ref==srcFile || ref.slice(0,2)=="..")){ resultRegion=region break; } } return resultRegion } module.exports = function(source) { var query = loaderUtils.parseQuery(this.query); var region=getfileRegion(this.resourcePath) if(region!="main"){ source=source.replace(langRegExp,'L("$1","' + region + '")') } return source }; `
index: process
vue files do not work correctly when using multiple loaders the following is a simple loader i wrote to replace l aaaas in vue files with the l aaaas talk style it is configured in webpack like this test vue loader vue loader lang loader this loader is used when handling multiple languages i found that lang loader runs normally and console log tracing shows the string replacement also happens but the replacement is missing from the final output it seems vue loader only processes the original file rather than the output of the previous loader the loader code is as follows this loader specifically handles the region problem in multilingual files it injects the region variable into the translation calls in a file e g l text to translate if the above string is defined in a translation domain named iptalk it is converted to l text to translate iptalk var langconfig require src language language config json var loaderutils require loader utils var path require path l dfsdfds var langregexp new regexp bl g return the region a file belongs to based on its file name function getfileregion srcfile var resultregion main for region in langconfig regions var regpath path resolve dirname src langconfig regions var ref path relative regpath srcfile if ref srcfile ref slice resultregion region break return resultregion module exports function source var query loaderutils parsequery this query var region getfileregion this resourcepath if region main source source replace langregexp l region return source
binary_label: 1
Unnamed: 0: 17,487
id: 23,302,285,420
type: IssuesEvent
created_at: 2022-08-07 13:59:43
repo: Battle-s/battle-school-backend
repo_url: https://api.github.com/repos/Battle-s/battle-school-backend
action: opened
title: Infrastructure setup after creating a shared Google account and an AWS account
labels: setting :hammer: processing :hourglass_flowing_sand:
## Description > Write a description of the issue. It is also good to include the assignee. ## Checklist > List the conditions needed to close this issue as checkboxes. - [ ] Shared Google account - [ ] Shared AWS account - [ ] ec2 & rds setup ## References > Add any reference material needed to resolve this issue, if any. ## Related discussion > If there was any discussion about this issue, briefly summarize it.
label: 1.0
Infrastructure setup after creating a shared Google account and an AWS account - ## Description > Write a description of the issue. It is also good to include the assignee. ## Checklist > List the conditions needed to close this issue as checkboxes. - [ ] Shared Google account - [ ] Shared AWS account - [ ] ec2 & rds setup ## References > Add any reference material needed to resolve this issue, if any. ## Related discussion > If there was any discussion about this issue, briefly summarize it.
index: process
infrastructure setup after creating a shared google account and an aws account description write a description of the issue it is also good to include the assignee checklist list the conditions needed to close this issue as checkboxes shared google account shared aws account rds setup references add any reference material needed to resolve this issue if any related discussion if there was any discussion about this issue briefly summarize it
binary_label: 1
Unnamed: 0: 10,639
id: 13,446,133,690
type: IssuesEvent
created_at: 2020-09-08 12:31:31
repo: MHRA/products
repo_url: https://api.github.com/repos/MHRA/products
action: closed
title: PARs - Update an existing PAR
labels: EPIC - PARs process HIGH PRIORITY :arrow_double_up: STORY :book:
### User want As a Medical Writer in the licensing team I would like to upload a new version of an existing PAR so that the latest version is available on products.mhra.gov.uk (Linked to #397 and #398) ## Acceptance Criteria ### Customer acceptance criteria - [ ] Medical writers can find an existing PAR which they want to amend - [ ] Medical writers can upload a new version of that file, which will then be surfaced on products.mhra.gov.uk - [ ] Medical writers can input the new PAR information into form fields, which will then be surfaced on products.mhra.gov.uk - [ ] The PAR can be linked to one or multiple products (which will have their own PL number) - [ ] PAR has product name for each product - [ ] PAR has active substances for each product - [ ] PAR has a PLs / NR / THR for each product - [ ] Medical writers must be logged in to the site with their MHRA account to upload and will be blocked if they aren't - [ ] Medical writers who are not part of the [TBD - fill in when known] domain group, cannot upload - [ ] Before submitting, the user is shown a summary of what they are submitting - [ ] After submitting, the user is presented with: Author, date and time of submission. ### Technical acceptance criteria - [ ] PAR pdf is in blob storage after upload - [ ] PAR metadata is attached - [ ] PAR is in the search index - [ ] blob/metadata/search index is handled by doc-index-updater ### Data acceptance criteria ### Testing acceptance criteria **Size** XL **Value** **Effort** ### Exit Criteria met - [x] Backlog - [x] Discovery - [x] DUXD - [ ] Development - [ ] Quality Assurance - [ ] Release and Validate
label: 1.0
PARs - Update an existing PAR - ### User want As a Medical Writer in the licensing team I would like to upload a new version of an existing PAR so that the latest version is available on products.mhra.gov.uk (Linked to #397 and #398) ## Acceptance Criteria ### Customer acceptance criteria - [ ] Medical writers can find an existing PAR which they want to amend - [ ] Medical writers can upload a new version of that file, which will then be surfaced on products.mhra.gov.uk - [ ] Medical writers can input the new PAR information into form fields, which will then be surfaced on products.mhra.gov.uk - [ ] The PAR can be linked to one or multiple products (which will have their own PL number) - [ ] PAR has product name for each product - [ ] PAR has active substances for each product - [ ] PAR has a PLs / NR / THR for each product - [ ] Medical writers must be logged in to the site with their MHRA account to upload and will be blocked if they aren't - [ ] Medical writers who are not part of the [TBD - fill in when known] domain group, cannot upload - [ ] Before submitting, the user is shown a summary of what they are submitting - [ ] After submitting, the user is presented with: Author, date and time of submission. ### Technical acceptance criteria - [ ] PAR pdf is in blob storage after upload - [ ] PAR metadata is attached - [ ] PAR is in the search index - [ ] blob/metadata/search index is handled by doc-index-updater ### Data acceptance criteria ### Testing acceptance criteria **Size** XL **Value** **Effort** ### Exit Criteria met - [x] Backlog - [x] Discovery - [x] DUXD - [ ] Development - [ ] Quality Assurance - [ ] Release and Validate
index: process
pars update an existing par user want as a medical writer in the licensing team i would like to upload a new version of an existing par so that the latest version is available on products mhra gov uk linked to and acceptance criteria customer acceptance criteria medical writers can find an existing par which they want to amend medical writers can upload a new version of that file which will then be surfaced on products mhra gov uk medical writers can input the new par information into form fields which will then be surfaced on products mhra gov uk the par can be linked to one or multiple products which will have their own pl number par has product name for each product par has active substances for each product par has a pls nr thr for each product medical writers must be logged in to the site with their mhra account to upload and will be blocked if they aren t medical writers who are not part of the domain group cannot upload before submitting the user is shown a summary of what they are submitting after submitting the user is presented with author date and time of submission technical acceptance criteria par pdf is in blob storage after upload par metadata is attached par is in the search index blob metadata search index is handled by doc index updater data acceptance criteria testing acceptance criteria size xl value effort exit criteria met backlog discovery duxd development quality assurance release and validate
binary_label: 1
Unnamed: 0: 10,798
id: 13,609,285,992
type: IssuesEvent
created_at: 2020-09-23 04:49:50
repo: googleapis/java-bigtable
repo_url: https://api.github.com/repos/googleapis/java-bigtable
action: closed
title: Dependency Dashboard
labels: api: bigtable type: process
This issue contains a list of Renovate updates and their statuses. ## Open These updates have all been created already. Click a checkbox below to force a retry/rebase of any. - [ ] <!-- rebase-branch=renovate/com.google.cloud-google-cloud-conformance-tests-0.x -->deps: update dependency com.google.cloud:google-cloud-conformance-tests to v0.0.12 --- - [ ] <!-- manual job -->Check this box to trigger a request for Renovate to run again on this repository
label: 1.0
Dependency Dashboard - This issue contains a list of Renovate updates and their statuses. ## Open These updates have all been created already. Click a checkbox below to force a retry/rebase of any. - [ ] <!-- rebase-branch=renovate/com.google.cloud-google-cloud-conformance-tests-0.x -->deps: update dependency com.google.cloud:google-cloud-conformance-tests to v0.0.12 --- - [ ] <!-- manual job -->Check this box to trigger a request for Renovate to run again on this repository
index: process
dependency dashboard this issue contains a list of renovate updates and their statuses open these updates have all been created already click a checkbox below to force a retry rebase of any deps update dependency com google cloud google cloud conformance tests to check this box to trigger a request for renovate to run again on this repository
binary_label: 1
Unnamed: 0: 510,105
id: 14,784,996,341
type: IssuesEvent
created_at: 2021-01-12 01:36:48
repo: NCAR/GeoCAT
repo_url: https://api.github.com/repos/NCAR/GeoCAT
action: closed
title: fail to install GeoCAT-comp on windows 10
labels: bug enhancement medium priority support
Hi, I had succeed installed miniconda, but when input the command “conda create -n geocat -c conda-forge -c ncar geocat-comp”,it's show a error,the error information is: Collecting package metadata (current_repodata.json): done Solving environment: failed with repodata from current_repodata.json, will retry with next repodata source. Collecting package metadata (repodata.json): done Solving environment: | Found conflicts! Looking for incompatible packages. This can take several minutes. Press CTRL-C to abort. Examining @/win-64::__cuda==10.2=0: 50%|████████████████▌ | 1/2 [00:00<0Examining @/win-64::__cuda==10.2=0: 100%|████████████████████████████████failed UnsatisfiableError: The following specifications were found to be incompatible with your CUDA driver: - feature:/win-64::__cuda==10.2=0 Your installed CUDA driver is: 10.2
label: 1.0
fail to install GeoCAT-comp on windows 10 - Hi, I had succeed installed miniconda, but when input the command “conda create -n geocat -c conda-forge -c ncar geocat-comp”,it's show a error,the error information is: Collecting package metadata (current_repodata.json): done Solving environment: failed with repodata from current_repodata.json, will retry with next repodata source. Collecting package metadata (repodata.json): done Solving environment: | Found conflicts! Looking for incompatible packages. This can take several minutes. Press CTRL-C to abort. Examining @/win-64::__cuda==10.2=0: 50%|████████████████▌ | 1/2 [00:00<0Examining @/win-64::__cuda==10.2=0: 100%|████████████████████████████████failed UnsatisfiableError: The following specifications were found to be incompatible with your CUDA driver: - feature:/win-64::__cuda==10.2=0 Your installed CUDA driver is: 10.2
index: non_process
fail to install geocat comp on windows hi i had succeed installed miniconda but when input the command “conda create n geocat c conda forge c ncar geocat comp”,it s show a error,the error information is collecting package metadata current repodata json done solving environment failed with repodata from current repodata json will retry with next repodata source collecting package metadata repodata json done solving environment found conflicts looking for incompatible packages this can take several minutes press ctrl c to abort examining win cuda ████████████████▌ win cuda ████████████████████████████████failed unsatisfiableerror the following specifications were found to be incompatible with your cuda driver feature win cuda your installed cuda driver is
binary_label: 0
Unnamed: 0: 814,383
id: 30,505,514,840
type: IssuesEvent
created_at: 2023-07-18 16:35:01
repo: Automattic/woocommerce-payments
repo_url: https://api.github.com/repos/Automattic/woocommerce-payments
action: closed
title: WooPay shoppers blocked from order success page by a security wall
labels: type: bug priority: high impact: high component: WooPay GA Blocker
Actual: An authenticated WooPay shopper, on a shortcode checkout site, when successfully placing an order is frequently presented with a security screen and must either login to the site or verify their email address to access the order thank you page. Expectation: Under no circumstance should a WooPay user who places an order be presented with a security screen in order to access the thank you page on an order they just placed. ![image](https://github.com/Automattic/woocommerce-payments/assets/109928953/0538c419-95f4-41d5-a3ba-49adc2f787b8) Cross posted to https://github.com/Automattic/woopay/issues/1951, the primary ticket for this issue. Duplicated here for extra visibility and urgency.
label: 1.0
WooPay shoppers blocked from order success page by a security wall - Actual: An authenticated WooPay shopper, on a shortcode checkout site, when successfully placing an order is frequently presented with a security screen and must either login to the site or verify their email address to access the order thank you page. Expectation: Under no circumstance should a WooPay user who places an order be presented with a security screen in order to access the thank you page on an order they just placed. ![image](https://github.com/Automattic/woocommerce-payments/assets/109928953/0538c419-95f4-41d5-a3ba-49adc2f787b8) Cross posted to https://github.com/Automattic/woopay/issues/1951, the primary ticket for this issue. Duplicated here for extra visibility and urgency.
non_process
woopay shoppers blocked from order success page by a security wall actual an authenticated woopay shopper on a shortcode checkout site when successfully placing an order is frequently presented with a security screen and must either login to the site or verify their email address to access the order thank you page expectation under no circumstance should a woopay user who places an order be presented with a security screen in order to access the thank you page on an order they just placed cross posted to the primary ticket for this issue duplicated here for extra visibility and urgency
0
21,446
29,478,711,157
IssuesEvent
2023-06-02 02:14:10
cypress-io/cypress
https://api.github.com/repos/cypress-io/cypress
closed
Update cypress docker images automatically on release
external: docker process: release stage: ready for work type: user experience stale
<!-- Is this a question? Don't open an issue. Ask in our chat https://on.cypress.io/chat --> ### Current behavior: Cypress docker images are behind the cypress release See the tags for `cypress/included`: https://hub.docker.com/r/cypress/included/tags Currently the latest is `3.7.0`, but `3.8.0` was released yesterday <!-- images, stack traces, etc --> ### Desired behavior: An automated build should be setup to build the new docker images as soon as a release is cut <!-- A clear concise description of what you want to happen --> <!-- ### Steps to reproduce: (app code and test code) <!-- Issues without reproducible steps WILL BE CLOSED --> <!-- You can fork https://github.com/cypress-io/cypress-test-tiny repo, set up a failing test, then tell us the repo/branch to try. --> ### Versions 3.8.0 <!-- Cypress, operating system, browser -->
1.0
Update cypress docker images automatically on release - <!-- Is this a question? Don't open an issue. Ask in our chat https://on.cypress.io/chat --> ### Current behavior: Cypress docker images are behind the cypress release See the tags for `cypress/included`: https://hub.docker.com/r/cypress/included/tags Currently the latest is `3.7.0`, but `3.8.0` was released yesterday <!-- images, stack traces, etc --> ### Desired behavior: An automated build should be setup to build the new docker images as soon as a release is cut <!-- A clear concise description of what you want to happen --> <!-- ### Steps to reproduce: (app code and test code) <!-- Issues without reproducible steps WILL BE CLOSED --> <!-- You can fork https://github.com/cypress-io/cypress-test-tiny repo, set up a failing test, then tell us the repo/branch to try. --> ### Versions 3.8.0 <!-- Cypress, operating system, browser -->
process
update cypress docker images automatically on release current behavior cypress docker images are behind the cypress release see the tags for cypress included currently the latest is but was released yesterday desired behavior an automated build should be setup to build the new docker images as soon as a release is cut steps to reproduce app code and test code versions
1
38
2,507,191,748
IssuesEvent
2015-01-12 16:41:03
GsDevKit/gsApplicationTools
https://api.github.com/repos/GsDevKit/gsApplicationTools
closed
GemServer>>doBasicTransaction: must be non-re-entrant
in process
After a discussion with @rjsargent, I've decided that the conflicting goals of 1. running production applications in `manual transaction mode`. 2. allowing folks to debug *gem servers* in `automatic transaction mode`. 3. allowing GemServer>>doBasicTransaction: to be re-entrant. cannot be achieved cleanly. The basic problem is that it isn't possible to tell when it is **correct** to abort/commit when running in `automatic transaction mode`. Of course, I also don't have a strong case fr needing re-entrant GemServer>>doBasicTransaction: calls, so for now they will be non-re-entrant.
1.0
GemServer>>doBasicTransaction: must be non-re-entrant - After a discussion with @rjsargent, I've decided that the conflicting goals of 1. running production applications in `manual transaction mode`. 2. allowing folks to debug *gem servers* in `automatic transaction mode`. 3. allowing GemServer>>doBasicTransaction: to be re-entrant. cannot be achieved cleanly. The basic problem is that it isn't possible to tell when it is **correct** to abort/commit when running in `automatic transaction mode`. Of course, I also don't have a strong case fr needing re-entrant GemServer>>doBasicTransaction: calls, so for now they will be non-re-entrant.
process
gemserver dobasictransaction must be non re entrant after a discussion with rjsargent i ve decided that the conflicting goals of running production applications in manual transaction mode allowing folks to debug gem servers in automatic transaction mode allowing gemserver dobasictransaction to be re entrant cannot be achieved cleanly the basic problem is that it isn t possible to tell when it is correct to abort commit when running in automatic transaction mode of course i also don t have a strong case fr needing re entrant gemserver dobasictransaction calls so for now they will be non re entrant
1
797
3,275,514,340
IssuesEvent
2015-10-26 15:51:13
grafeo/grafeo
https://api.github.com/repos/grafeo/grafeo
opened
Drawing Functions: Line
Component: Image Processing priority: high
- Input Array - Point1 - Point2 - Color - Thickness - LineType: 4-neighbor, 8-neighbor or antialiased - Shift: Fractional shift bits
1.0
Drawing Functions: Line - - Input Array - Point1 - Point2 - Color - Thickness - LineType: 4-neighbor, 8-neighbor or antialiased - Shift: Fractional shift bits
process
drawing functions line input array color thickness linetype neighbor neighbor or antialiased shift fractional shift bits
1
69,313
14,988,006,185
IssuesEvent
2021-01-29 00:11:35
oragono/oragono
https://api.github.com/repos/oragono/oragono
opened
account verification flows allowing a captcha?
help wanted security weird fun stuff
It's already possible to implement email and captcha verification externally: build a webapp that can dispatch verification emails, display and verify a captcha, then finally SAREGISTER an account with the ircd on success. But we might want to have better code support for it within Oragono itself. For example, we could let people initiate the registration process in-band, then the external webapp would check the captcha and do a `SAVERIFY` or something. Or we could serve captchas straight out of Oragono itself.
True
account verification flows allowing a captcha? - It's already possible to implement email and captcha verification externally: build a webapp that can dispatch verification emails, display and verify a captcha, then finally SAREGISTER an account with the ircd on success. But we might want to have better code support for it within Oragono itself. For example, we could let people initiate the registration process in-band, then the external webapp would check the captcha and do a `SAVERIFY` or something. Or we could serve captchas straight out of Oragono itself.
non_process
account verification flows allowing a captcha it s already possible to implement email and captcha verification externally build a webapp that can dispatch verification emails display and verify a captcha then finally saregister an account with the ircd on success but we might want to have better code support for it within oragono itself for example we could let people initiate the registration process in band then the external webapp would check the captcha and do a saverify or something or we could serve captchas straight out of oragono itself
0
28,966
4,454,742,602
IssuesEvent
2016-08-23 02:33:59
mautic/mautic
https://api.github.com/repos/mautic/mautic
closed
Failed e-mail appears as sent in contact timeline
Bug Ready To Test
What type of report is this: | Q | A | ---| --- | Bug report? | yes | Feature request? | | Enhancement? | ## Description: Failed e-mails to a contact appear in the timeline as sent. I encountered that behaviour as I was sending a manual e-mail to a contact. I made the test with Amazon SES where you have to configure your allowed sending domains. The SMTP was dropping my mails and I think Mautic does not handle that error correctly. ## If a bug: | Q | A | --- | --- | Mautic version | 2.1.0 | PHP version | 5.6.23-1+deprecated+dontuse+deb.sury.org~trusty+1 ### Steps to reproduce: 1. Configure a mail transport that will reject certain sending domains (like Amazon SES) 1. Send a manual e-mail to that contact from the contact detail page 1. Process the mail queue from the cli (php app/console mautic:emails:send) 1. Check the contact page. The lead is marked as bounced (which is wrong, because the sending domain was wrong, not the contact). The email sent to the customer shows up as "sent", which give you the feeling that the contact received it. ![screen shot 2016-08-18 at 09 46 09](https://cloud.githubusercontent.com/assets/1575445/17766098/aec454d0-6529-11e6-810e-b098b0dfa321.png) ### Log errors: The Swift_TransportException bubbles up from the mautic:emails:send command ```` root@web001:/var/www/playground# php app/console mautic:emails:send [Swift_TransportException] Expected response code 250 but got code "554", with message "554 Message rejected: Email address is not verified. The following identities failed the check in region EU-WEST-1: 55hubs Administrator <me@nope.com>, me@nope.com " mautic:emails:send [--message-limit [MESSAGE-LIMIT]] [--time-limit [TIME-LIMIT]] [--do-not-clear] [--recover-timeout [RECOVER-TIMEOUT]] [--clear-timeout [CLEAR-TIMEOUT]] [-h|--help] [-q|--quiet] [-v|vv|vvv|--verbose] [-V|--version] [--ansi] [--no-ansi] [-n|--no-interaction] [-s|--shell] [--process-isolation] [-e|--env ENV] [--no-debug] [--] <command> ````
1.0
Failed e-mail appears as sent in contact timeline - What type of report is this: | Q | A | ---| --- | Bug report? | yes | Feature request? | | Enhancement? | ## Description: Failed e-mails to a contact appear in the timeline as sent. I encountered that behaviour as I was sending a manual e-mail to a contact. I made the test with Amazon SES where you have to configure your allowed sending domains. The SMTP was dropping my mails and I think Mautic does not handle that error correctly. ## If a bug: | Q | A | --- | --- | Mautic version | 2.1.0 | PHP version | 5.6.23-1+deprecated+dontuse+deb.sury.org~trusty+1 ### Steps to reproduce: 1. Configure a mail transport that will reject certain sending domains (like Amazon SES) 1. Send a manual e-mail to that contact from the contact detail page 1. Process the mail queue from the cli (php app/console mautic:emails:send) 1. Check the contact page. The lead is marked as bounced (which is wrong, because the sending domain was wrong, not the contact). The email sent to the customer shows up as "sent", which give you the feeling that the contact received it. ![screen shot 2016-08-18 at 09 46 09](https://cloud.githubusercontent.com/assets/1575445/17766098/aec454d0-6529-11e6-810e-b098b0dfa321.png) ### Log errors: The Swift_TransportException bubbles up from the mautic:emails:send command ```` root@web001:/var/www/playground# php app/console mautic:emails:send [Swift_TransportException] Expected response code 250 but got code "554", with message "554 Message rejected: Email address is not verified. The following identities failed the check in region EU-WEST-1: 55hubs Administrator <me@nope.com>, me@nope.com " mautic:emails:send [--message-limit [MESSAGE-LIMIT]] [--time-limit [TIME-LIMIT]] [--do-not-clear] [--recover-timeout [RECOVER-TIMEOUT]] [--clear-timeout [CLEAR-TIMEOUT]] [-h|--help] [-q|--quiet] [-v|vv|vvv|--verbose] [-V|--version] [--ansi] [--no-ansi] [-n|--no-interaction] [-s|--shell] [--process-isolation] [-e|--env ENV] [--no-debug] [--] <command> ````
non_process
failed e mail appears as sent in contact timeline what type of report is this q a bug report yes feature request enhancement description failed e mails to a contact appear in the timeline as sent i encountered that behaviour as i was sending a manual e mail to a contact i made the test with amazon ses where you have to configure your allowed sending domains the smtp was dropping my mails and i think mautic does not handle that error correctly if a bug q a mautic version php version deprecated dontuse deb sury org trusty steps to reproduce configure a mail transport that will reject certain sending domains like amazon ses send a manual e mail to that contact from the contact detail page process the mail queue from the cli php app console mautic emails send check the contact page the lead is marked as bounced which is wrong because the sending domain was wrong not the contact the email sent to the customer shows up as sent which give you the feeling that the contact received it log errors the swift transportexception bubbles up from the mautic emails send command root var www playground php app console mautic emails send expected response code but got code with message message rejected email address is not verified the following identities failed the check in region eu west administrator me nope com mautic emails send
0
9,752
12,737,085,857
IssuesEvent
2020-06-25 18:05:49
dotnet/runtime
https://api.github.com/repos/dotnet/runtime
closed
Why System.Diagnostics.Process.MainWindowTitle returns empty string on Linux?
area-System.Diagnostics.Process os-linux question
System.Diagnostics.Process.MainWindowTitle is empty for all processes on Linux, even when the application has root rights. Tested on Ubuntu 19.10 64bit with .NET Core 3.1 console application. ## Repro `sudo ./ConsoleApp2` ``` class Program { static void Main(string[] args) { var procs = Process.GetProcesses(); foreach (var proc in procs) { Console.WriteLine($"Process: {proc.ProcessName}, MainWindowTitle: {(string.IsNullOrEmpty(proc.MainWindowTitle) ? "<EMPTY>" : proc.MainWindowTitle)}"); } } } ``` ## Result: ![image](https://user-images.githubusercontent.com/14368203/73660921-0e292080-4691-11ea-9531-d6d2bf0659df.png)
1.0
Why System.Diagnostics.Process.MainWindowTitle returns empty string on Linux? - System.Diagnostics.Process.MainWindowTitle is empty for all processes on Linux, even when the application has root rights. Tested on Ubuntu 19.10 64bit with .NET Core 3.1 console application. ## Repro `sudo ./ConsoleApp2` ``` class Program { static void Main(string[] args) { var procs = Process.GetProcesses(); foreach (var proc in procs) { Console.WriteLine($"Process: {proc.ProcessName}, MainWindowTitle: {(string.IsNullOrEmpty(proc.MainWindowTitle) ? "<EMPTY>" : proc.MainWindowTitle)}"); } } } ``` ## Result: ![image](https://user-images.githubusercontent.com/14368203/73660921-0e292080-4691-11ea-9531-d6d2bf0659df.png)
process
why system diagnostics process mainwindowtitle returns empty string on linux system diagnostics process mainwindowtitle is empty for all processes on linux even when the application has root rights tested on ubuntu with net core console application repro sudo class program static void main string args var procs process getprocesses foreach var proc in procs console writeline process proc processname mainwindowtitle string isnullorempty proc mainwindowtitle proc mainwindowtitle result
1
272,876
20,761,837,264
IssuesEvent
2022-03-15 16:49:12
spring-cloud/spring-cloud-release
https://api.github.com/repos/spring-cloud/spring-cloud-release
closed
Add release documentation to combined Spring Cloud documentation
documentation
The `org.springframework.cloud:spring-cloud-dependencies` POM seems to serve the same purpose as the `io.spring.platform:platform-bom` POM (assisting with ensuring compatible transitive dependencies). I can't find any information on which versions of one work with which versions of the other. Are they meant to be used together or are they not necessarily compatible? I see in `org.springframework.cloud:spring-cloud-dependencies` that there are a lot of exclusions and most of them look like things that are in `io.spring.platform:platform-bom` but I'm not sure if this is a "best effort" to make them work together or if it's intentional to make sure they do. This isn't really an issue with the code (maybe the documentation though), but I didn't think this would get a good answer on Stack Overflow so I thought this was the best place to ask.
1.0
Add release documentation to combined Spring Cloud documentation - The `org.springframework.cloud:spring-cloud-dependencies` POM seems to serve the same purpose as the `io.spring.platform:platform-bom` POM (assisting with ensuring compatible transitive dependencies). I can't find any information on which versions of one work with which versions of the other. Are they meant to be used together or are they not necessarily compatible? I see in `org.springframework.cloud:spring-cloud-dependencies` that there are a lot of exclusions and most of them look like things that are in `io.spring.platform:platform-bom` but I'm not sure if this is a "best effort" to make them work together or if it's intentional to make sure they do. This isn't really an issue with the code (maybe the documentation though), but I didn't think this would get a good answer on Stack Overflow so I thought this was the best place to ask.
non_process
add release documentation to combined spring cloud documentation the org springframework cloud spring cloud dependencies pom seems to serve the same purpose as the io spring platform platform bom pom assisting with ensuring compatible transitive dependencies i can t find any information on which versions of one work with which versions of the other are they meant to be used together or are they not necessarily compatible i see in org springframework cloud spring cloud dependencies that there are a lot of exclusions and most of them look like things that are in io spring platform platform bom but i m not sure if this is a best effort to make them work together or if it s intentional to make sure they do this isn t really an issue with the code maybe the documentation though but i didn t think this would get a good answer on stack overflow so i thought this was the best place to ask
0
6,517
9,604,787,884
IssuesEvent
2019-05-10 21:07:23
MicrosoftDocs/azure-docs
https://api.github.com/repos/MicrosoftDocs/azure-docs
closed
How to retrain the model
assigned-to-author machine-learning/svc product-question team-data-science-process/subsvc triaged
If I update a dataset from my Python script as described here, can I also trigger the model to retrain based on the updated data from my Python script? --- #### Document Details ⚠ *Do not edit this section. It is required for docs.microsoft.com ➟ GitHub issue linking.* * ID: c798198a-6a22-41a3-5bdc-5e4e06446a9b * Version Independent ID: 3fb2d7ca-27df-87a8-9fdf-99245438f2e9 * Content: [Access datasets with Python client library - Team Data Science Process](https://docs.microsoft.com/en-us/azure/machine-learning/team-data-science-process/python-data-access) * Content Source: [articles/machine-learning/team-data-science-process/python-data-access.md](https://github.com/Microsoft/azure-docs/blob/master/articles/machine-learning/team-data-science-process/python-data-access.md) * Service: **machine-learning** * Sub-service: **team-data-science-process** * GitHub Login: @marktab * Microsoft Alias: **tdsp**
1.0
How to retrain the model - If I update a dataset from my Python script as described here, can I also trigger the model to retrain based on the updated data from my Python script? --- #### Document Details ⚠ *Do not edit this section. It is required for docs.microsoft.com ➟ GitHub issue linking.* * ID: c798198a-6a22-41a3-5bdc-5e4e06446a9b * Version Independent ID: 3fb2d7ca-27df-87a8-9fdf-99245438f2e9 * Content: [Access datasets with Python client library - Team Data Science Process](https://docs.microsoft.com/en-us/azure/machine-learning/team-data-science-process/python-data-access) * Content Source: [articles/machine-learning/team-data-science-process/python-data-access.md](https://github.com/Microsoft/azure-docs/blob/master/articles/machine-learning/team-data-science-process/python-data-access.md) * Service: **machine-learning** * Sub-service: **team-data-science-process** * GitHub Login: @marktab * Microsoft Alias: **tdsp**
process
how to retrain the model if i update a dataset from my python script as described here can i also trigger the model to retrain based on the updated data from my python script document details ⚠ do not edit this section it is required for docs microsoft com ➟ github issue linking id version independent id content content source service machine learning sub service team data science process github login marktab microsoft alias tdsp
1
348,822
24,924,403,818
IssuesEvent
2022-10-31 05:33:00
Dun-sin/Code-Magic
https://api.github.com/repos/Dun-sin/Code-Magic
closed
[DOCS] add text shadow to the readme
documentation good first issue EddieHub:good-first-issue assigned hacktoberfest
### Description All the current generators are on the readme except text shadow. ### Screenshots _No response_ ### Additional information Tasks: - [ ] Add a short description of the text shadow generator in the readme ### 👀 Have you checked if this issue has been raised before? - [X] I checked and didn't find similar issue ### 🏢 Have you read the Contributing Guidelines? - [X] I have read and understood the rules in the [Contributing Guidelines](https://github.com/Dun-sin/Code-Magic/blob/main/CONTRIBUTING.md) ### Are you willing to work on this issue ? _No response_
1.0
[DOCS] add text shadow to the readme - ### Description All the current generators are on the readme except text shadow. ### Screenshots _No response_ ### Additional information Tasks: - [ ] Add a short description of the text shadow generator in the readme ### 👀 Have you checked if this issue has been raised before? - [X] I checked and didn't find similar issue ### 🏢 Have you read the Contributing Guidelines? - [X] I have read and understood the rules in the [Contributing Guidelines](https://github.com/Dun-sin/Code-Magic/blob/main/CONTRIBUTING.md) ### Are you willing to work on this issue ? _No response_
non_process
add text shadow to the readme description all the current generators are on the readme except text shadow screenshots no response additional information tasks add a short description of the text shadow generator in the readme 👀 have you checked if this issue has been raised before i checked and didn t find similar issue 🏢 have you read the contributing guidelines i have read and understood the rules in the are you willing to work on this issue no response
0
52,242
10,790,734,900
IssuesEvent
2019-11-05 15:28:52
eclipse-theia/theia
https://api.github.com/repos/eclipse-theia/theia
opened
[vscode] builtin extensions not recognized
plug-in system vscode
**Description** I've created a custom VS Code [`plugin`](https://github.com/vince-fugnitto/ts-tools-plugin) which simply verifies that the `vscode-builtin-typescript-language-features` builtin extension correctly works. The plugin itself works perfectly in VS Code while in Theia it fails to find the extension. **Screenshots** _VS Code:_ <div align='center'> ![image](https://user-images.githubusercontent.com/40359487/68216550-0b922e80-ffaf-11e9-81cc-e01d3a309d33.png) </div> _Theia:_ <div align='center'> ![image](https://user-images.githubusercontent.com/40359487/68216623-28c6fd00-ffaf-11e9-873f-1870f6370933.png) </div> **Setup** The following updates were made to the example-browser [package.json](https://github.com/eclipse-theia/theia/blob/master/examples/browser/package.json): _Additions:_ ```json "@theia/vscode-builtin-typescript": "0.2.1", "@theia/vscode-builtin-typescript-language-features": "0.2.1", ``` _Deletions:_ ```json "@theia/typescript": "^0.12.0" ``` --- Am I doing something wrong when attempting to consume builtin extensions in Theia?
1.0
[vscode] builtin extensions not recognized - **Description** I've created a custom VS Code [`plugin`](https://github.com/vince-fugnitto/ts-tools-plugin) which simply verifies that the `vscode-builtin-typescript-language-features` builtin extension correctly works. The plugin itself works perfectly in VS Code while in Theia it fails to find the extension. **Screenshots** _VS Code:_ <div align='center'> ![image](https://user-images.githubusercontent.com/40359487/68216550-0b922e80-ffaf-11e9-81cc-e01d3a309d33.png) </div> _Theia:_ <div align='center'> ![image](https://user-images.githubusercontent.com/40359487/68216623-28c6fd00-ffaf-11e9-873f-1870f6370933.png) </div> **Setup** The following updates were made to the example-browser [package.json](https://github.com/eclipse-theia/theia/blob/master/examples/browser/package.json): _Additions:_ ```json "@theia/vscode-builtin-typescript": "0.2.1", "@theia/vscode-builtin-typescript-language-features": "0.2.1", ``` _Deletions:_ ```json "@theia/typescript": "^0.12.0" ``` --- Am I doing something wrong when attempting to consume builtin extensions in Theia?
non_process
builtin extensions not recognized description i ve created a custom vs code which simply verifies that the vscode builtin typescript language features builtin extension correctly works the plugin itself works perfectly in vs code while in theia it fails to find the extension screenshots vs code theia setup the following updates were made to the example browser additions json theia vscode builtin typescript theia vscode builtin typescript language features deletions json theia typescript am i doing something wrong when attempting to consume builtin extensions in theia
0
436,048
12,544,557,094
IssuesEvent
2020-06-05 17:24:07
mintproject/mic
https://api.github.com/repos/mintproject/mic
closed
Detection outputs must ignore model configuration files
bug easy to fix enhancement medium priority
In my test with one input file, mic is constantly adding it as an output file after the execution, which is not correct.
1.0
Detection outputs must ignore model configuration files - In my test with one input file, mic is constantly adding it as an output file after the execution, which is not correct.
non_process
detection outputs must ignore model configuration files in my test with one input file mic is constantly adding it as an output file after the execution which is not correct
0
16,874
22,154,176,927
IssuesEvent
2022-06-03 20:20:40
0xffset/rOSt
https://api.github.com/repos/0xffset/rOSt
opened
InterProcess Communication
syscalls processes driver
We need to have some way of doing IPC. We can have it synchronous or asynchronous, or both. We can have it in the RPC style, or message-passing, or something else. There are many options and we will have to look around and select the best one.
1.0
InterProcess Communication - We need to have some way of doing IPC. We can have it synchronous or asynchronous, or both. We can have it in the RPC style, or message-passing, or something else. There are many options and we will have to look around and select the best one.
process
interprocess communication we need to have some way of doing ipc we can have it synchronous or asynchronous or both we can have it in the rpc style or message passing or something else there are many options and we will have to look around and select the best one
1
93,353
19,184,791,522
IssuesEvent
2021-12-05 01:57:07
CSC207-UofT/course-project-group-010
https://api.github.com/repos/CSC207-UofT/course-project-group-010
closed
Misplaced rating value bound check
code smell
Currently, whether a user-provided rating value is in-bounds is checked in [CourseManager](https://github.com/CSC207-UofT/course-project-group-010/blob/6e75d460d87626a94aa4c54594e901fa8b586628/src/main/java/usecase/CourseManager.java) (lines 59-61). This feels like a violation of Clean Architecture principles: why should CourseManager care about how Rating is implemented? **Suggested solution:** 1. CourseManager calls Rating constructor with parsed user rating value **in a try-except block**. 2. Check in-bounds condition in Rating constructor. a. If in-bounds, create Rating object normally. b. If out-of-bounds, throw an exception. 3. CourseManager catches the exception if one is thrown and rethrows it up to the command line. Otherwise, proceed normally.
1.0
Misplaced rating value bound check - Currently, whether a user-provided rating value is in-bounds is checked in [CourseManager](https://github.com/CSC207-UofT/course-project-group-010/blob/6e75d460d87626a94aa4c54594e901fa8b586628/src/main/java/usecase/CourseManager.java) (lines 59-61). This feels like a violation of Clean Architecture principles: why should CourseManager care about how Rating is implemented? **Suggested solution:** 1. CourseManager calls Rating constructor with parsed user rating value **in a try-except block**. 2. Check in-bounds condition in Rating constructor. a. If in-bounds, create Rating object normally. b. If out-of-bounds, throw an exception. 3. CourseManager catches the exception if one is thrown and rethrows it up to the command line. Otherwise, proceed normally.
non_process
misplaced rating value bound check currently whether a user provided rating value is in bounds is checked in lines this feels like a violation of clean architecture principles why should coursemanager care about how rating is implemented suggested solution coursemanager calls rating constructor with parsed user rating value in a try except block check in bounds condition in rating constructor a if in bounds create rating object normally b if out of bounds throw an exception coursemanager catches the exception if one is thrown and rethrows it up to the command line otherwise proceed normally
0
16,261
20,841,293,609
IssuesEvent
2022-03-21 00:14:25
duxli/duxli-css
https://api.github.com/repos/duxli/duxli-css
closed
Process Improvement: Add Formatter
process
# Process Improvement: Add Formatter A formatter is needed for improved code consistency and readability. We will need formatting for Sass/SCSS, JavaScript/TypeScript, HTML, and Markdown files. Rules that focus on code quality do not need to be a part of this issue. ## Potential Solutions There are tools like [Stylelint](https://stylelint.io/) or [ESLint](https://eslint.org/) that have more features. However, most of these solutions are only intended for a few languages. They also require more configuration. ## Proposed Solution [Prettier](https://prettier.io/docs/en/index.html) supports basic formatting for all languages that will be included in this project. It also requires minimal configuration and setup.
1.0
Process Improvement: Add Formatter - # Process Improvement: Add Formatter A formatter is needed for improved code consistency and readability. We will need formatting for Sass/SCSS, JavaScript/TypeScript, HTML, and Markdown files. Rules that focus on code quality do not need to be a part of this issue. ## Potential Solutions There are tools like [Stylelint](https://stylelint.io/) or [ESLint](https://eslint.org/) that have more features. However, most of these solutions are only intended for a few languages. They also require more configuration. ## Proposed Solution [Prettier](https://prettier.io/docs/en/index.html) supports basic formatting for all languages that will be included in this project. It also requires minimal configuration and setup.
process
process improvement add formatter process improvement add formatter a formatter is needed for improved code consistency and readability we will need formatting for sass scss javascript typescript html and markdown files rules that focus on code quality do not need to be a part of this issue potential solutions there are tools like or that have more features however most of these solutions are only intended for a few languages they also require more configuration proposed solution supports basic formatting for all languages that will be included in this project it also requires minimal configuration and setup
1
19,530
25,841,229,914
IssuesEvent
2022-12-13 00:38:12
devssa/onde-codar-em-salvador
https://api.github.com/repos/devssa/onde-codar-em-salvador
closed
CHEFE DE TI na [IESPsicologia]
SALVADOR INFRAESTRUTURA BANCO DE DADOS SERVIDOR PROCESSOS BACKUP HELP WANTED Stale
# CHEFE DE TI (Head of IT) 1. **Education:** Completed bachelor's degree (Information Systems or Computer Engineering). 2. **Responsibilities:** Manage the IT department's activities, covering hardware, systems, databases, servers, internet, backups, and process implementation. Manage the department's team and work processes. 3. **Prerequisites:** Experience in the role. Leadership experience. Availability to travel on demand. 4. **Salary:** R$4.000,00 + benefits. 5. **Location:** Salvador/BA. > Interested candidates should send their résumé to andrecoutinho@iespsicologia.com.br or trabalho@iespsicologia.com.br, stating the job title "CHEFE DE TI" in the e-mail subject.
1.0
CHEFE DE TI at [IESPsicologia] - # CHEFE DE TI (Head of IT) 1. **Education:** Completed bachelor's degree (Information Systems or Computer Engineering). 2. **Responsibilities:** Manage the IT department's activities, covering hardware, systems, databases, servers, internet, backups, and process implementation. Manage the department's team and work processes. 3. **Prerequisites:** Experience in the role. Leadership experience. Availability to travel on demand. 4. **Salary:** R$4.000,00 + benefits. 5. **Location:** Salvador/BA. > Interested candidates should send their résumé to andrecoutinho@iespsicologia.com.br or trabalho@iespsicologia.com.br, stating the job title "CHEFE DE TI" in the e-mail subject.
process
chefe de ti na chefe de ti escolaridade superior completo sistemas de informação ou engenharia da computação atribuições gerenciar as atividades da área de ti envolvendo hardware sistemas banco de dados servidores internet back up e implantação de processos gerir equipe e processos de trabalho da área pré requisitos experiência na função experiência com liderança disponibilidade para viagem por demanda salário r benefícios local salvador ba os interessados deverão encaminhar o currículo para o endereço andrecoutinho iespsicologia com br ou trabalho iespsicologia com br informando o nome do cargo no título do e mail chefe de ti
1
132,356
18,714,978,246
IssuesEvent
2021-11-03 02:28:31
department-of-veterans-affairs/va.gov-cms
https://api.github.com/repos/department-of-veterans-affairs/va.gov-cms
opened
MVP editorial experience for billing and insurance, register for care, medical records pages, and non-clinical facility services
Design Epic Content governance Content forms Needs refining UX writing
## Background ### User Story or Problem Statement This is a subepic for creating an editorial experience for PAOs to manage Top Task pages. There are a few known issues, and a few unknown opportunities that require some design discovery and thinking. Some of the issues in this epic block the rollout of these pages. ### Affected users and stakeholders * Editors at VAMCs ## Design principles Veteran-centered - [ ] `Single source of truth`: Increase reliability and consistency of content on VA.gov by providing a single source of truth. - [ ] `Accessible, plain language`: Provide guardrails and guidelines to ensure content quality. - [x] `Purposely structured content`: Ensure Content API can deliver content whose meaning matches its structure. - [x] `Content lifecycle governance`: Produce tools, processes and policies to maintain content quality throughout its lifecycle. Editor-centered - [ ] `Purpose-driven`: Create an opportunity to involve the editor community in VA’s mission and content strategy goals. - [x] `Efficient`: Remove distractions and create clear, straightforward paths to get the job done. - [ ] `Approachable`: Offer friendly guidance over authoritative instruction. - [x] `Consistent`: Reduce users’ mental load by allowing them to fall back on pattern recognition to complete tasks. - [x] `Empowering`: Provide clear information to help editors make decisions about their work. ### CMS Team Please leave only the team that will do this work selected. If you're not sure, it's fine to leave both selected. - [ ] `Platform CMS Team` - [x] `Sitewide CMS Team`
1.0
MVP editorial experience for billing and insurance, register for care, medical records pages, and non-clinical facility services - ## Background ### User Story or Problem Statement This is a subepic for creating an editorial experience for PAOs to manage Top Task pages. There are a few known issues, and a few unknown opportunities that require some design discovery and thinking. Some of the issues in this epic block the rollout of these pages. ### Affected users and stakeholders * Editors at VAMCs ## Design principles Veteran-centered - [ ] `Single source of truth`: Increase reliability and consistency of content on VA.gov by providing a single source of truth. - [ ] `Accessible, plain language`: Provide guardrails and guidelines to ensure content quality. - [x] `Purposely structured content`: Ensure Content API can deliver content whose meaning matches its structure. - [x] `Content lifecycle governance`: Produce tools, processes and policies to maintain content quality throughout its lifecycle. Editor-centered - [ ] `Purpose-driven`: Create an opportunity to involve the editor community in VA’s mission and content strategy goals. - [x] `Efficient`: Remove distractions and create clear, straightforward paths to get the job done. - [ ] `Approachable`: Offer friendly guidance over authoritative instruction. - [x] `Consistent`: Reduce users’ mental load by allowing them to fall back on pattern recognition to complete tasks. - [x] `Empowering`: Provide clear information to help editors make decisions about their work. ### CMS Team Please leave only the team that will do this work selected. If you're not sure, it's fine to leave both selected. - [ ] `Platform CMS Team` - [x] `Sitewide CMS Team`
non_process
mvp editorial experience for billing and insurance register for care medical records pages and non clinical facility services background user story or problem statement this is a subepic for creating an editorial experience for paos to manage top task pages there are a few known issues and a few unknown opportunities that require some design discovery and thinking some of the issues in this epic block roll out of these pages affected users and stakeholders editors at vamcs design principles veteran centered single source of truth increase reliability and consistency of content on va gov by providing a single source of truth accessible plain language provide guardrails and guidelines to ensure content quality purposely structured content ensure content api can deliver content whose meaning matches its structure content lifecycle governance produce tools processes and policies to maintain content quality throughout its lifecycle editor centered purpose driven create an opportunity to involve the editor community in va’s mission and content strategy goals efficient remove distractions and create clear straightforward paths to get the job done approachable offer friendly guidance over authoritative instruction consistent reduce user’s mental load by allowing them to fall back on pattern recognition to complete tasks empowering provide clear information to help editors make decisions about their work cms team please leave only the team that will do this work selected if you re not sure it s fine to leave both selected platform cms team sitewide cms team
0
72,976
24,392,217,261
IssuesEvent
2022-10-04 16:04:13
vector-im/element-android
https://api.github.com/repos/vector-im/element-android
closed
Collated notifications can't count
T-Defect A-Notifications S-Tolerable O-Occasional
When there is more than one notification, the collated version always says there's only one, no matter how many there are. <img alt="Screenshot_20200213-151725_Discord" src="https://user-images.githubusercontent.com/307334/74477144-bbade800-4e78-11ea-860c-8d061bfabc45.jpg" width="400">
1.0
Collated notifications can't count - When there is more than one notification, the collated version always says there's only one, no matter how many there are. <img alt="Screenshot_20200213-151725_Discord" src="https://user-images.githubusercontent.com/307334/74477144-bbade800-4e78-11ea-860c-8d061bfabc45.jpg" width="400">
non_process
collated notifications can t count where there are more than one notification the collated version always says there s only one notification no matter how many there are
0
4,758
7,621,514,946
IssuesEvent
2018-05-03 08:50:37
our-city-app/oca-backend
https://api.github.com/repos/our-city-app/oca-backend
closed
After this service is disabled and all users are disconnected, this service remains on my phone
priority_minor process_duplicate type_bug
![schermafbeelding 2017-11-30 om 15 26 54](https://user-images.githubusercontent.com/26439611/33435638-f1ff4806-d5e2-11e7-868b-3e7929550c5b.png) ![schermafbeelding 2017-11-30 om 15 25 13](https://user-images.githubusercontent.com/26439611/33435639-f4e185b6-d5e2-11e7-8007-98f864df6ec4.png) ![image 2017-11-30 15 27 31](https://user-images.githubusercontent.com/26439611/33435646-fc710c84-d5e2-11e7-8904-419c6863cf5d.jpg)
1.0
After this service is disabled and all users are disconnected, this service remains on my phone - ![schermafbeelding 2017-11-30 om 15 26 54](https://user-images.githubusercontent.com/26439611/33435638-f1ff4806-d5e2-11e7-868b-3e7929550c5b.png) ![schermafbeelding 2017-11-30 om 15 25 13](https://user-images.githubusercontent.com/26439611/33435639-f4e185b6-d5e2-11e7-8007-98f864df6ec4.png) ![image 2017-11-30 15 27 31](https://user-images.githubusercontent.com/26439611/33435646-fc710c84-d5e2-11e7-8904-419c6863cf5d.jpg)
process
after this service is disabled and all users are disconnected this service remains on my phone
1
8,913
12,016,983,588
IssuesEvent
2020-04-10 17:20:13
dotnet/runtime
https://api.github.com/repos/dotnet/runtime
closed
[mono] Tests failed on windows: System.Diagnostics.Tests
area-System.Diagnostics.Process untriaged
Several `System.Diagnostics.Tests` tests fail on windows with the following output: ``` System.Diagnostics.Tests.ProcessStartInfoTests.StartInfo_BadVerb(useShellExecute: True) [FAIL] Assert.Throws() Failure Expected: typeof(System.ComponentModel.Win32Exception) Actual: typeof(System.InvalidOperationException): Failed to set the specified COM apartment state. ---- System.InvalidOperationException : Failed to set the specified COM apartment state. Stack Trace: _\src\libraries\System.Private.CoreLib\src\System\Threading\Thread.cs(234,0): at System.Threading.Thread.SetApartmentState(ApartmentState state) _\src\libraries\System.Diagnostics.Process\src\System\Diagnostics\Process.Win32.cs(180,0): at System.Diagnostics.Process.ShellExecuteHelper.ShellExecuteOnSTAThread() _\src\libraries\System.Diagnostics.Process\src\System\Diagnostics\Process.Win32.cs(90,0): at System.Diagnostics.Process.StartWithShellExecuteEx(ProcessStartInfo startInfo) _\src\libraries\System.Diagnostics.Process\src\System\Diagnostics\Process.Win32.cs(25,0): at System.Diagnostics.Process.StartCore(ProcessStartInfo startInfo) _\src\libraries\System.Diagnostics.Process\src\System\Diagnostics\Process.cs(1221,0): at System.Diagnostics.Process.Start() _\src\libraries\System.Diagnostics.Process\src\System\Diagnostics\Process.cs(1270,0): at System.Diagnostics.Process.Start(ProcessStartInfo startInfo) _\src\libraries\System.Diagnostics.Process\tests\ProcessStartInfoTests.cs(1095,0): at System.Diagnostics.Tests.ProcessStartInfoTests.<>c__DisplayClass48_0.<StartInfo_BadVerb>b__0() ----- Inner Stack Trace ----- _\src\libraries\System.Private.CoreLib\src\System\Threading\Thread.cs(234,0): at System.Threading.Thread.SetApartmentState(ApartmentState state) _\src\libraries\System.Diagnostics.Process\src\System\Diagnostics\Process.Win32.cs(180,0): at System.Diagnostics.Process.ShellExecuteHelper.ShellExecuteOnSTAThread() _\src\libraries\System.Diagnostics.Process\src\System\Diagnostics\Process.Win32.cs(90,0): at 
System.Diagnostics.Process.StartWithShellExecuteEx(ProcessStartInfo startInfo) _\src\libraries\System.Diagnostics.Process\src\System\Diagnostics\Process.Win32.cs(25,0): at System.Diagnostics.Process.StartCore(ProcessStartInfo startInfo) _\src\libraries\System.Diagnostics.Process\src\System\Diagnostics\Process.cs(1221,0): at System.Diagnostics.Process.Start() _\src\libraries\System.Diagnostics.Process\src\System\Diagnostics\Process.cs(1270,0): at System.Diagnostics.Process.Start(ProcessStartInfo startInfo) _\src\libraries\System.Diagnostics.Process\tests\ProcessStartInfoTests.cs(1095,0): at System.Diagnostics.Tests.ProcessStartInfoTests.<>c__DisplayClass48_0.<StartInfo_BadVerb>b__0() System.Diagnostics.Tests.ProcessStartInfoTests.StartInfo_BadExe(useShellExecute: True) [FAIL] Assert.Throws() Failure Expected: typeof(System.ComponentModel.Win32Exception) Actual: typeof(System.InvalidOperationException): Failed to set the specified COM apartment state. ---- System.InvalidOperationException : Failed to set the specified COM apartment state. 
Stack Trace: _\src\libraries\System.Private.CoreLib\src\System\Threading\Thread.cs(234,0): at System.Threading.Thread.SetApartmentState(ApartmentState state) _\src\libraries\System.Diagnostics.Process\src\System\Diagnostics\Process.Win32.cs(180,0): at System.Diagnostics.Process.ShellExecuteHelper.ShellExecuteOnSTAThread() _\src\libraries\System.Diagnostics.Process\src\System\Diagnostics\Process.Win32.cs(90,0): at System.Diagnostics.Process.StartWithShellExecuteEx(ProcessStartInfo startInfo) _\src\libraries\System.Diagnostics.Process\src\System\Diagnostics\Process.Win32.cs(25,0): at System.Diagnostics.Process.StartCore(ProcessStartInfo startInfo) _\src\libraries\System.Diagnostics.Process\src\System\Diagnostics\Process.cs(1221,0): at System.Diagnostics.Process.Start() _\src\libraries\System.Diagnostics.Process\src\System\Diagnostics\Process.cs(1270,0): at System.Diagnostics.Process.Start(ProcessStartInfo startInfo) _\src\libraries\System.Diagnostics.Process\tests\ProcessStartInfoTests.cs(1118,0): at System.Diagnostics.Tests.ProcessStartInfoTests.<>c__DisplayClass49_0.<StartInfo_BadExe>b__0() ----- Inner Stack Trace ----- _\src\libraries\System.Private.CoreLib\src\System\Threading\Thread.cs(234,0): at System.Threading.Thread.SetApartmentState(ApartmentState state) _\src\libraries\System.Diagnostics.Process\src\System\Diagnostics\Process.Win32.cs(180,0): at System.Diagnostics.Process.ShellExecuteHelper.ShellExecuteOnSTAThread() _\src\libraries\System.Diagnostics.Process\src\System\Diagnostics\Process.Win32.cs(90,0): at System.Diagnostics.Process.StartWithShellExecuteEx(ProcessStartInfo startInfo) _\src\libraries\System.Diagnostics.Process\src\System\Diagnostics\Process.Win32.cs(25,0): at System.Diagnostics.Process.StartCore(ProcessStartInfo startInfo) _\src\libraries\System.Diagnostics.Process\src\System\Diagnostics\Process.cs(1221,0): at System.Diagnostics.Process.Start() _\src\libraries\System.Diagnostics.Process\src\System\Diagnostics\Process.cs(1270,0): at 
System.Diagnostics.Process.Start(ProcessStartInfo startInfo) _\src\libraries\System.Diagnostics.Process\tests\ProcessStartInfoTests.cs(1118,0): at System.Diagnostics.Tests.ProcessStartInfoTests.<>c__DisplayClass49_0.<StartInfo_BadExe>b__0() System.Diagnostics.Tests.ProcessTests.ProcessStart_UseShellExecute_Executes(filenameAsUrl: True) [FAIL] System.InvalidOperationException : Failed to set the specified COM apartment state. Stack Trace: _\src\libraries\System.Private.CoreLib\src\System\Threading\Thread.cs(234,0): at System.Threading.Thread.SetApartmentState(ApartmentState state) _\src\libraries\System.Diagnostics.Process\src\System\Diagnostics\Process.Win32.cs(180,0): at System.Diagnostics.Process.ShellExecuteHelper.ShellExecuteOnSTAThread() _\src\libraries\System.Diagnostics.Process\src\System\Diagnostics\Process.Win32.cs(90,0): at System.Diagnostics.Process.StartWithShellExecuteEx(ProcessStartInfo startInfo) _\src\libraries\System.Diagnostics.Process\src\System\Diagnostics\Process.Win32.cs(25,0): at System.Diagnostics.Process.StartCore(ProcessStartInfo startInfo) _\src\libraries\System.Diagnostics.Process\src\System\Diagnostics\Process.cs(1221,0): at System.Diagnostics.Process.Start() _\src\libraries\System.Diagnostics.Process\src\System\Diagnostics\Process.cs(1270,0): at System.Diagnostics.Process.Start(ProcessStartInfo startInfo) _\src\libraries\System.Diagnostics.Process\tests\ProcessTests.cs(247,0): at System.Diagnostics.Tests.ProcessTests.ProcessStart_UseShellExecute_Executes(Boolean filenameAsUrl) _\src\mono\netcore\System.Private.CoreLib\src\System\Reflection\RuntimeMethodInfo.cs(359,0): at System.Reflection.RuntimeMethodInfo.Invoke(Object obj, BindingFlags invokeAttr, Binder binder, Object[] parameters, CultureInfo culture) System.Diagnostics.Tests.ProcessTests.ProcessStart_UseShellExecute_Executes(filenameAsUrl: False) [FAIL] System.InvalidOperationException : Failed to set the specified COM apartment state. 
Stack Trace: _\src\libraries\System.Private.CoreLib\src\System\Threading\Thread.cs(234,0): at System.Threading.Thread.SetApartmentState(ApartmentState state) _\src\libraries\System.Diagnostics.Process\src\System\Diagnostics\Process.Win32.cs(180,0): at System.Diagnostics.Process.ShellExecuteHelper.ShellExecuteOnSTAThread() _\src\libraries\System.Diagnostics.Process\src\System\Diagnostics\Process.Win32.cs(90,0): at System.Diagnostics.Process.StartWithShellExecuteEx(ProcessStartInfo startInfo) _\src\libraries\System.Diagnostics.Process\src\System\Diagnostics\Process.Win32.cs(25,0): at System.Diagnostics.Process.StartCore(ProcessStartInfo startInfo) _\src\libraries\System.Diagnostics.Process\src\System\Diagnostics\Process.cs(1221,0): at System.Diagnostics.Process.Start() _\src\libraries\System.Diagnostics.Process\src\System\Diagnostics\Process.cs(1270,0): at System.Diagnostics.Process.Start(ProcessStartInfo startInfo) _\src\libraries\System.Diagnostics.Process\tests\ProcessTests.cs(247,0): at System.Diagnostics.Tests.ProcessTests.ProcessStart_UseShellExecute_Executes(Boolean filenameAsUrl) _\src\mono\netcore\System.Private.CoreLib\src\System\Reflection\RuntimeMethodInfo.cs(359,0): at System.Reflection.RuntimeMethodInfo.Invoke(Object obj, BindingFlags invokeAttr, Binder binder, Object[] parameters, CultureInfo culture) System.Diagnostics.Tests.ProcessTests.ProcessStart_UseShellExecute_WorkingDirectory [FAIL] System.InvalidOperationException : Failed to set the specified COM apartment state. 
Stack Trace: _\src\libraries\System.Private.CoreLib\src\System\Threading\Thread.cs(234,0): at System.Threading.Thread.SetApartmentState(ApartmentState state) _\src\libraries\System.Diagnostics.Process\src\System\Diagnostics\Process.Win32.cs(180,0): at System.Diagnostics.Process.ShellExecuteHelper.ShellExecuteOnSTAThread() _\src\libraries\System.Diagnostics.Process\src\System\Diagnostics\Process.Win32.cs(90,0): at System.Diagnostics.Process.StartWithShellExecuteEx(ProcessStartInfo startInfo) _\src\libraries\System.Diagnostics.Process\src\System\Diagnostics\Process.Win32.cs(25,0): at System.Diagnostics.Process.StartCore(ProcessStartInfo startInfo) _\src\libraries\System.Diagnostics.Process\src\System\Diagnostics\Process.cs(1221,0): at System.Diagnostics.Process.Start() _\src\libraries\System.Diagnostics.Process\src\System\Diagnostics\Process.cs(1270,0): at System.Diagnostics.Process.Start(ProcessStartInfo startInfo) _\src\libraries\System.Diagnostics.Process\tests\ProcessTests.cs(309,0): at System.Diagnostics.Tests.ProcessTests.ProcessStart_UseShellExecute_WorkingDirectory() _\src\mono\netcore\System.Private.CoreLib\src\System\Reflection\RuntimeMethodInfo.cs(359,0): at System.Reflection.RuntimeMethodInfo.Invoke(Object obj, BindingFlags invokeAttr, Binder binder, Object[] parameters, CultureInfo culture) Unhandled Exception: System.InvalidOperationException: Failed to set the specified COM apartment state. 
at System.Threading.Thread.SetApartmentState(ApartmentState state) in _\src\libraries\System.Private.CoreLib\src\System\Threading\Thread.cs:line 234 at System.Diagnostics.Process.ShellExecuteHelper.ShellExecuteOnSTAThread() in _\src\libraries\System.Diagnostics.Process\src\System\Diagnostics\Process.Win32.cs:line 180 at System.Diagnostics.Process.StartWithShellExecuteEx(ProcessStartInfo startInfo) in _\src\libraries\System.Diagnostics.Process\src\System\Diagnostics\Process.Win32.cs:line 90 at System.Diagnostics.Process.StartCore(ProcessStartInfo startInfo) in _\src\libraries\System.Diagnostics.Process\src\System\Diagnostics\Process.Win32.cs:line 25 at System.Diagnostics.Process.Start() in _\src\libraries\System.Diagnostics.Process\src\System\Diagnostics\Process.cs:line 1221 at System.Diagnostics.Process.Start(ProcessStartInfo startInfo) in _\src\libraries\System.Diagnostics.Process\src\System\Diagnostics\Process.cs:line 1270 at System.Diagnostics.Tests.ProcessTests.<ProcessStart_UseShellExecute_ExecuteOrder>b__13_0(String pathDirectory) in _\src\libraries\System.Diagnostics.Process\tests\ProcessTests.cs:line 279 at System.Reflection.RuntimeMethodInfo.Invoke(Object obj, BindingFlags invokeAttr, Binder binder, Object[] parameters, CultureInfo culture) in _\src\mono\netcore\System.Private.CoreLib\src\System\Reflection\RuntimeMethodInfo.cs:line 359 --- End of stack trace from previous location --- at Microsoft.DotNet.RemoteExecutor.Program.Main(String[] args) in /_/src/Microsoft.DotNet.RemoteExecutor/src/Program.cs:line 95 [ERROR] FATAL UNHANDLED EXCEPTION: System.InvalidOperationException: Failed to set the specified COM apartment state. 
at System.Threading.Thread.SetApartmentState(ApartmentState state) in _\src\libraries\System.Private.CoreLib\src\System\Threading\Thread.cs:line 234 at System.Diagnostics.Process.ShellExecuteHelper.ShellExecuteOnSTAThread() in _\src\libraries\System.Diagnostics.Process\src\System\Diagnostics\Process.Win32.cs:line 180 at System.Diagnostics.Process.StartWithShellExecuteEx(ProcessStartInfo startInfo) in _\src\libraries\System.Diagnostics.Process\src\System\Diagnostics\Process.Win32.cs:line 90 at System.Diagnostics.Process.StartCore(ProcessStartInfo startInfo) in _\src\libraries\System.Diagnostics.Process\src\System\Diagnostics\Process.Win32.cs:line 25 at System.Diagnostics.Process.Start() in _\src\libraries\System.Diagnostics.Process\src\System\Diagnostics\Process.cs:line 1221 at System.Diagnostics.Process.Start(ProcessStartInfo startInfo) in _\src\libraries\System.Diagnostics.Process\src\System\Diagnostics\Process.cs:line 1270 at System.Diagnostics.Tests.ProcessTests.<ProcessStart_UseShellExecute_ExecuteOrder>b__13_0(String pathDirectory) in _\src\libraries\System.Diagnostics.Process\tests\ProcessTests.cs:line 279 at System.Reflection.RuntimeMethodInfo.Invoke(Object obj, BindingFlags invokeAttr, Binder binder, Object[] parameters, CultureInfo culture) in _\src\mono\netcore\System.Private.CoreLib\src\System\Reflection\RuntimeMethodInfo.cs:line 359 --- End of stack trace from previous location --- at Microsoft.DotNet.RemoteExecutor.Program.Main(String[] args) in /_/src/Microsoft.DotNet.RemoteExecutor/src/Program.cs:line 95 Microsoft.DotNet.RemoteExecutor.RemoteExecutionException : Remote process failed with an unhandled exception. System.Diagnostics.Tests.ProcessTests.ProcessStart_UseShellExecute_ExecuteOrder [FAIL] Stack Trace: Child exception: System.InvalidOperationException: Failed to set the specified COM apartment state. 
_\src\libraries\System.Private.CoreLib\src\System\Threading\Thread.cs(234,0): at System.Threading.Thread.SetApartmentState(ApartmentState state) _\src\libraries\System.Diagnostics.Process\src\System\Diagnostics\Process.Win32.cs(180,0): at System.Diagnostics.Process.ShellExecuteHelper.ShellExecuteOnSTAThread() _\src\libraries\System.Diagnostics.Process\src\System\Diagnostics\Process.Win32.cs(90,0): at System.Diagnostics.Process.StartWithShellExecuteEx(ProcessStartInfo startInfo) _\src\libraries\System.Diagnostics.Process\src\System\Diagnostics\Process.Win32.cs(25,0): at System.Diagnostics.Process.StartCore(ProcessStartInfo startInfo) _\src\libraries\System.Diagnostics.Process\src\System\Diagnostics\Process.cs(1221,0): at System.Diagnostics.Process.Start() _\src\libraries\System.Diagnostics.Process\src\System\Diagnostics\Process.cs(1270,0): at System.Diagnostics.Process.Start(ProcessStartInfo startInfo) _\src\libraries\System.Diagnostics.Process\tests\ProcessTests.cs(279,0): at System.Diagnostics.Tests.ProcessTests.<ProcessStart_UseShellExecute_ExecuteOrder>b__13_0(String pathDirectory) _\src\mono\netcore\System.Private.CoreLib\src\System\Reflection\RuntimeMethodInfo.cs(359,0): at System.Reflection.RuntimeMethodInfo.Invoke(Object obj, BindingFlags invokeAttr, Binder binder, Object[] parameters, CultureInfo culture) Child process: System.Diagnostics.Process.Tests, Version=5.0.0.0, Culture=neutral, PublicKeyToken=cc7b13ffcd2ddd51 System.Diagnostics.Tests.ProcessTests Int32 <ProcessStart_UseShellExecute_ExecuteOrder>b__13_0(System.String) Child arguments: C:\Users\user\AppData\Local\Temp\ProcessTests_3yhpoi2n.iir\Path ``` The failing tests will be marked with `ActiveIssue` in https://github.com/dotnet/runtime/pull/32592.
1.0
[mono] Tests failed on windows: System.Diagnostics.Tests - Several `System.Diagnostics.Tests` tests fail on windows with the following output: ``` System.Diagnostics.Tests.ProcessStartInfoTests.StartInfo_BadVerb(useShellExecute: True) [FAIL] Assert.Throws() Failure Expected: typeof(System.ComponentModel.Win32Exception) Actual: typeof(System.InvalidOperationException): Failed to set the specified COM apartment state. ---- System.InvalidOperationException : Failed to set the specified COM apartment state. Stack Trace: _\src\libraries\System.Private.CoreLib\src\System\Threading\Thread.cs(234,0): at System.Threading.Thread.SetApartmentState(ApartmentState state) _\src\libraries\System.Diagnostics.Process\src\System\Diagnostics\Process.Win32.cs(180,0): at System.Diagnostics.Process.ShellExecuteHelper.ShellExecuteOnSTAThread() _\src\libraries\System.Diagnostics.Process\src\System\Diagnostics\Process.Win32.cs(90,0): at System.Diagnostics.Process.StartWithShellExecuteEx(ProcessStartInfo startInfo) _\src\libraries\System.Diagnostics.Process\src\System\Diagnostics\Process.Win32.cs(25,0): at System.Diagnostics.Process.StartCore(ProcessStartInfo startInfo) _\src\libraries\System.Diagnostics.Process\src\System\Diagnostics\Process.cs(1221,0): at System.Diagnostics.Process.Start() _\src\libraries\System.Diagnostics.Process\src\System\Diagnostics\Process.cs(1270,0): at System.Diagnostics.Process.Start(ProcessStartInfo startInfo) _\src\libraries\System.Diagnostics.Process\tests\ProcessStartInfoTests.cs(1095,0): at System.Diagnostics.Tests.ProcessStartInfoTests.<>c__DisplayClass48_0.<StartInfo_BadVerb>b__0() ----- Inner Stack Trace ----- _\src\libraries\System.Private.CoreLib\src\System\Threading\Thread.cs(234,0): at System.Threading.Thread.SetApartmentState(ApartmentState state) _\src\libraries\System.Diagnostics.Process\src\System\Diagnostics\Process.Win32.cs(180,0): at System.Diagnostics.Process.ShellExecuteHelper.ShellExecuteOnSTAThread() 
_\src\libraries\System.Diagnostics.Process\src\System\Diagnostics\Process.Win32.cs(90,0): at System.Diagnostics.Process.StartWithShellExecuteEx(ProcessStartInfo startInfo) _\src\libraries\System.Diagnostics.Process\src\System\Diagnostics\Process.Win32.cs(25,0): at System.Diagnostics.Process.StartCore(ProcessStartInfo startInfo) _\src\libraries\System.Diagnostics.Process\src\System\Diagnostics\Process.cs(1221,0): at System.Diagnostics.Process.Start() _\src\libraries\System.Diagnostics.Process\src\System\Diagnostics\Process.cs(1270,0): at System.Diagnostics.Process.Start(ProcessStartInfo startInfo) _\src\libraries\System.Diagnostics.Process\tests\ProcessStartInfoTests.cs(1095,0): at System.Diagnostics.Tests.ProcessStartInfoTests.<>c__DisplayClass48_0.<StartInfo_BadVerb>b__0() System.Diagnostics.Tests.ProcessStartInfoTests.StartInfo_BadExe(useShellExecute: True) [FAIL] Assert.Throws() Failure Expected: typeof(System.ComponentModel.Win32Exception) Actual: typeof(System.InvalidOperationException): Failed to set the specified COM apartment state. ---- System.InvalidOperationException : Failed to set the specified COM apartment state. 
Stack Trace: _\src\libraries\System.Private.CoreLib\src\System\Threading\Thread.cs(234,0): at System.Threading.Thread.SetApartmentState(ApartmentState state) _\src\libraries\System.Diagnostics.Process\src\System\Diagnostics\Process.Win32.cs(180,0): at System.Diagnostics.Process.ShellExecuteHelper.ShellExecuteOnSTAThread() _\src\libraries\System.Diagnostics.Process\src\System\Diagnostics\Process.Win32.cs(90,0): at System.Diagnostics.Process.StartWithShellExecuteEx(ProcessStartInfo startInfo) _\src\libraries\System.Diagnostics.Process\src\System\Diagnostics\Process.Win32.cs(25,0): at System.Diagnostics.Process.StartCore(ProcessStartInfo startInfo) _\src\libraries\System.Diagnostics.Process\src\System\Diagnostics\Process.cs(1221,0): at System.Diagnostics.Process.Start() _\src\libraries\System.Diagnostics.Process\src\System\Diagnostics\Process.cs(1270,0): at System.Diagnostics.Process.Start(ProcessStartInfo startInfo) _\src\libraries\System.Diagnostics.Process\tests\ProcessStartInfoTests.cs(1118,0): at System.Diagnostics.Tests.ProcessStartInfoTests.<>c__DisplayClass49_0.<StartInfo_BadExe>b__0() ----- Inner Stack Trace ----- _\src\libraries\System.Private.CoreLib\src\System\Threading\Thread.cs(234,0): at System.Threading.Thread.SetApartmentState(ApartmentState state) _\src\libraries\System.Diagnostics.Process\src\System\Diagnostics\Process.Win32.cs(180,0): at System.Diagnostics.Process.ShellExecuteHelper.ShellExecuteOnSTAThread() _\src\libraries\System.Diagnostics.Process\src\System\Diagnostics\Process.Win32.cs(90,0): at System.Diagnostics.Process.StartWithShellExecuteEx(ProcessStartInfo startInfo) _\src\libraries\System.Diagnostics.Process\src\System\Diagnostics\Process.Win32.cs(25,0): at System.Diagnostics.Process.StartCore(ProcessStartInfo startInfo) _\src\libraries\System.Diagnostics.Process\src\System\Diagnostics\Process.cs(1221,0): at System.Diagnostics.Process.Start() _\src\libraries\System.Diagnostics.Process\src\System\Diagnostics\Process.cs(1270,0): at 
System.Diagnostics.Process.Start(ProcessStartInfo startInfo) _\src\libraries\System.Diagnostics.Process\tests\ProcessStartInfoTests.cs(1118,0): at System.Diagnostics.Tests.ProcessStartInfoTests.<>c__DisplayClass49_0.<StartInfo_BadExe>b__0() System.Diagnostics.Tests.ProcessTests.ProcessStart_UseShellExecute_Executes(filenameAsUrl: True) [FAIL] System.InvalidOperationException : Failed to set the specified COM apartment state. Stack Trace: _\src\libraries\System.Private.CoreLib\src\System\Threading\Thread.cs(234,0): at System.Threading.Thread.SetApartmentState(ApartmentState state) _\src\libraries\System.Diagnostics.Process\src\System\Diagnostics\Process.Win32.cs(180,0): at System.Diagnostics.Process.ShellExecuteHelper.ShellExecuteOnSTAThread() _\src\libraries\System.Diagnostics.Process\src\System\Diagnostics\Process.Win32.cs(90,0): at System.Diagnostics.Process.StartWithShellExecuteEx(ProcessStartInfo startInfo) _\src\libraries\System.Diagnostics.Process\src\System\Diagnostics\Process.Win32.cs(25,0): at System.Diagnostics.Process.StartCore(ProcessStartInfo startInfo) _\src\libraries\System.Diagnostics.Process\src\System\Diagnostics\Process.cs(1221,0): at System.Diagnostics.Process.Start() _\src\libraries\System.Diagnostics.Process\src\System\Diagnostics\Process.cs(1270,0): at System.Diagnostics.Process.Start(ProcessStartInfo startInfo) _\src\libraries\System.Diagnostics.Process\tests\ProcessTests.cs(247,0): at System.Diagnostics.Tests.ProcessTests.ProcessStart_UseShellExecute_Executes(Boolean filenameAsUrl) _\src\mono\netcore\System.Private.CoreLib\src\System\Reflection\RuntimeMethodInfo.cs(359,0): at System.Reflection.RuntimeMethodInfo.Invoke(Object obj, BindingFlags invokeAttr, Binder binder, Object[] parameters, CultureInfo culture) System.Diagnostics.Tests.ProcessTests.ProcessStart_UseShellExecute_Executes(filenameAsUrl: False) [FAIL] System.InvalidOperationException : Failed to set the specified COM apartment state. 
Stack Trace: _\src\libraries\System.Private.CoreLib\src\System\Threading\Thread.cs(234,0): at System.Threading.Thread.SetApartmentState(ApartmentState state) _\src\libraries\System.Diagnostics.Process\src\System\Diagnostics\Process.Win32.cs(180,0): at System.Diagnostics.Process.ShellExecuteHelper.ShellExecuteOnSTAThread() _\src\libraries\System.Diagnostics.Process\src\System\Diagnostics\Process.Win32.cs(90,0): at System.Diagnostics.Process.StartWithShellExecuteEx(ProcessStartInfo startInfo) _\src\libraries\System.Diagnostics.Process\src\System\Diagnostics\Process.Win32.cs(25,0): at System.Diagnostics.Process.StartCore(ProcessStartInfo startInfo) _\src\libraries\System.Diagnostics.Process\src\System\Diagnostics\Process.cs(1221,0): at System.Diagnostics.Process.Start() _\src\libraries\System.Diagnostics.Process\src\System\Diagnostics\Process.cs(1270,0): at System.Diagnostics.Process.Start(ProcessStartInfo startInfo) _\src\libraries\System.Diagnostics.Process\tests\ProcessTests.cs(247,0): at System.Diagnostics.Tests.ProcessTests.ProcessStart_UseShellExecute_Executes(Boolean filenameAsUrl) _\src\mono\netcore\System.Private.CoreLib\src\System\Reflection\RuntimeMethodInfo.cs(359,0): at System.Reflection.RuntimeMethodInfo.Invoke(Object obj, BindingFlags invokeAttr, Binder binder, Object[] parameters, CultureInfo culture) System.Diagnostics.Tests.ProcessTests.ProcessStart_UseShellExecute_WorkingDirectory [FAIL] System.InvalidOperationException : Failed to set the specified COM apartment state. 
Stack Trace: _\src\libraries\System.Private.CoreLib\src\System\Threading\Thread.cs(234,0): at System.Threading.Thread.SetApartmentState(ApartmentState state) _\src\libraries\System.Diagnostics.Process\src\System\Diagnostics\Process.Win32.cs(180,0): at System.Diagnostics.Process.ShellExecuteHelper.ShellExecuteOnSTAThread() _\src\libraries\System.Diagnostics.Process\src\System\Diagnostics\Process.Win32.cs(90,0): at System.Diagnostics.Process.StartWithShellExecuteEx(ProcessStartInfo startInfo) _\src\libraries\System.Diagnostics.Process\src\System\Diagnostics\Process.Win32.cs(25,0): at System.Diagnostics.Process.StartCore(ProcessStartInfo startInfo) _\src\libraries\System.Diagnostics.Process\src\System\Diagnostics\Process.cs(1221,0): at System.Diagnostics.Process.Start() _\src\libraries\System.Diagnostics.Process\src\System\Diagnostics\Process.cs(1270,0): at System.Diagnostics.Process.Start(ProcessStartInfo startInfo) _\src\libraries\System.Diagnostics.Process\tests\ProcessTests.cs(309,0): at System.Diagnostics.Tests.ProcessTests.ProcessStart_UseShellExecute_WorkingDirectory() _\src\mono\netcore\System.Private.CoreLib\src\System\Reflection\RuntimeMethodInfo.cs(359,0): at System.Reflection.RuntimeMethodInfo.Invoke(Object obj, BindingFlags invokeAttr, Binder binder, Object[] parameters, CultureInfo culture) Unhandled Exception: System.InvalidOperationException: Failed to set the specified COM apartment state. 
at System.Threading.Thread.SetApartmentState(ApartmentState state) in _\src\libraries\System.Private.CoreLib\src\System\Threading\Thread.cs:line 234 at System.Diagnostics.Process.ShellExecuteHelper.ShellExecuteOnSTAThread() in _\src\libraries\System.Diagnostics.Process\src\System\Diagnostics\Process.Win32.cs:line 180 at System.Diagnostics.Process.StartWithShellExecuteEx(ProcessStartInfo startInfo) in _\src\libraries\System.Diagnostics.Process\src\System\Diagnostics\Process.Win32.cs:line 90 at System.Diagnostics.Process.StartCore(ProcessStartInfo startInfo) in _\src\libraries\System.Diagnostics.Process\src\System\Diagnostics\Process.Win32.cs:line 25 at System.Diagnostics.Process.Start() in _\src\libraries\System.Diagnostics.Process\src\System\Diagnostics\Process.cs:line 1221 at System.Diagnostics.Process.Start(ProcessStartInfo startInfo) in _\src\libraries\System.Diagnostics.Process\src\System\Diagnostics\Process.cs:line 1270 at System.Diagnostics.Tests.ProcessTests.<ProcessStart_UseShellExecute_ExecuteOrder>b__13_0(String pathDirectory) in _\src\libraries\System.Diagnostics.Process\tests\ProcessTests.cs:line 279 at System.Reflection.RuntimeMethodInfo.Invoke(Object obj, BindingFlags invokeAttr, Binder binder, Object[] parameters, CultureInfo culture) in _\src\mono\netcore\System.Private.CoreLib\src\System\Reflection\RuntimeMethodInfo.cs:line 359 --- End of stack trace from previous location --- at Microsoft.DotNet.RemoteExecutor.Program.Main(String[] args) in /_/src/Microsoft.DotNet.RemoteExecutor/src/Program.cs:line 95 [ERROR] FATAL UNHANDLED EXCEPTION: System.InvalidOperationException: Failed to set the specified COM apartment state. 
at System.Threading.Thread.SetApartmentState(ApartmentState state) in _\src\libraries\System.Private.CoreLib\src\System\Threading\Thread.cs:line 234 at System.Diagnostics.Process.ShellExecuteHelper.ShellExecuteOnSTAThread() in _\src\libraries\System.Diagnostics.Process\src\System\Diagnostics\Process.Win32.cs:line 180 at System.Diagnostics.Process.StartWithShellExecuteEx(ProcessStartInfo startInfo) in _\src\libraries\System.Diagnostics.Process\src\System\Diagnostics\Process.Win32.cs:line 90 at System.Diagnostics.Process.StartCore(ProcessStartInfo startInfo) in _\src\libraries\System.Diagnostics.Process\src\System\Diagnostics\Process.Win32.cs:line 25 at System.Diagnostics.Process.Start() in _\src\libraries\System.Diagnostics.Process\src\System\Diagnostics\Process.cs:line 1221 at System.Diagnostics.Process.Start(ProcessStartInfo startInfo) in _\src\libraries\System.Diagnostics.Process\src\System\Diagnostics\Process.cs:line 1270 at System.Diagnostics.Tests.ProcessTests.<ProcessStart_UseShellExecute_ExecuteOrder>b__13_0(String pathDirectory) in _\src\libraries\System.Diagnostics.Process\tests\ProcessTests.cs:line 279 at System.Reflection.RuntimeMethodInfo.Invoke(Object obj, BindingFlags invokeAttr, Binder binder, Object[] parameters, CultureInfo culture) in _\src\mono\netcore\System.Private.CoreLib\src\System\Reflection\RuntimeMethodInfo.cs:line 359 --- End of stack trace from previous location --- at Microsoft.DotNet.RemoteExecutor.Program.Main(String[] args) in /_/src/Microsoft.DotNet.RemoteExecutor/src/Program.cs:line 95 Microsoft.DotNet.RemoteExecutor.RemoteExecutionException : Remote process failed with an unhandled exception. System.Diagnostics.Tests.ProcessTests.ProcessStart_UseShellExecute_ExecuteOrder [FAIL] Stack Trace: Child exception: System.InvalidOperationException: Failed to set the specified COM apartment state. 
_\src\libraries\System.Private.CoreLib\src\System\Threading\Thread.cs(234,0): at System.Threading.Thread.SetApartmentState(ApartmentState state) _\src\libraries\System.Diagnostics.Process\src\System\Diagnostics\Process.Win32.cs(180,0): at System.Diagnostics.Process.ShellExecuteHelper.ShellExecuteOnSTAThread() _\src\libraries\System.Diagnostics.Process\src\System\Diagnostics\Process.Win32.cs(90,0): at System.Diagnostics.Process.StartWithShellExecuteEx(ProcessStartInfo startInfo) _\src\libraries\System.Diagnostics.Process\src\System\Diagnostics\Process.Win32.cs(25,0): at System.Diagnostics.Process.StartCore(ProcessStartInfo startInfo) _\src\libraries\System.Diagnostics.Process\src\System\Diagnostics\Process.cs(1221,0): at System.Diagnostics.Process.Start() _\src\libraries\System.Diagnostics.Process\src\System\Diagnostics\Process.cs(1270,0): at System.Diagnostics.Process.Start(ProcessStartInfo startInfo) _\src\libraries\System.Diagnostics.Process\tests\ProcessTests.cs(279,0): at System.Diagnostics.Tests.ProcessTests.<ProcessStart_UseShellExecute_ExecuteOrder>b__13_0(String pathDirectory) _\src\mono\netcore\System.Private.CoreLib\src\System\Reflection\RuntimeMethodInfo.cs(359,0): at System.Reflection.RuntimeMethodInfo.Invoke(Object obj, BindingFlags invokeAttr, Binder binder, Object[] parameters, CultureInfo culture) Child process: System.Diagnostics.Process.Tests, Version=5.0.0.0, Culture=neutral, PublicKeyToken=cc7b13ffcd2ddd51 System.Diagnostics.Tests.ProcessTests Int32 <ProcessStart_UseShellExecute_ExecuteOrder>b__13_0(System.String) Child arguments: C:\Users\user\AppData\Local\Temp\ProcessTests_3yhpoi2n.iir\Path ``` The failing tests will be marked with `ActiveIssue` in https://github.com/dotnet/runtime/pull/32592.
process
tests failed on windows system diagnostics tests several system diagnostics tests tests fail on windows with the following output system diagnostics tests processstartinfotests startinfo badverb useshellexecute true assert throws failure expected typeof system componentmodel actual typeof system invalidoperationexception failed to set the specified com apartment state system invalidoperationexception failed to set the specified com apartment state stack trace src libraries system private corelib src system threading thread cs at system threading thread setapartmentstate apartmentstate state src libraries system diagnostics process src system diagnostics process cs at system diagnostics process shellexecutehelper shellexecuteonstathread src libraries system diagnostics process src system diagnostics process cs at system diagnostics process startwithshellexecuteex processstartinfo startinfo src libraries system diagnostics process src system diagnostics process cs at system diagnostics process startcore processstartinfo startinfo src libraries system diagnostics process src system diagnostics process cs at system diagnostics process start src libraries system diagnostics process src system diagnostics process cs at system diagnostics process start processstartinfo startinfo src libraries system diagnostics process tests processstartinfotests cs at system diagnostics tests processstartinfotests c b inner stack trace src libraries system private corelib src system threading thread cs at system threading thread setapartmentstate apartmentstate state src libraries system diagnostics process src system diagnostics process cs at system diagnostics process shellexecutehelper shellexecuteonstathread src libraries system diagnostics process src system diagnostics process cs at system diagnostics process startwithshellexecuteex processstartinfo startinfo src libraries system diagnostics process src system diagnostics process cs at system diagnostics process startcore 
processstartinfo startinfo src libraries system diagnostics process src system diagnostics process cs at system diagnostics process start src libraries system diagnostics process src system diagnostics process cs at system diagnostics process start processstartinfo startinfo src libraries system diagnostics process tests processstartinfotests cs at system diagnostics tests processstartinfotests c b system diagnostics tests processstartinfotests startinfo badexe useshellexecute true assert throws failure expected typeof system componentmodel actual typeof system invalidoperationexception failed to set the specified com apartment state system invalidoperationexception failed to set the specified com apartment state stack trace src libraries system private corelib src system threading thread cs at system threading thread setapartmentstate apartmentstate state src libraries system diagnostics process src system diagnostics process cs at system diagnostics process shellexecutehelper shellexecuteonstathread src libraries system diagnostics process src system diagnostics process cs at system diagnostics process startwithshellexecuteex processstartinfo startinfo src libraries system diagnostics process src system diagnostics process cs at system diagnostics process startcore processstartinfo startinfo src libraries system diagnostics process src system diagnostics process cs at system diagnostics process start src libraries system diagnostics process src system diagnostics process cs at system diagnostics process start processstartinfo startinfo src libraries system diagnostics process tests processstartinfotests cs at system diagnostics tests processstartinfotests c b inner stack trace src libraries system private corelib src system threading thread cs at system threading thread setapartmentstate apartmentstate state src libraries system diagnostics process src system diagnostics process cs at system diagnostics process shellexecutehelper shellexecuteonstathread src 
libraries system diagnostics process src system diagnostics process cs at system diagnostics process startwithshellexecuteex processstartinfo startinfo src libraries system diagnostics process src system diagnostics process cs at system diagnostics process startcore processstartinfo startinfo src libraries system diagnostics process src system diagnostics process cs at system diagnostics process start src libraries system diagnostics process src system diagnostics process cs at system diagnostics process start processstartinfo startinfo src libraries system diagnostics process tests processstartinfotests cs at system diagnostics tests processstartinfotests c b system diagnostics tests processtests processstart useshellexecute executes filenameasurl true system invalidoperationexception failed to set the specified com apartment state stack trace src libraries system private corelib src system threading thread cs at system threading thread setapartmentstate apartmentstate state src libraries system diagnostics process src system diagnostics process cs at system diagnostics process shellexecutehelper shellexecuteonstathread src libraries system diagnostics process src system diagnostics process cs at system diagnostics process startwithshellexecuteex processstartinfo startinfo src libraries system diagnostics process src system diagnostics process cs at system diagnostics process startcore processstartinfo startinfo src libraries system diagnostics process src system diagnostics process cs at system diagnostics process start src libraries system diagnostics process src system diagnostics process cs at system diagnostics process start processstartinfo startinfo src libraries system diagnostics process tests processtests cs at system diagnostics tests processtests processstart useshellexecute executes boolean filenameasurl src mono netcore system private corelib src system reflection runtimemethodinfo cs at system reflection runtimemethodinfo invoke object obj 
bindingflags invokeattr binder binder object parameters cultureinfo culture system diagnostics tests processtests processstart useshellexecute executes filenameasurl false system invalidoperationexception failed to set the specified com apartment state stack trace src libraries system private corelib src system threading thread cs at system threading thread setapartmentstate apartmentstate state src libraries system diagnostics process src system diagnostics process cs at system diagnostics process shellexecutehelper shellexecuteonstathread src libraries system diagnostics process src system diagnostics process cs at system diagnostics process startwithshellexecuteex processstartinfo startinfo src libraries system diagnostics process src system diagnostics process cs at system diagnostics process startcore processstartinfo startinfo src libraries system diagnostics process src system diagnostics process cs at system diagnostics process start src libraries system diagnostics process src system diagnostics process cs at system diagnostics process start processstartinfo startinfo src libraries system diagnostics process tests processtests cs at system diagnostics tests processtests processstart useshellexecute executes boolean filenameasurl src mono netcore system private corelib src system reflection runtimemethodinfo cs at system reflection runtimemethodinfo invoke object obj bindingflags invokeattr binder binder object parameters cultureinfo culture system diagnostics tests processtests processstart useshellexecute workingdirectory system invalidoperationexception failed to set the specified com apartment state stack trace src libraries system private corelib src system threading thread cs at system threading thread setapartmentstate apartmentstate state src libraries system diagnostics process src system diagnostics process cs at system diagnostics process shellexecutehelper shellexecuteonstathread src libraries system diagnostics process src system diagnostics 
process cs at system diagnostics process startwithshellexecuteex processstartinfo startinfo src libraries system diagnostics process src system diagnostics process cs at system diagnostics process startcore processstartinfo startinfo src libraries system diagnostics process src system diagnostics process cs at system diagnostics process start src libraries system diagnostics process src system diagnostics process cs at system diagnostics process start processstartinfo startinfo src libraries system diagnostics process tests processtests cs at system diagnostics tests processtests processstart useshellexecute workingdirectory src mono netcore system private corelib src system reflection runtimemethodinfo cs at system reflection runtimemethodinfo invoke object obj bindingflags invokeattr binder binder object parameters cultureinfo culture unhandled exception system invalidoperationexception failed to set the specified com apartment state at system threading thread setapartmentstate apartmentstate state in src libraries system private corelib src system threading thread cs line at system diagnostics process shellexecutehelper shellexecuteonstathread in src libraries system diagnostics process src system diagnostics process cs line at system diagnostics process startwithshellexecuteex processstartinfo startinfo in src libraries system diagnostics process src system diagnostics process cs line at system diagnostics process startcore processstartinfo startinfo in src libraries system diagnostics process src system diagnostics process cs line at system diagnostics process start in src libraries system diagnostics process src system diagnostics process cs line at system diagnostics process start processstartinfo startinfo in src libraries system diagnostics process src system diagnostics process cs line at system diagnostics tests processtests b string pathdirectory in src libraries system diagnostics process tests processtests cs line at system reflection 
runtimemethodinfo invoke object obj bindingflags invokeattr binder binder object parameters cultureinfo culture in src mono netcore system private corelib src system reflection runtimemethodinfo cs line end of stack trace from previous location at microsoft dotnet remoteexecutor program main string args in src microsoft dotnet remoteexecutor src program cs line fatal unhandled exception system invalidoperationexception failed to set the specified com apartment state at system threading thread setapartmentstate apartmentstate state in src libraries system private corelib src system threading thread cs line at system diagnostics process shellexecutehelper shellexecuteonstathread in src libraries system diagnostics process src system diagnostics process cs line at system diagnostics process startwithshellexecuteex processstartinfo startinfo in src libraries system diagnostics process src system diagnostics process cs line at system diagnostics process startcore processstartinfo startinfo in src libraries system diagnostics process src system diagnostics process cs line at system diagnostics process start in src libraries system diagnostics process src system diagnostics process cs line at system diagnostics process start processstartinfo startinfo in src libraries system diagnostics process src system diagnostics process cs line at system diagnostics tests processtests b string pathdirectory in src libraries system diagnostics process tests processtests cs line at system reflection runtimemethodinfo invoke object obj bindingflags invokeattr binder binder object parameters cultureinfo culture in src mono netcore system private corelib src system reflection runtimemethodinfo cs line end of stack trace from previous location at microsoft dotnet remoteexecutor program main string args in src microsoft dotnet remoteexecutor src program cs line microsoft dotnet remoteexecutor remoteexecutionexception remote process failed with an unhandled exception system diagnostics tests 
processtests processstart useshellexecute executeorder stack trace child exception system invalidoperationexception failed to set the specified com apartment state src libraries system private corelib src system threading thread cs at system threading thread setapartmentstate apartmentstate state src libraries system diagnostics process src system diagnostics process cs at system diagnostics process shellexecutehelper shellexecuteonstathread src libraries system diagnostics process src system diagnostics process cs at system diagnostics process startwithshellexecuteex processstartinfo startinfo src libraries system diagnostics process src system diagnostics process cs at system diagnostics process startcore processstartinfo startinfo src libraries system diagnostics process src system diagnostics process cs at system diagnostics process start src libraries system diagnostics process src system diagnostics process cs at system diagnostics process start processstartinfo startinfo src libraries system diagnostics process tests processtests cs at system diagnostics tests processtests b string pathdirectory src mono netcore system private corelib src system reflection runtimemethodinfo cs at system reflection runtimemethodinfo invoke object obj bindingflags invokeattr binder binder object parameters cultureinfo culture child process system diagnostics process tests version culture neutral publickeytoken system diagnostics tests processtests b system string child arguments c users user appdata local temp processtests iir path the failing tests will be marked with activeissue in
1
16,529
12,017,986,896
IssuesEvent
2020-04-10 19:42:22
tonio73/dnnviewer
https://api.github.com/repos/tonio73/dnnviewer
opened
Setup the Dash app as a multi page application
enhancement infrastructure
- Create a layout function as in Dash documentation referenced below - Temporary "Splash screen" with 2s timeout, to be replaced later by model selection screen https://dash.plotly.com/urls
1.0
Setup the Dash app as a multi page application - - Create a layout function as in Dash documentation referenced below - Temporary "Splash screen" with 2s timeout, to be replaced later by model selection screen https://dash.plotly.com/urls
non_process
setup the dash app as a multi page application create a layout function as in dash documentation referenced below temporary splash screen with timeout to be replaced later by model selection screen
0
597,728
18,170,326,536
IssuesEvent
2021-09-27 19:11:36
cdklabs/construct-hub-webapp
https://api.github.com/repos/cdklabs/construct-hub-webapp
closed
Trivial keywords should not be displayed as tags on a package card/page
priority/p0
Trivial keywords are: cdk aws-cdk awscdk aws
1.0
Trivial keywords should not be displayed as tags on a package card/page - Trivial keywords are: cdk aws-cdk awscdk aws
non_process
trivial keywords should not be displayed as tags on a package card page trivial keywords are cdk aws cdk awscdk aws
0
14,364
17,384,336,719
IssuesEvent
2021-08-01 10:17:31
jscutlery/test-utils
https://api.github.com/repos/jscutlery/test-utils
closed
Angular preprocessor should fail if tsConfig option is missing
cypress-angular-preprocessor enhancement
We should make sure that `cypress` is started with `--ts-config` option. It is the default setup with Nx but not with custom configurations.
1.0
Angular preprocessor should fail if tsConfig option is missing - We should make sure that `cypress` is started with `--ts-config` option. It is the default setup with Nx but not with custom configurations.
process
angular preprocessor should fail if tsconfig option is missing we should make sure that cypress is started with ts config option it is the default setup with nx but not with custom configurations
1
18,780
24,682,028,585
IssuesEvent
2022-10-18 22:28:31
xcesco/kripton
https://api.github.com/repos/xcesco/kripton
closed
On empty input, a empty object was generated
bug annotation-processor module file module
Given the class ```java @BindType public class Rifornimento { public ZonedDateTime getDataRifornimento() { return dataRifornimento; } public String getTarga() { return targa; } private final ZonedDateTime dataRifornimento; private final String targa; public Rifornimento(ZonedDateTime dataRifornimento, String targa) { this.dataRifornimento = dataRifornimento; this.targa = targa; } } ``` The following test fails: ```java @Test public void testRun() throws Exception { String input = ""; Rifornimento result = KriptonBinder.jsonBind().parse(input, Rifornimento.class); assertTrue(result == null); } ``` Same result from mutable objects.
1.0
On empty input, a empty object was generated - Given the class ```java @BindType public class Rifornimento { public ZonedDateTime getDataRifornimento() { return dataRifornimento; } public String getTarga() { return targa; } private final ZonedDateTime dataRifornimento; private final String targa; public Rifornimento(ZonedDateTime dataRifornimento, String targa) { this.dataRifornimento = dataRifornimento; this.targa = targa; } } ``` The following test fails: ```java @Test public void testRun() throws Exception { String input = ""; Rifornimento result = KriptonBinder.jsonBind().parse(input, Rifornimento.class); assertTrue(result == null); } ``` Same result from mutable objects.
process
on empty input a empty object was generated given the class java bindtype public class rifornimento public zoneddatetime getdatarifornimento return datarifornimento public string gettarga return targa private final zoneddatetime datarifornimento private final string targa public rifornimento zoneddatetime datarifornimento string targa this datarifornimento datarifornimento this targa targa the following test fails java test public void testrun throws exception string input rifornimento result kriptonbinder jsonbind parse input rifornimento class asserttrue result null same result from mutable objects
1
83,854
10,440,480,146
IssuesEvent
2019-09-18 08:48:43
juliandierker/yUOShi
https://api.github.com/repos/juliandierker/yUOShi
closed
X Wert Theorie: Graphiken einbinden
design
liegen im Ordner in einer PPP und sollen graphisch eingebunden werden (wie im dok Mot 3 X wert final)
1.0
X Wert Theorie: Graphiken einbinden - liegen im Ordner in einer PPP und sollen graphisch eingebunden werden (wie im dok Mot 3 X wert final)
non_process
x wert theorie graphiken einbinden liegen im ordner in einer ppp und sollen graphisch eingebunden werden wie im dok mot x wert final
0
4,010
6,948,593,861
IssuesEvent
2017-12-06 01:11:41
LibreHealthIO/LibreEHR
https://api.github.com/repos/LibreHealthIO/LibreEHR
closed
Change "Appointments" in the demographic screen to "Future Appointments".
GCI Task Work in Process
Change the title of the appointment widget in demographics screen to read "Future Appointments" ![issue-gci2](https://user-images.githubusercontent.com/285835/31579622-ace134f0-b0ee-11e7-8192-95df58971ef7.png) interface/patient_file/summary/demographics.php 1490: $widgetTitle = xl("Appointments");
1.0
Change "Appointments" in the demographic screen to "Future Appointments". - Change the title of the appointment widget in demographics screen to read "Future Appointments" ![issue-gci2](https://user-images.githubusercontent.com/285835/31579622-ace134f0-b0ee-11e7-8192-95df58971ef7.png) interface/patient_file/summary/demographics.php 1490: $widgetTitle = xl("Appointments");
process
change appointments in the demographic screen to future appointments change the title of the appointment widget in demographics screen to read future appointments interface patient file summary demographics php widgettitle xl appointments
1
21,120
28,089,814,245
IssuesEvent
2023-03-30 12:21:22
googleapis/google-cloud-php
https://api.github.com/repos/googleapis/google-cloud-php
closed
Build status is incorrect
priority: p1 type: process
Build stats on repo readme page is showing failure. ![Screenshot 2022-10-06 at 15 07 14](https://user-images.githubusercontent.com/7369612/194280086-db285a48-1d6b-46ad-8a3d-ad8a43f559e6.png) ## Issues 1. It should reflect correct test labels - PHP 72 tests were run months back and hasn't been updated since - PHP 74 CI tests should be the label 3. It should be green (unless any active issue is blocking it) - Consider adding more memory in Kokoro's `php.ini`, it's failing during tests execution:
1.0
Build status is incorrect - Build stats on repo readme page is showing failure. ![Screenshot 2022-10-06 at 15 07 14](https://user-images.githubusercontent.com/7369612/194280086-db285a48-1d6b-46ad-8a3d-ad8a43f559e6.png) ## Issues 1. It should reflect correct test labels - PHP 72 tests were run months back and hasn't been updated since - PHP 74 CI tests should be the label 3. It should be green (unless any active issue is blocking it) - Consider adding more memory in Kokoro's `php.ini`, it's failing during tests execution:
process
build status is incorrect build stats on repo readme page is showing failure issues it should reflect correct test labels php tests were run months back and hasn t been updated since php ci tests should be the label it should be green unless any active issue is blocking it consider adding more memory in kokoro s php ini it s failing during tests execution
1
10,689
13,481,866,941
IssuesEvent
2020-09-11 00:08:26
googleapis/python-speech
https://api.github.com/repos/googleapis/python-speech
closed
GCP Speaker diarization with time-stamps
api: speech type: process
Hi, I have an audio file which i am transcribing using speech_v1p1beta1.SpeechClient() . While the timestamps are at the word level, i would want time-stamps for each transcript along with speaker tag. Am using the below code to generate the response: ```py client = speech_v1p1beta1.SpeechClient() gcs_uri = 'gs://'+ 'emo_det_1' + '/' + 'sample.wav' diarization_speaker_count=3 config = { "sample_rate_hertz": 44100, "language_code": "en-IN", "audio_channel_count": 2, "encoding": enums.RecognitionConfig.AudioEncoding.LINEAR16, "enable_word_time_offsets": True, #"enable_separate_recognition_per_channel": True, "enable_automatic_punctuation": True, "enable_speaker_diarization": True, "diarization_speaker_count":diarization_speaker_count, # "model": 'video', not supported for language en-IN #"use_enhanced": True, #"speechContexts": [{"phrases": "quarter"}], } audio = {"uri": gcs_uri} operation = client.long_running_recognize(config, audio) print(u'transcription in process...') response = operation.result() ``` Now i would want the output in the following format: ``` sentence | sentence_id | speaker_tag | num_speakers | start_time | stop_time num | sentence | speaker_id | speaker_tag | num_of_speakers | start time | end_time 0 | And then with regards to your partner, here's ... | 1 | 4 | 1 | 0 | 35 1 | That's not why is it you're denigrating the re... | 2 | 4 | 1 | 35 | 65 2 | It's like it's certainly possible that you mar... | 3 | 4 | 1 | 65 | 102 ``` It would be of great help if someone can guide me.
1.0
GCP Speaker diarization with time-stamps - Hi, I have an audio file which i am transcribing using speech_v1p1beta1.SpeechClient() . While the timestamps are at the word level, i would want time-stamps for each transcript along with speaker tag. Am using the below code to generate the response: ```py client = speech_v1p1beta1.SpeechClient() gcs_uri = 'gs://'+ 'emo_det_1' + '/' + 'sample.wav' diarization_speaker_count=3 config = { "sample_rate_hertz": 44100, "language_code": "en-IN", "audio_channel_count": 2, "encoding": enums.RecognitionConfig.AudioEncoding.LINEAR16, "enable_word_time_offsets": True, #"enable_separate_recognition_per_channel": True, "enable_automatic_punctuation": True, "enable_speaker_diarization": True, "diarization_speaker_count":diarization_speaker_count, # "model": 'video', not supported for language en-IN #"use_enhanced": True, #"speechContexts": [{"phrases": "quarter"}], } audio = {"uri": gcs_uri} operation = client.long_running_recognize(config, audio) print(u'transcription in process...') response = operation.result() ``` Now i would want the output in the following format: ``` sentence | sentence_id | speaker_tag | num_speakers | start_time | stop_time num | sentence | speaker_id | speaker_tag | num_of_speakers | start time | end_time 0 | And then with regards to your partner, here's ... | 1 | 4 | 1 | 0 | 35 1 | That's not why is it you're denigrating the re... | 2 | 4 | 1 | 35 | 65 2 | It's like it's certainly possible that you mar... | 3 | 4 | 1 | 65 | 102 ``` It would be of great help if someone can guide me.
process
gcp speaker diarization with time stamps hi i have an audio file which i am transcribing using speech speechclient while the timestamps are at the word level i would want time stamps for each transcript along with speaker tag am using the below code to generate the response py client speech speechclient gcs uri gs emo det sample wav diarization speaker count config sample rate hertz language code en in audio channel count encoding enums recognitionconfig audioencoding enable word time offsets true enable separate recognition per channel true enable automatic punctuation true enable speaker diarization true diarization speaker count diarization speaker count model video not supported for language en in use enhanced true speechcontexts audio uri gcs uri operation client long running recognize config audio print u transcription in process response operation result now i would want the output in the following format sentence sentence id speaker tag num speakers start time stop time num sentence speaker id speaker tag num of speakers start time end time and then with regards to your partner here s that s not why is it you re denigrating the re it s like it s certainly possible that you mar it would be of great help if someone can guide me
1
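The question above asks how to turn the API's word-level diarization output into sentence-level rows with a speaker tag and start/stop times. A minimal post-processing sketch follows; the `words` list is a hypothetical stand-in for the `words` entries of the last result's top alternative (the real objects expose `start_time`/`end_time` as durations rather than plain floats, so the attribute access would need adapting), and the sentence split on `.`/`?`/`!` relies on `enable_automatic_punctuation` being on, as in the question's config:

```python
# Sketch: group word-level diarization output into sentence rows.
# Assumes each word entry carries its text, a speaker_tag, and
# start/end offsets in seconds -- a simplified stand-in for the
# Speech API's word_info objects.
from collections import Counter

def _flush(chunk, num_speakers):
    # Majority speaker tag over the words in this sentence.
    tag = Counter(w["speaker_tag"] for w in chunk).most_common(1)[0][0]
    return {
        "sentence": " ".join(w["word"] for w in chunk),
        "speaker_tag": tag,
        "num_speakers": num_speakers,
        "start_time": chunk[0]["start"],
        "stop_time": chunk[-1]["end"],
    }

def words_to_sentences(words, num_speakers):
    """Split a word stream into sentences at ., ? or ! boundaries."""
    sentences, current = [], []
    for w in words:
        current.append(w)
        if w["word"].rstrip().endswith((".", "?", "!")):
            sentences.append(_flush(current, num_speakers))
            current = []
    if current:  # trailing words without closing punctuation
        sentences.append(_flush(current, num_speakers))
    for i, s in enumerate(sentences):
        s["sentence_id"] = i
    return sentences

# Tiny illustrative input (hypothetical values, not real API output):
words = [
    {"word": "Hello", "speaker_tag": 1, "start": 0.0, "end": 0.4},
    {"word": "there.", "speaker_tag": 1, "start": 0.5, "end": 0.9},
    {"word": "Hi.", "speaker_tag": 2, "start": 1.2, "end": 1.5},
]
rows = words_to_sentences(words, num_speakers=2)
```

The resulting list of dicts maps directly onto the table layout the question sketches (one row per sentence, with `sentence_id`, `speaker_tag`, `num_speakers`, `start_time`, `stop_time`) and can be loaded into a DataFrame with `pandas.DataFrame(rows)` if tabular output is wanted.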
346,416
30,913,031,259
IssuesEvent
2023-08-05 00:55:19
wazuh/wazuh
https://api.github.com/repos/wazuh/wazuh
opened
Release 4.5.0 - Revision 1 - Release Candidate RC1 - Footprint Metrics - ALL-EXCEPT-VULNERABILITY-DETECTOR (6h)
type/test tracking level/subtask type/release
## Footprint metrics information | | | |---------------------------------|--------------------------------------------| | **Main release candidate issue #** | #18235 | | **Main footprint metrics issue #** | #18249 | | **Version** | 4.5.0 | | **Release candidate #** | RC1 | | **Tag** | https://github.com/wazuh/wazuh/tree/4.5.0-rc1 | ## Stress test documentation ### Packages used - Repository: `packages-dev.wazuh.com` - Package path: `pre-release` - Package revision: `1` - **Jenkins build**: https://ci.wazuh.info/job/Test_stress/4236/ --- <details><summary>Manager</summary> + <details><summary>Plots</summary> ![](https://ci.wazuh.com/data/Test_stress/pre-release/4.5.0/B4236-360m/B4236_manager_centos/plots/monitor-manager-Test_stress_B4236_manager-pre-release_CPU.png) ![](https://ci.wazuh.com/data/Test_stress/pre-release/4.5.0/B4236-360m/B4236_manager_centos/plots/monitor-manager-Test_stress_B4236_manager-pre-release_Disk.png) ![](https://ci.wazuh.com/data/Test_stress/pre-release/4.5.0/B4236-360m/B4236_manager_centos/plots/monitor-manager-Test_stress_B4236_manager-pre-release_Disk_Read.png) ![](https://ci.wazuh.com/data/Test_stress/pre-release/4.5.0/B4236-360m/B4236_manager_centos/plots/monitor-manager-Test_stress_B4236_manager-pre-release_Disk_Written.png) ![](https://ci.wazuh.com/data/Test_stress/pre-release/4.5.0/B4236-360m/B4236_manager_centos/plots/monitor-manager-Test_stress_B4236_manager-pre-release_FD.png) ![](https://ci.wazuh.com/data/Test_stress/pre-release/4.5.0/B4236-360m/B4236_manager_centos/plots/monitor-manager-Test_stress_B4236_manager-pre-release_PSS.png) ![](https://ci.wazuh.com/data/Test_stress/pre-release/4.5.0/B4236-360m/B4236_manager_centos/plots/monitor-manager-Test_stress_B4236_manager-pre-release_Read_Ops.png) ![](https://ci.wazuh.com/data/Test_stress/pre-release/4.5.0/B4236-360m/B4236_manager_centos/plots/monitor-manager-Test_stress_B4236_manager-pre-release_RSS_MAXMIN.png) 
![](https://ci.wazuh.com/data/Test_stress/pre-release/4.5.0/B4236-360m/B4236_manager_centos/plots/monitor-manager-Test_stress_B4236_manager-pre-release_RSS.png) ![](https://ci.wazuh.com/data/Test_stress/pre-release/4.5.0/B4236-360m/B4236_manager_centos/plots/monitor-manager-Test_stress_B4236_manager-pre-release_SWAP.png) ![](https://ci.wazuh.com/data/Test_stress/pre-release/4.5.0/B4236-360m/B4236_manager_centos/plots/monitor-manager-Test_stress_B4236_manager-pre-release_USS.png) ![](https://ci.wazuh.com/data/Test_stress/pre-release/4.5.0/B4236-360m/B4236_manager_centos/plots/monitor-manager-Test_stress_B4236_manager-pre-release_VMS.png) ![](https://ci.wazuh.com/data/Test_stress/pre-release/4.5.0/B4236-360m/B4236_manager_centos/plots/monitor-manager-Test_stress_B4236_manager-pre-release_Write_Ops.png) ![](https://ci.wazuh.com/data/Test_stress/pre-release/4.5.0/B4236-360m/B4236_manager_centos/plots/Test_stress_B4236_manager_analysisd_events_Decoded_events.png) ![](https://ci.wazuh.com/data/Test_stress/pre-release/4.5.0/B4236-360m/B4236_manager_centos/plots/Test_stress_B4236_manager_analysisd_events_Dropped_events.png) ![](https://ci.wazuh.com/data/Test_stress/pre-release/4.5.0/B4236-360m/B4236_manager_centos/plots/Test_stress_B4236_manager_analysisd_events_EDPS.png) ![](https://ci.wazuh.com/data/Test_stress/pre-release/4.5.0/B4236-360m/B4236_manager_centos/plots/Test_stress_B4236_manager_analysisd_events_Written_stats.png) ![](https://ci.wazuh.com/data/Test_stress/pre-release/4.5.0/B4236-360m/B4236_manager_centos/plots/Test_stress_B4236_manager_analysisd_state_Number_Events.png) ![](https://ci.wazuh.com/data/Test_stress/pre-release/4.5.0/B4236-360m/B4236_manager_centos/plots/Test_stress_B4236_manager_analysisd_state_Queues_state.png) </details> + <details><summary>Logs and configuration</summary> 
[ossec_Test_stress_B4236_manager_2023-08-05.zip](https://ci.wazuh.com/data/Test_stress/pre-release/4.5.0/B4236-360m/B4236_manager_centos/logs/ossec_Test_stress_B4236_manager_2023-08-05.zip) </details> + <details><summary>CSV</summary> [monitor-manager-Test_stress_B4236_manager-pre-release.csv](https://ci.wazuh.com/data/Test_stress/pre-release/4.5.0/B4236-360m/B4236_manager_centos/data/monitor-manager-Test_stress_B4236_manager-pre-release.csv) [Test_stress_B4236_manager_analysisd_events.csv](https://ci.wazuh.com/data/Test_stress/pre-release/4.5.0/B4236-360m/B4236_manager_centos/data/Test_stress_B4236_manager_analysisd_events.csv) [Test_stress_B4236_manager_analysisd_state.csv](https://ci.wazuh.com/data/Test_stress/pre-release/4.5.0/B4236-360m/B4236_manager_centos/data/Test_stress_B4236_manager_analysisd_state.csv) [Test_stress_B4236_manager_remoted_state.csv](https://ci.wazuh.com/data/Test_stress/pre-release/4.5.0/B4236-360m/B4236_manager_centos/data/Test_stress_B4236_manager_remoted_state.csv) </details> </details> <details><summary>Centos agent</summary> + <details><summary>Plots</summary> ![](https://ci.wazuh.com/data/Test_stress/pre-release/4.5.0/B4236-360m/B4236_agent_centos/plots/monitor-agent-Test_stress_B4236_centos-pre-release_CPU.png) ![](https://ci.wazuh.com/data/Test_stress/pre-release/4.5.0/B4236-360m/B4236_agent_centos/plots/monitor-agent-Test_stress_B4236_centos-pre-release_Disk.png) ![](https://ci.wazuh.com/data/Test_stress/pre-release/4.5.0/B4236-360m/B4236_agent_centos/plots/monitor-agent-Test_stress_B4236_centos-pre-release_Disk_Read.png) ![](https://ci.wazuh.com/data/Test_stress/pre-release/4.5.0/B4236-360m/B4236_agent_centos/plots/monitor-agent-Test_stress_B4236_centos-pre-release_Disk_Written.png) ![](https://ci.wazuh.com/data/Test_stress/pre-release/4.5.0/B4236-360m/B4236_agent_centos/plots/monitor-agent-Test_stress_B4236_centos-pre-release_FD.png) 
![](https://ci.wazuh.com/data/Test_stress/pre-release/4.5.0/B4236-360m/B4236_agent_centos/plots/monitor-agent-Test_stress_B4236_centos-pre-release_PSS.png) ![](https://ci.wazuh.com/data/Test_stress/pre-release/4.5.0/B4236-360m/B4236_agent_centos/plots/monitor-agent-Test_stress_B4236_centos-pre-release_Read_Ops.png) ![](https://ci.wazuh.com/data/Test_stress/pre-release/4.5.0/B4236-360m/B4236_agent_centos/plots/monitor-agent-Test_stress_B4236_centos-pre-release_RSS_MAXMIN.png) ![](https://ci.wazuh.com/data/Test_stress/pre-release/4.5.0/B4236-360m/B4236_agent_centos/plots/monitor-agent-Test_stress_B4236_centos-pre-release_RSS.png) ![](https://ci.wazuh.com/data/Test_stress/pre-release/4.5.0/B4236-360m/B4236_agent_centos/plots/monitor-agent-Test_stress_B4236_centos-pre-release_SWAP.png) ![](https://ci.wazuh.com/data/Test_stress/pre-release/4.5.0/B4236-360m/B4236_agent_centos/plots/monitor-agent-Test_stress_B4236_centos-pre-release_USS.png) ![](https://ci.wazuh.com/data/Test_stress/pre-release/4.5.0/B4236-360m/B4236_agent_centos/plots/monitor-agent-Test_stress_B4236_centos-pre-release_VMS.png) ![](https://ci.wazuh.com/data/Test_stress/pre-release/4.5.0/B4236-360m/B4236_agent_centos/plots/monitor-agent-Test_stress_B4236_centos-pre-release_Write_Ops.png) ![](https://ci.wazuh.com/data/Test_stress/pre-release/4.5.0/B4236-360m/B4236_agent_centos/plots/Test_stress_B4236_centos_agentd_state_AgentD_Number_of_events_buffered.png) ![](https://ci.wazuh.com/data/Test_stress/pre-release/4.5.0/B4236-360m/B4236_agent_centos/plots/Test_stress_B4236_centos_agentd_state_AgentD_Number_of_generated_events.png) ![](https://ci.wazuh.com/data/Test_stress/pre-release/4.5.0/B4236-360m/B4236_agent_centos/plots/Test_stress_B4236_centos_agentd_state_AgentD_Number_of_messages.png) ![](https://ci.wazuh.com/data/Test_stress/pre-release/4.5.0/B4236-360m/B4236_agent_centos/plots/Test_stress_B4236_centos_agentd_state_AgentD_Status.png) </details> + <details><summary>Logs and configuration</summary> 
[ossec_Test_stress_B4236_centos_2023-08-05.zip](https://ci.wazuh.com/data/Test_stress/pre-release/4.5.0/B4236-360m/B4236_agent_centos/logs/ossec_Test_stress_B4236_centos_2023-08-05.zip) </details> + <details><summary>CSV</summary> [monitor-agent-Test_stress_B4236_centos-pre-release.csv](https://ci.wazuh.com/data/Test_stress/pre-release/4.5.0/B4236-360m/B4236_agent_centos/data/monitor-agent-Test_stress_B4236_centos-pre-release.csv) [Test_stress_B4236_centos_agentd_state.csv](https://ci.wazuh.com/data/Test_stress/pre-release/4.5.0/B4236-360m/B4236_agent_centos/data/Test_stress_B4236_centos_agentd_state.csv) </details> </details> <details><summary>Ubuntu agent</summary> + <details><summary>Plots</summary> ![](https://ci.wazuh.com/data/Test_stress/pre-release/4.5.0/B4236-360m/B4236_agent_ubuntu/plots/monitor-agent-Test_stress_B4236_ubuntu-pre-release_CPU.png) ![](https://ci.wazuh.com/data/Test_stress/pre-release/4.5.0/B4236-360m/B4236_agent_ubuntu/plots/monitor-agent-Test_stress_B4236_ubuntu-pre-release_Disk.png) ![](https://ci.wazuh.com/data/Test_stress/pre-release/4.5.0/B4236-360m/B4236_agent_ubuntu/plots/monitor-agent-Test_stress_B4236_ubuntu-pre-release_Disk_Read.png) ![](https://ci.wazuh.com/data/Test_stress/pre-release/4.5.0/B4236-360m/B4236_agent_ubuntu/plots/monitor-agent-Test_stress_B4236_ubuntu-pre-release_Disk_Written.png) ![](https://ci.wazuh.com/data/Test_stress/pre-release/4.5.0/B4236-360m/B4236_agent_ubuntu/plots/monitor-agent-Test_stress_B4236_ubuntu-pre-release_FD.png) ![](https://ci.wazuh.com/data/Test_stress/pre-release/4.5.0/B4236-360m/B4236_agent_ubuntu/plots/monitor-agent-Test_stress_B4236_ubuntu-pre-release_PSS.png) ![](https://ci.wazuh.com/data/Test_stress/pre-release/4.5.0/B4236-360m/B4236_agent_ubuntu/plots/monitor-agent-Test_stress_B4236_ubuntu-pre-release_Read_Ops.png) ![](https://ci.wazuh.com/data/Test_stress/pre-release/4.5.0/B4236-360m/B4236_agent_ubuntu/plots/monitor-agent-Test_stress_B4236_ubuntu-pre-release_RSS_MAXMIN.png) 
![](https://ci.wazuh.com/data/Test_stress/pre-release/4.5.0/B4236-360m/B4236_agent_ubuntu/plots/monitor-agent-Test_stress_B4236_ubuntu-pre-release_RSS.png) ![](https://ci.wazuh.com/data/Test_stress/pre-release/4.5.0/B4236-360m/B4236_agent_ubuntu/plots/monitor-agent-Test_stress_B4236_ubuntu-pre-release_SWAP.png) ![](https://ci.wazuh.com/data/Test_stress/pre-release/4.5.0/B4236-360m/B4236_agent_ubuntu/plots/monitor-agent-Test_stress_B4236_ubuntu-pre-release_USS.png) ![](https://ci.wazuh.com/data/Test_stress/pre-release/4.5.0/B4236-360m/B4236_agent_ubuntu/plots/monitor-agent-Test_stress_B4236_ubuntu-pre-release_VMS.png) ![](https://ci.wazuh.com/data/Test_stress/pre-release/4.5.0/B4236-360m/B4236_agent_ubuntu/plots/monitor-agent-Test_stress_B4236_ubuntu-pre-release_Write_Ops.png) ![](https://ci.wazuh.com/data/Test_stress/pre-release/4.5.0/B4236-360m/B4236_agent_ubuntu/plots/Test_stress_B4236_ubuntu_agentd_state_AgentD_Number_of_events_buffered.png) ![](https://ci.wazuh.com/data/Test_stress/pre-release/4.5.0/B4236-360m/B4236_agent_ubuntu/plots/Test_stress_B4236_ubuntu_agentd_state_AgentD_Number_of_generated_events.png) ![](https://ci.wazuh.com/data/Test_stress/pre-release/4.5.0/B4236-360m/B4236_agent_ubuntu/plots/Test_stress_B4236_ubuntu_agentd_state_AgentD_Number_of_messages.png) ![](https://ci.wazuh.com/data/Test_stress/pre-release/4.5.0/B4236-360m/B4236_agent_ubuntu/plots/Test_stress_B4236_ubuntu_agentd_state_AgentD_Status.png) </details> + <details><summary>Logs and configuration</summary> [ossec_Test_stress_B4236_ubuntu_2023-08-05.zip](https://ci.wazuh.com/data/Test_stress/pre-release/4.5.0/B4236-360m/B4236_agent_ubuntu/logs/ossec_Test_stress_B4236_ubuntu_2023-08-05.zip) </details> + <details><summary>CSV</summary> [monitor-agent-Test_stress_B4236_ubuntu-pre-release.csv](https://ci.wazuh.com/data/Test_stress/pre-release/4.5.0/B4236-360m/B4236_agent_ubuntu/data/monitor-agent-Test_stress_B4236_ubuntu-pre-release.csv) 
[Test_stress_B4236_ubuntu_agentd_state.csv](https://ci.wazuh.com/data/Test_stress/pre-release/4.5.0/B4236-360m/B4236_agent_ubuntu/data/Test_stress_B4236_ubuntu_agentd_state.csv) </details> </details> <details><summary>Windows agent</summary> + <details><summary>Plots</summary> ![](https://ci.wazuh.com/data/Test_stress/pre-release/4.5.0/B4236-360m/B4236_agent_windows/plots/monitor-winagent-Test_stress_B4236_windows-pre-release_CPU.png) ![](https://ci.wazuh.com/data/Test_stress/pre-release/4.5.0/B4236-360m/B4236_agent_windows/plots/monitor-winagent-Test_stress_B4236_windows-pre-release_Disk.png) ![](https://ci.wazuh.com/data/Test_stress/pre-release/4.5.0/B4236-360m/B4236_agent_windows/plots/monitor-winagent-Test_stress_B4236_windows-pre-release_Disk_Read.png) ![](https://ci.wazuh.com/data/Test_stress/pre-release/4.5.0/B4236-360m/B4236_agent_windows/plots/monitor-winagent-Test_stress_B4236_windows-pre-release_Disk_Written.png) ![](https://ci.wazuh.com/data/Test_stress/pre-release/4.5.0/B4236-360m/B4236_agent_windows/plots/monitor-winagent-Test_stress_B4236_windows-pre-release_Handles.png) ![](https://ci.wazuh.com/data/Test_stress/pre-release/4.5.0/B4236-360m/B4236_agent_windows/plots/monitor-winagent-Test_stress_B4236_windows-pre-release_Read_Ops.png) ![](https://ci.wazuh.com/data/Test_stress/pre-release/4.5.0/B4236-360m/B4236_agent_windows/plots/monitor-winagent-Test_stress_B4236_windows-pre-release_RSS_MAXMIN.png) ![](https://ci.wazuh.com/data/Test_stress/pre-release/4.5.0/B4236-360m/B4236_agent_windows/plots/monitor-winagent-Test_stress_B4236_windows-pre-release_RSS.png) ![](https://ci.wazuh.com/data/Test_stress/pre-release/4.5.0/B4236-360m/B4236_agent_windows/plots/monitor-winagent-Test_stress_B4236_windows-pre-release_USS.png) ![](https://ci.wazuh.com/data/Test_stress/pre-release/4.5.0/B4236-360m/B4236_agent_windows/plots/monitor-winagent-Test_stress_B4236_windows-pre-release_VMS.png) 
![](https://ci.wazuh.com/data/Test_stress/pre-release/4.5.0/B4236-360m/B4236_agent_windows/plots/monitor-winagent-Test_stress_B4236_windows-pre-release_Write_Ops.png) ![](https://ci.wazuh.com/data/Test_stress/pre-release/4.5.0/B4236-360m/B4236_agent_windows/plots/Test_stress_B4236_windows_agentd_state_AgentD_Number_of_events_buffered.png) ![](https://ci.wazuh.com/data/Test_stress/pre-release/4.5.0/B4236-360m/B4236_agent_windows/plots/Test_stress_B4236_windows_agentd_state_AgentD_Number_of_generated_events.png) ![](https://ci.wazuh.com/data/Test_stress/pre-release/4.5.0/B4236-360m/B4236_agent_windows/plots/Test_stress_B4236_windows_agentd_state_AgentD_Number_of_messages.png) ![](https://ci.wazuh.com/data/Test_stress/pre-release/4.5.0/B4236-360m/B4236_agent_windows/plots/Test_stress_B4236_windows_agentd_state_AgentD_Status.png) </details> + <details><summary>Logs and configuration</summary> [ossec_Test_stress_B4236_windows_2023-08-05.zip](https://ci.wazuh.com/data/Test_stress/pre-release/4.5.0/B4236-360m/B4236_agent_windows/logs/ossec_Test_stress_B4236_windows_2023-08-05.zip) </details> + <details><summary>CSV</summary> [monitor-winagent-Test_stress_B4236_windows-pre-release.csv](https://ci.wazuh.com/data/Test_stress/pre-release/4.5.0/B4236-360m/B4236_agent_windows/data/monitor-winagent-Test_stress_B4236_windows-pre-release.csv) [Test_stress_B4236_windows_agentd_state.csv](https://ci.wazuh.com/data/Test_stress/pre-release/4.5.0/B4236-360m/B4236_agent_windows/data/Test_stress_B4236_windows_agentd_state.csv) </details> </details> <details><summary>macOS agent</summary> + <details><summary>Plots</summary> </details> + <details><summary>Logs and configuration</summary> </details> + <details><summary>CSV</summary> </details> </details> <details><summary>Solaris agent</summary> + <details><summary>Plots</summary> </details> + <details><summary>Logs and configuration</summary> </details> + <details><summary>CSV</summary> </details> </details>
1.0
Release 4.5.0 - Revision 1 - Release Candidate RC1 - Footprint Metrics - ALL-EXCEPT-VULNERABILITY-DETECTOR (6h) - ## Footprint metrics information | | | |---------------------------------|--------------------------------------------| | **Main release candidate issue #** | #18235 | | **Main footprint metrics issue #** | #18249 | | **Version** | 4.5.0 | | **Release candidate #** | RC1 | | **Tag** | https://github.com/wazuh/wazuh/tree/4.5.0-rc1 | ## Stress test documentation ### Packages used - Repository: `packages-dev.wazuh.com` - Package path: `pre-release` - Package revision: `1` - **Jenkins build**: https://ci.wazuh.info/job/Test_stress/4236/ --- <details><summary>Manager</summary> + <details><summary>Plots</summary> ![](https://ci.wazuh.com/data/Test_stress/pre-release/4.5.0/B4236-360m/B4236_manager_centos/plots/monitor-manager-Test_stress_B4236_manager-pre-release_CPU.png) ![](https://ci.wazuh.com/data/Test_stress/pre-release/4.5.0/B4236-360m/B4236_manager_centos/plots/monitor-manager-Test_stress_B4236_manager-pre-release_Disk.png) ![](https://ci.wazuh.com/data/Test_stress/pre-release/4.5.0/B4236-360m/B4236_manager_centos/plots/monitor-manager-Test_stress_B4236_manager-pre-release_Disk_Read.png) ![](https://ci.wazuh.com/data/Test_stress/pre-release/4.5.0/B4236-360m/B4236_manager_centos/plots/monitor-manager-Test_stress_B4236_manager-pre-release_Disk_Written.png) ![](https://ci.wazuh.com/data/Test_stress/pre-release/4.5.0/B4236-360m/B4236_manager_centos/plots/monitor-manager-Test_stress_B4236_manager-pre-release_FD.png) ![](https://ci.wazuh.com/data/Test_stress/pre-release/4.5.0/B4236-360m/B4236_manager_centos/plots/monitor-manager-Test_stress_B4236_manager-pre-release_PSS.png) ![](https://ci.wazuh.com/data/Test_stress/pre-release/4.5.0/B4236-360m/B4236_manager_centos/plots/monitor-manager-Test_stress_B4236_manager-pre-release_Read_Ops.png) 
![](https://ci.wazuh.com/data/Test_stress/pre-release/4.5.0/B4236-360m/B4236_manager_centos/plots/monitor-manager-Test_stress_B4236_manager-pre-release_RSS_MAXMIN.png) ![](https://ci.wazuh.com/data/Test_stress/pre-release/4.5.0/B4236-360m/B4236_manager_centos/plots/monitor-manager-Test_stress_B4236_manager-pre-release_RSS.png) ![](https://ci.wazuh.com/data/Test_stress/pre-release/4.5.0/B4236-360m/B4236_manager_centos/plots/monitor-manager-Test_stress_B4236_manager-pre-release_SWAP.png) ![](https://ci.wazuh.com/data/Test_stress/pre-release/4.5.0/B4236-360m/B4236_manager_centos/plots/monitor-manager-Test_stress_B4236_manager-pre-release_USS.png) ![](https://ci.wazuh.com/data/Test_stress/pre-release/4.5.0/B4236-360m/B4236_manager_centos/plots/monitor-manager-Test_stress_B4236_manager-pre-release_VMS.png) ![](https://ci.wazuh.com/data/Test_stress/pre-release/4.5.0/B4236-360m/B4236_manager_centos/plots/monitor-manager-Test_stress_B4236_manager-pre-release_Write_Ops.png) ![](https://ci.wazuh.com/data/Test_stress/pre-release/4.5.0/B4236-360m/B4236_manager_centos/plots/Test_stress_B4236_manager_analysisd_events_Decoded_events.png) ![](https://ci.wazuh.com/data/Test_stress/pre-release/4.5.0/B4236-360m/B4236_manager_centos/plots/Test_stress_B4236_manager_analysisd_events_Dropped_events.png) ![](https://ci.wazuh.com/data/Test_stress/pre-release/4.5.0/B4236-360m/B4236_manager_centos/plots/Test_stress_B4236_manager_analysisd_events_EDPS.png) ![](https://ci.wazuh.com/data/Test_stress/pre-release/4.5.0/B4236-360m/B4236_manager_centos/plots/Test_stress_B4236_manager_analysisd_events_Written_stats.png) ![](https://ci.wazuh.com/data/Test_stress/pre-release/4.5.0/B4236-360m/B4236_manager_centos/plots/Test_stress_B4236_manager_analysisd_state_Number_Events.png) ![](https://ci.wazuh.com/data/Test_stress/pre-release/4.5.0/B4236-360m/B4236_manager_centos/plots/Test_stress_B4236_manager_analysisd_state_Queues_state.png) </details> + <details><summary>Logs and configuration</summary> 
[ossec_Test_stress_B4236_manager_2023-08-05.zip](https://ci.wazuh.com/data/Test_stress/pre-release/4.5.0/B4236-360m/B4236_manager_centos/logs/ossec_Test_stress_B4236_manager_2023-08-05.zip) </details> + <details><summary>CSV</summary> [monitor-manager-Test_stress_B4236_manager-pre-release.csv](https://ci.wazuh.com/data/Test_stress/pre-release/4.5.0/B4236-360m/B4236_manager_centos/data/monitor-manager-Test_stress_B4236_manager-pre-release.csv) [Test_stress_B4236_manager_analysisd_events.csv](https://ci.wazuh.com/data/Test_stress/pre-release/4.5.0/B4236-360m/B4236_manager_centos/data/Test_stress_B4236_manager_analysisd_events.csv) [Test_stress_B4236_manager_analysisd_state.csv](https://ci.wazuh.com/data/Test_stress/pre-release/4.5.0/B4236-360m/B4236_manager_centos/data/Test_stress_B4236_manager_analysisd_state.csv) [Test_stress_B4236_manager_remoted_state.csv](https://ci.wazuh.com/data/Test_stress/pre-release/4.5.0/B4236-360m/B4236_manager_centos/data/Test_stress_B4236_manager_remoted_state.csv) </details> </details> <details><summary>Centos agent</summary> + <details><summary>Plots</summary> ![](https://ci.wazuh.com/data/Test_stress/pre-release/4.5.0/B4236-360m/B4236_agent_centos/plots/monitor-agent-Test_stress_B4236_centos-pre-release_CPU.png) ![](https://ci.wazuh.com/data/Test_stress/pre-release/4.5.0/B4236-360m/B4236_agent_centos/plots/monitor-agent-Test_stress_B4236_centos-pre-release_Disk.png) ![](https://ci.wazuh.com/data/Test_stress/pre-release/4.5.0/B4236-360m/B4236_agent_centos/plots/monitor-agent-Test_stress_B4236_centos-pre-release_Disk_Read.png) ![](https://ci.wazuh.com/data/Test_stress/pre-release/4.5.0/B4236-360m/B4236_agent_centos/plots/monitor-agent-Test_stress_B4236_centos-pre-release_Disk_Written.png) ![](https://ci.wazuh.com/data/Test_stress/pre-release/4.5.0/B4236-360m/B4236_agent_centos/plots/monitor-agent-Test_stress_B4236_centos-pre-release_FD.png) 
![](https://ci.wazuh.com/data/Test_stress/pre-release/4.5.0/B4236-360m/B4236_agent_centos/plots/monitor-agent-Test_stress_B4236_centos-pre-release_PSS.png) ![](https://ci.wazuh.com/data/Test_stress/pre-release/4.5.0/B4236-360m/B4236_agent_centos/plots/monitor-agent-Test_stress_B4236_centos-pre-release_Read_Ops.png) ![](https://ci.wazuh.com/data/Test_stress/pre-release/4.5.0/B4236-360m/B4236_agent_centos/plots/monitor-agent-Test_stress_B4236_centos-pre-release_RSS_MAXMIN.png) ![](https://ci.wazuh.com/data/Test_stress/pre-release/4.5.0/B4236-360m/B4236_agent_centos/plots/monitor-agent-Test_stress_B4236_centos-pre-release_RSS.png) ![](https://ci.wazuh.com/data/Test_stress/pre-release/4.5.0/B4236-360m/B4236_agent_centos/plots/monitor-agent-Test_stress_B4236_centos-pre-release_SWAP.png) ![](https://ci.wazuh.com/data/Test_stress/pre-release/4.5.0/B4236-360m/B4236_agent_centos/plots/monitor-agent-Test_stress_B4236_centos-pre-release_USS.png) ![](https://ci.wazuh.com/data/Test_stress/pre-release/4.5.0/B4236-360m/B4236_agent_centos/plots/monitor-agent-Test_stress_B4236_centos-pre-release_VMS.png) ![](https://ci.wazuh.com/data/Test_stress/pre-release/4.5.0/B4236-360m/B4236_agent_centos/plots/monitor-agent-Test_stress_B4236_centos-pre-release_Write_Ops.png) ![](https://ci.wazuh.com/data/Test_stress/pre-release/4.5.0/B4236-360m/B4236_agent_centos/plots/Test_stress_B4236_centos_agentd_state_AgentD_Number_of_events_buffered.png) ![](https://ci.wazuh.com/data/Test_stress/pre-release/4.5.0/B4236-360m/B4236_agent_centos/plots/Test_stress_B4236_centos_agentd_state_AgentD_Number_of_generated_events.png) ![](https://ci.wazuh.com/data/Test_stress/pre-release/4.5.0/B4236-360m/B4236_agent_centos/plots/Test_stress_B4236_centos_agentd_state_AgentD_Number_of_messages.png) ![](https://ci.wazuh.com/data/Test_stress/pre-release/4.5.0/B4236-360m/B4236_agent_centos/plots/Test_stress_B4236_centos_agentd_state_AgentD_Status.png) </details> + <details><summary>Logs and configuration</summary> 
[ossec_Test_stress_B4236_centos_2023-08-05.zip](https://ci.wazuh.com/data/Test_stress/pre-release/4.5.0/B4236-360m/B4236_agent_centos/logs/ossec_Test_stress_B4236_centos_2023-08-05.zip) </details> + <details><summary>CSV</summary> [monitor-agent-Test_stress_B4236_centos-pre-release.csv](https://ci.wazuh.com/data/Test_stress/pre-release/4.5.0/B4236-360m/B4236_agent_centos/data/monitor-agent-Test_stress_B4236_centos-pre-release.csv) [Test_stress_B4236_centos_agentd_state.csv](https://ci.wazuh.com/data/Test_stress/pre-release/4.5.0/B4236-360m/B4236_agent_centos/data/Test_stress_B4236_centos_agentd_state.csv) </details> </details> <details><summary>Ubuntu agent</summary> + <details><summary>Plots</summary> ![](https://ci.wazuh.com/data/Test_stress/pre-release/4.5.0/B4236-360m/B4236_agent_ubuntu/plots/monitor-agent-Test_stress_B4236_ubuntu-pre-release_CPU.png) ![](https://ci.wazuh.com/data/Test_stress/pre-release/4.5.0/B4236-360m/B4236_agent_ubuntu/plots/monitor-agent-Test_stress_B4236_ubuntu-pre-release_Disk.png) ![](https://ci.wazuh.com/data/Test_stress/pre-release/4.5.0/B4236-360m/B4236_agent_ubuntu/plots/monitor-agent-Test_stress_B4236_ubuntu-pre-release_Disk_Read.png) ![](https://ci.wazuh.com/data/Test_stress/pre-release/4.5.0/B4236-360m/B4236_agent_ubuntu/plots/monitor-agent-Test_stress_B4236_ubuntu-pre-release_Disk_Written.png) ![](https://ci.wazuh.com/data/Test_stress/pre-release/4.5.0/B4236-360m/B4236_agent_ubuntu/plots/monitor-agent-Test_stress_B4236_ubuntu-pre-release_FD.png) ![](https://ci.wazuh.com/data/Test_stress/pre-release/4.5.0/B4236-360m/B4236_agent_ubuntu/plots/monitor-agent-Test_stress_B4236_ubuntu-pre-release_PSS.png) ![](https://ci.wazuh.com/data/Test_stress/pre-release/4.5.0/B4236-360m/B4236_agent_ubuntu/plots/monitor-agent-Test_stress_B4236_ubuntu-pre-release_Read_Ops.png) ![](https://ci.wazuh.com/data/Test_stress/pre-release/4.5.0/B4236-360m/B4236_agent_ubuntu/plots/monitor-agent-Test_stress_B4236_ubuntu-pre-release_RSS_MAXMIN.png) 
![](https://ci.wazuh.com/data/Test_stress/pre-release/4.5.0/B4236-360m/B4236_agent_ubuntu/plots/monitor-agent-Test_stress_B4236_ubuntu-pre-release_RSS.png) ![](https://ci.wazuh.com/data/Test_stress/pre-release/4.5.0/B4236-360m/B4236_agent_ubuntu/plots/monitor-agent-Test_stress_B4236_ubuntu-pre-release_SWAP.png) ![](https://ci.wazuh.com/data/Test_stress/pre-release/4.5.0/B4236-360m/B4236_agent_ubuntu/plots/monitor-agent-Test_stress_B4236_ubuntu-pre-release_USS.png) ![](https://ci.wazuh.com/data/Test_stress/pre-release/4.5.0/B4236-360m/B4236_agent_ubuntu/plots/monitor-agent-Test_stress_B4236_ubuntu-pre-release_VMS.png) ![](https://ci.wazuh.com/data/Test_stress/pre-release/4.5.0/B4236-360m/B4236_agent_ubuntu/plots/monitor-agent-Test_stress_B4236_ubuntu-pre-release_Write_Ops.png) ![](https://ci.wazuh.com/data/Test_stress/pre-release/4.5.0/B4236-360m/B4236_agent_ubuntu/plots/Test_stress_B4236_ubuntu_agentd_state_AgentD_Number_of_events_buffered.png) ![](https://ci.wazuh.com/data/Test_stress/pre-release/4.5.0/B4236-360m/B4236_agent_ubuntu/plots/Test_stress_B4236_ubuntu_agentd_state_AgentD_Number_of_generated_events.png) ![](https://ci.wazuh.com/data/Test_stress/pre-release/4.5.0/B4236-360m/B4236_agent_ubuntu/plots/Test_stress_B4236_ubuntu_agentd_state_AgentD_Number_of_messages.png) ![](https://ci.wazuh.com/data/Test_stress/pre-release/4.5.0/B4236-360m/B4236_agent_ubuntu/plots/Test_stress_B4236_ubuntu_agentd_state_AgentD_Status.png) </details> + <details><summary>Logs and configuration</summary> [ossec_Test_stress_B4236_ubuntu_2023-08-05.zip](https://ci.wazuh.com/data/Test_stress/pre-release/4.5.0/B4236-360m/B4236_agent_ubuntu/logs/ossec_Test_stress_B4236_ubuntu_2023-08-05.zip) </details> + <details><summary>CSV</summary> [monitor-agent-Test_stress_B4236_ubuntu-pre-release.csv](https://ci.wazuh.com/data/Test_stress/pre-release/4.5.0/B4236-360m/B4236_agent_ubuntu/data/monitor-agent-Test_stress_B4236_ubuntu-pre-release.csv) 
[Test_stress_B4236_ubuntu_agentd_state.csv](https://ci.wazuh.com/data/Test_stress/pre-release/4.5.0/B4236-360m/B4236_agent_ubuntu/data/Test_stress_B4236_ubuntu_agentd_state.csv) </details> </details> <details><summary>Windows agent</summary> + <details><summary>Plots</summary> ![](https://ci.wazuh.com/data/Test_stress/pre-release/4.5.0/B4236-360m/B4236_agent_windows/plots/monitor-winagent-Test_stress_B4236_windows-pre-release_CPU.png) ![](https://ci.wazuh.com/data/Test_stress/pre-release/4.5.0/B4236-360m/B4236_agent_windows/plots/monitor-winagent-Test_stress_B4236_windows-pre-release_Disk.png) ![](https://ci.wazuh.com/data/Test_stress/pre-release/4.5.0/B4236-360m/B4236_agent_windows/plots/monitor-winagent-Test_stress_B4236_windows-pre-release_Disk_Read.png) ![](https://ci.wazuh.com/data/Test_stress/pre-release/4.5.0/B4236-360m/B4236_agent_windows/plots/monitor-winagent-Test_stress_B4236_windows-pre-release_Disk_Written.png) ![](https://ci.wazuh.com/data/Test_stress/pre-release/4.5.0/B4236-360m/B4236_agent_windows/plots/monitor-winagent-Test_stress_B4236_windows-pre-release_Handles.png) ![](https://ci.wazuh.com/data/Test_stress/pre-release/4.5.0/B4236-360m/B4236_agent_windows/plots/monitor-winagent-Test_stress_B4236_windows-pre-release_Read_Ops.png) ![](https://ci.wazuh.com/data/Test_stress/pre-release/4.5.0/B4236-360m/B4236_agent_windows/plots/monitor-winagent-Test_stress_B4236_windows-pre-release_RSS_MAXMIN.png) ![](https://ci.wazuh.com/data/Test_stress/pre-release/4.5.0/B4236-360m/B4236_agent_windows/plots/monitor-winagent-Test_stress_B4236_windows-pre-release_RSS.png) ![](https://ci.wazuh.com/data/Test_stress/pre-release/4.5.0/B4236-360m/B4236_agent_windows/plots/monitor-winagent-Test_stress_B4236_windows-pre-release_USS.png) ![](https://ci.wazuh.com/data/Test_stress/pre-release/4.5.0/B4236-360m/B4236_agent_windows/plots/monitor-winagent-Test_stress_B4236_windows-pre-release_VMS.png) 
![](https://ci.wazuh.com/data/Test_stress/pre-release/4.5.0/B4236-360m/B4236_agent_windows/plots/monitor-winagent-Test_stress_B4236_windows-pre-release_Write_Ops.png) ![](https://ci.wazuh.com/data/Test_stress/pre-release/4.5.0/B4236-360m/B4236_agent_windows/plots/Test_stress_B4236_windows_agentd_state_AgentD_Number_of_events_buffered.png) ![](https://ci.wazuh.com/data/Test_stress/pre-release/4.5.0/B4236-360m/B4236_agent_windows/plots/Test_stress_B4236_windows_agentd_state_AgentD_Number_of_generated_events.png) ![](https://ci.wazuh.com/data/Test_stress/pre-release/4.5.0/B4236-360m/B4236_agent_windows/plots/Test_stress_B4236_windows_agentd_state_AgentD_Number_of_messages.png) ![](https://ci.wazuh.com/data/Test_stress/pre-release/4.5.0/B4236-360m/B4236_agent_windows/plots/Test_stress_B4236_windows_agentd_state_AgentD_Status.png) </details> + <details><summary>Logs and configuration</summary> [ossec_Test_stress_B4236_windows_2023-08-05.zip](https://ci.wazuh.com/data/Test_stress/pre-release/4.5.0/B4236-360m/B4236_agent_windows/logs/ossec_Test_stress_B4236_windows_2023-08-05.zip) </details> + <details><summary>CSV</summary> [monitor-winagent-Test_stress_B4236_windows-pre-release.csv](https://ci.wazuh.com/data/Test_stress/pre-release/4.5.0/B4236-360m/B4236_agent_windows/data/monitor-winagent-Test_stress_B4236_windows-pre-release.csv) [Test_stress_B4236_windows_agentd_state.csv](https://ci.wazuh.com/data/Test_stress/pre-release/4.5.0/B4236-360m/B4236_agent_windows/data/Test_stress_B4236_windows_agentd_state.csv) </details> </details> <details><summary>macOS agent</summary> + <details><summary>Plots</summary> </details> + <details><summary>Logs and configuration</summary> </details> + <details><summary>CSV</summary> </details> </details> <details><summary>Solaris agent</summary> + <details><summary>Plots</summary> </details> + <details><summary>Logs and configuration</summary> </details> + <details><summary>CSV</summary> </details> </details>
non_process
release revision release candidate footprint metrics all except vulnerability detector footprint metrics information main release candidate issue main footprint metrics issue version release candidate tag stress test documentation packages used repository packages dev wazuh com package path pre release package revision jenkins build manager plots logs and configuration csv centos agent plots logs and configuration csv ubuntu agent plots logs and configuration csv windows agent plots logs and configuration csv macos agent plots logs and configuration csv solaris agent plots logs and configuration csv
0
469,989
13,529,180,566
IssuesEvent
2020-09-15 17:52:52
near/near-wallet
https://api.github.com/repos/near/near-wallet
closed
Prevent user from entering @ on all account name fields
Priority 1
### Overview Several users are unclear as to whether or not the `@` character is part of their account name. Given `@` is an invalid character for account names, we should prevent users from entering it on any account name fields. ### Acceptance Criteria - [ ] Users are unable to enter `@` in the following account name fields - [ ] `/create` - [ ] `/send-money` - [ ] `/recover-with-seed` - [ ] `/sign-in-ledger`
1.0
Prevent user from entering @ on all account name fields - ### Overview Several users are unclear as to whether or not the `@` character is part of their account name. Given `@` is an invalid character for account names, we should prevent users from entering it on any account name fields. ### Acceptance Criteria - [ ] Users are unable to enter `@` in the following account name fields - [ ] `/create` - [ ] `/send-money` - [ ] `/recover-with-seed` - [ ] `/sign-in-ledger`
non_process
prevent user from entering on all account name fields overview several users are unclear as to whether or not the character is part of their account name given is an invalid character for account names we should prevent users from entering it on any account name fields acceptance criteria users are unable to enter in the following account name fields create send money recover with seed sign in ledger
0
227,405
17,382,006,694
IssuesEvent
2021-07-31 22:56:02
laminas/laminas-cache
https://api.github.com/repos/laminas/laminas-cache
closed
Missing docs about APCU storage
Documentation Help Wanted
In the adapter overview https://github.com/zendframework/zend-cache/blob/master/docs/book/storage/adapter.md the APCU storage adapter is missing. --- Originally posted by @dol at https://github.com/zendframework/zend-cache/issues/170
1.0
Missing docs about APCU storage - In the adapter overview https://github.com/zendframework/zend-cache/blob/master/docs/book/storage/adapter.md the APCU storage adapter is missing. --- Originally posted by @dol at https://github.com/zendframework/zend-cache/issues/170
non_process
missing docs about apcu storage in the adapter overview the apcu storage adapter is missing originally posted by dol at
0
293,008
22,041,982,512
IssuesEvent
2022-05-29 13:49:03
DIT113-V22/group-12
https://api.github.com/repos/DIT113-V22/group-12
closed
Finalize features specification
documentation High Improvement
Acceptance criteria: - [x] All GitHub issues are linked to their respective milestones. - [x] Each milestone has an example and usage scenario of the requirements.
1.0
Finalize features specification - Acceptance criteria: - [x] All GitHub issues are linked to their respective milestones. - [x] Each milestone has an example and usage scenario of the requirements.
non_process
finalize features specification acceptance criteria all github issues are linked to their respective milestones each milestone has an example and usage scenario of the requirements
0
15,504
2,858,497,987
IssuesEvent
2015-06-03 03:06:12
michaelcdillon/ice4j
https://api.github.com/repos/michaelcdillon/ice4j
closed
Deadlock in PsuedoTCP
auto-migrated Priority-Medium Type-Defect
``` We are occasionally getting stuck in deadlocks when using PsuedoTCP. We haven't been able to create a an easily reproducible example but we have logged the thread-dump from one occasion when it occurs. MessageReadThread@6558 daemon, prio=5, in group 'main', status: 'MONITOR' blocks PseudoTcpReceiveThread@6559 blocks pool-40-thread-5@5853 waiting for PseudoTcpReceiveThread@6559 to release lock on <0x1f78> (a org.ice4j.pseudotcp.PseudoTCPBase) at org.ice4j.pseudotcp.PseudoTCPBase.recv(PseudoTCPBase.java:591) at org.ice4j.pseudotcp.PseudoTcpSocketImpl$PseudoTcpInputStream.read(PseudoTcpSocketImpl.java:761) at sun.security.ssl.InputRecord.readFully(InputRecord.java:442) at sun.security.ssl.InputRecord.readV3Record(InputRecord.java:554) at sun.security.ssl.InputRecord.read(InputRecord.java:509) at sun.security.ssl.SSLSocketImpl.readRecord(SSLSocketImpl.java:927) at sun.security.ssl.SSLSocketImpl.readDataRecord(SSLSocketImpl.java:884) at sun.security.ssl.AppInputStream.read(AppInputStream.java:102) at java.io.FilterInputStream.read(FilterInputStream.java:133) at com.google.protobuf.AbstractMessageLite$Builder$LimitedInputStream.read(AbstractMessageLite.java:263) at com.google.protobuf.CodedInputStream.readRawBytes(CodedInputStream.java:851) at com.google.protobuf.CodedInputStream.readBytes(CodedInputStream.java:329) at com.degoo.protocol.ClientProtos$NetworkMessage.<init>(ClientProtos.java:8927) at com.degoo.protocol.ClientProtos$NetworkMessage.<init>(ClientProtos.java:8866) at com.degoo.protocol.ClientProtos$NetworkMessage$1.parsePartialFrom(ClientProtos.java:8960) at com.degoo.protocol.ClientProtos$NetworkMessage$1.parsePartialFrom(ClientProtos.java:8955) at com.google.protobuf.AbstractParser.parsePartialFrom(AbstractParser.java:200) at com.google.protobuf.AbstractParser.parsePartialDelimitedFrom(AbstractParser.java:241) at com.google.protobuf.AbstractParser.parseDelimitedFrom(AbstractParser.java:253) at 
com.google.protobuf.AbstractParser.parseDelimitedFrom(AbstractParser.java:259) at com.google.protobuf.AbstractParser.parseDelimitedFrom(AbstractParser.java:49) at com.degoo.protocol.ClientProtos$NetworkMessage.parseDelimitedFrom(ClientProtos.java:9133) at com.degoo.backend.ice4j.NATConnection$MessageReadThread.run(NATConnection.java:365) PseudoTcpReceiveThread@6559 daemon, prio=5, in group 'main', status: 'MONITOR' blocks PseudoTcpClockThread@6560 blocks MessageReadThread@6558 waiting for MessageReadThread@6558 to release lock on <0x1f84> (a java.lang.Object) at org.ice4j.pseudotcp.PseudoTcpSocketImpl.releaseAllLocks(PseudoTcpSocketImpl.java:493) at org.ice4j.pseudotcp.PseudoTcpSocketImpl.onTcpClosed(PseudoTcpSocketImpl.java:484) at org.ice4j.pseudotcp.PseudoTCPBase.closedown(PseudoTCPBase.java:1574) at org.ice4j.pseudotcp.PseudoTCPBase.process(PseudoTCPBase.java:927) at org.ice4j.pseudotcp.PseudoTCPBase.parse(PseudoTCPBase.java:852) at org.ice4j.pseudotcp.PseudoTCPBase.notifyPacket(PseudoTCPBase.java:486) at org.ice4j.pseudotcp.PseudoTcpSocketImpl.receivePackets(PseudoTcpSocketImpl.java:572) at org.ice4j.pseudotcp.PseudoTcpSocketImpl.access$000(PseudoTcpSocketImpl.java:18) at org.ice4j.pseudotcp.PseudoTcpSocketImpl$1.run(PseudoTcpSocketImpl.java:401) at java.lang.Thread.run(Thread.java:722) We are using revision #r351 and are running it on Java 7 and Windows 7. I'm guessing that there's some place in the code where the synchronization doesn't occur in the correct order, thus causing a circular lock dependency that creates this deadlock. ``` Original issue reported on code.google.com by `c...@degoo.com` on 4 Sep 2013 at 10:00
1.0
Deadlock in PsuedoTCP - ``` We are occasionally getting stuck in deadlocks when using PsuedoTCP. We haven't been able to create a an easily reproducible example but we have logged the thread-dump from one occasion when it occurs. MessageReadThread@6558 daemon, prio=5, in group 'main', status: 'MONITOR' blocks PseudoTcpReceiveThread@6559 blocks pool-40-thread-5@5853 waiting for PseudoTcpReceiveThread@6559 to release lock on <0x1f78> (a org.ice4j.pseudotcp.PseudoTCPBase) at org.ice4j.pseudotcp.PseudoTCPBase.recv(PseudoTCPBase.java:591) at org.ice4j.pseudotcp.PseudoTcpSocketImpl$PseudoTcpInputStream.read(PseudoTcpSocketImpl.java:761) at sun.security.ssl.InputRecord.readFully(InputRecord.java:442) at sun.security.ssl.InputRecord.readV3Record(InputRecord.java:554) at sun.security.ssl.InputRecord.read(InputRecord.java:509) at sun.security.ssl.SSLSocketImpl.readRecord(SSLSocketImpl.java:927) at sun.security.ssl.SSLSocketImpl.readDataRecord(SSLSocketImpl.java:884) at sun.security.ssl.AppInputStream.read(AppInputStream.java:102) at java.io.FilterInputStream.read(FilterInputStream.java:133) at com.google.protobuf.AbstractMessageLite$Builder$LimitedInputStream.read(AbstractMessageLite.java:263) at com.google.protobuf.CodedInputStream.readRawBytes(CodedInputStream.java:851) at com.google.protobuf.CodedInputStream.readBytes(CodedInputStream.java:329) at com.degoo.protocol.ClientProtos$NetworkMessage.<init>(ClientProtos.java:8927) at com.degoo.protocol.ClientProtos$NetworkMessage.<init>(ClientProtos.java:8866) at com.degoo.protocol.ClientProtos$NetworkMessage$1.parsePartialFrom(ClientProtos.java:8960) at com.degoo.protocol.ClientProtos$NetworkMessage$1.parsePartialFrom(ClientProtos.java:8955) at com.google.protobuf.AbstractParser.parsePartialFrom(AbstractParser.java:200) at com.google.protobuf.AbstractParser.parsePartialDelimitedFrom(AbstractParser.java:241) at com.google.protobuf.AbstractParser.parseDelimitedFrom(AbstractParser.java:253) at 
com.google.protobuf.AbstractParser.parseDelimitedFrom(AbstractParser.java:259) at com.google.protobuf.AbstractParser.parseDelimitedFrom(AbstractParser.java:49) at com.degoo.protocol.ClientProtos$NetworkMessage.parseDelimitedFrom(ClientProtos.java:9133) at com.degoo.backend.ice4j.NATConnection$MessageReadThread.run(NATConnection.java:365) PseudoTcpReceiveThread@6559 daemon, prio=5, in group 'main', status: 'MONITOR' blocks PseudoTcpClockThread@6560 blocks MessageReadThread@6558 waiting for MessageReadThread@6558 to release lock on <0x1f84> (a java.lang.Object) at org.ice4j.pseudotcp.PseudoTcpSocketImpl.releaseAllLocks(PseudoTcpSocketImpl.java:493) at org.ice4j.pseudotcp.PseudoTcpSocketImpl.onTcpClosed(PseudoTcpSocketImpl.java:484) at org.ice4j.pseudotcp.PseudoTCPBase.closedown(PseudoTCPBase.java:1574) at org.ice4j.pseudotcp.PseudoTCPBase.process(PseudoTCPBase.java:927) at org.ice4j.pseudotcp.PseudoTCPBase.parse(PseudoTCPBase.java:852) at org.ice4j.pseudotcp.PseudoTCPBase.notifyPacket(PseudoTCPBase.java:486) at org.ice4j.pseudotcp.PseudoTcpSocketImpl.receivePackets(PseudoTcpSocketImpl.java:572) at org.ice4j.pseudotcp.PseudoTcpSocketImpl.access$000(PseudoTcpSocketImpl.java:18) at org.ice4j.pseudotcp.PseudoTcpSocketImpl$1.run(PseudoTcpSocketImpl.java:401) at java.lang.Thread.run(Thread.java:722) We are using revision #r351 and are running it on Java 7 and Windows 7. I'm guessing that there's some place in the code where the synchronization doesn't occur in the correct order, thus causing a circular lock dependency that creates this deadlock. ``` Original issue reported on code.google.com by `c...@degoo.com` on 4 Sep 2013 at 10:00
non_process
deadlock in psuedotcp we are occasionally getting stuck in deadlocks when using psuedotcp we haven t been able to create a an easily reproducible example but we have logged the thread dump from one occasion when it occurs messagereadthread daemon prio in group main status monitor blocks pseudotcpreceivethread blocks pool thread waiting for pseudotcpreceivethread to release lock on a org pseudotcp pseudotcpbase at org pseudotcp pseudotcpbase recv pseudotcpbase java at org pseudotcp pseudotcpsocketimpl pseudotcpinputstream read pseudotcpsocketimpl java at sun security ssl inputrecord readfully inputrecord java at sun security ssl inputrecord inputrecord java at sun security ssl inputrecord read inputrecord java at sun security ssl sslsocketimpl readrecord sslsocketimpl java at sun security ssl sslsocketimpl readdatarecord sslsocketimpl java at sun security ssl appinputstream read appinputstream java at java io filterinputstream read filterinputstream java at com google protobuf abstractmessagelite builder limitedinputstream read abstractmessagelite java at com google protobuf codedinputstream readrawbytes codedinputstream java at com google protobuf codedinputstream readbytes codedinputstream java at com degoo protocol clientprotos networkmessage clientprotos java at com degoo protocol clientprotos networkmessage clientprotos java at com degoo protocol clientprotos networkmessage parsepartialfrom clientprotos java at com degoo protocol clientprotos networkmessage parsepartialfrom clientprotos java at com google protobuf abstractparser parsepartialfrom abstractparser java at com google protobuf abstractparser parsepartialdelimitedfrom abstractparser java at com google protobuf abstractparser parsedelimitedfrom abstractparser java at com google protobuf abstractparser parsedelimitedfrom abstractparser java at com google protobuf abstractparser parsedelimitedfrom abstractparser java at com degoo protocol clientprotos networkmessage parsedelimitedfrom clientprotos java 
at com degoo backend natconnection messagereadthread run natconnection java pseudotcpreceivethread daemon prio in group main status monitor blocks pseudotcpclockthread blocks messagereadthread waiting for messagereadthread to release lock on a java lang object at org pseudotcp pseudotcpsocketimpl releasealllocks pseudotcpsocketimpl java at org pseudotcp pseudotcpsocketimpl ontcpclosed pseudotcpsocketimpl java at org pseudotcp pseudotcpbase closedown pseudotcpbase java at org pseudotcp pseudotcpbase process pseudotcpbase java at org pseudotcp pseudotcpbase parse pseudotcpbase java at org pseudotcp pseudotcpbase notifypacket pseudotcpbase java at org pseudotcp pseudotcpsocketimpl receivepackets pseudotcpsocketimpl java at org pseudotcp pseudotcpsocketimpl access pseudotcpsocketimpl java at org pseudotcp pseudotcpsocketimpl run pseudotcpsocketimpl java at java lang thread run thread java we are using revision and are running it on java and windows i m guessing that there s some place in the code where the synchronization doesn t occur in the correct order thus causing a circular lock dependency that creates this deadlock original issue reported on code google com by c degoo com on sep at
0
2,526
5,288,478,386
IssuesEvent
2017-02-08 15:14:26
QCoDeS/Qcodes
https://api.github.com/repos/QCoDeS/Qcodes
reopened
aborting a measurement
enhancement long-term mulitprocessing
Currently it is hard to abort a running measurement. Pressing CTRL-C can leave the system (in particular the instruments) in an unusable state. The FAQ (https://github.com/QCoDeS/Qcodes/blob/master/docs/user/faq.rst) only mentiones ipython notebook. The original description (https://github.com/QCoDeS/Qcodes/pull/86/files) is only valid for a background measument, which is disabled by default. PR #463 creates an abort measurement hook that can be used for proper abortion of a measurement. @giulioungaretti @jenshnielsen @WilliamHPNielsen
1.0
aborting a measurement - Currently it is hard to abort a running measurement. Pressing CTRL-C can leave the system (in particular the instruments) in an unusable state. The FAQ (https://github.com/QCoDeS/Qcodes/blob/master/docs/user/faq.rst) only mentiones ipython notebook. The original description (https://github.com/QCoDeS/Qcodes/pull/86/files) is only valid for a background measument, which is disabled by default. PR #463 creates an abort measurement hook that can be used for proper abortion of a measurement. @giulioungaretti @jenshnielsen @WilliamHPNielsen
process
aborting a measurement currently it is hard to abort a running measurement pressing ctrl c can leave the system in particular the instruments in an unusable state the faq only mentiones ipython notebook the original description is only valid for a background measument which is disabled by default pr creates an abort measurement hook that can be used for proper abortion of a measurement giulioungaretti jenshnielsen williamhpnielsen
1
5,552
8,394,068,437
IssuesEvent
2018-10-09 22:44:38
googleapis/google-cloud-java
https://api.github.com/repos/googleapis/google-cloud-java
closed
Reporting configuration should be done in <reporting> section, not in maven-site-plugin <configuration> as reportPlugins parameter.
priority: p2 status: in progress type: bug type: process
While building google-cloud-bom ``` [WARNING] [WARNING] Some problems were encountered while building the effective model for com.google.cloud:google-cloud-core:jar:1.27.1-SNAPSHOT [WARNING] Reporting configuration should be done in <reporting> section, not in maven-site-plugin <configuration> as reportPlugins parameter. [WARNING] [WARNING] It is highly recommended to fix these problems because they threaten the stability of your build. [WARNING] [WARNING] For this reason, future Maven versions might no longer support building such malformed projects. [WARNING] ```
1.0
Reporting configuration should be done in <reporting> section, not in maven-site-plugin <configuration> as reportPlugins parameter. - While building google-cloud-bom ``` [WARNING] [WARNING] Some problems were encountered while building the effective model for com.google.cloud:google-cloud-core:jar:1.27.1-SNAPSHOT [WARNING] Reporting configuration should be done in <reporting> section, not in maven-site-plugin <configuration> as reportPlugins parameter. [WARNING] [WARNING] It is highly recommended to fix these problems because they threaten the stability of your build. [WARNING] [WARNING] For this reason, future Maven versions might no longer support building such malformed projects. [WARNING] ```
process
reporting configuration should be done in section not in maven site plugin as reportplugins parameter while building google cloud bom some problems were encountered while building the effective model for com google cloud google cloud core jar snapshot reporting configuration should be done in section not in maven site plugin as reportplugins parameter it is highly recommended to fix these problems because they threaten the stability of your build for this reason future maven versions might no longer support building such malformed projects
1
21,082
28,034,528,450
IssuesEvent
2023-03-28 14:19:49
OpenEnergyPlatform/open-MaStR
https://api.github.com/repos/OpenEnergyPlatform/open-MaStR
opened
Adding validation scripts to the repository
:scissors: post processing
## Description of the issue We can enrich the validation scripts with existing ones. ## Ideas of solution Todos I can think of: - [ ] Migrating the code - [ ] Readme - [ ] Check if it works in a new environment (second person) ## Workflow checklist - [x] I am aware of the workflow in [CONTRIBUTING.md](https://github.com/OpenEnergyPlatform/open-MaStR/blob/production/CONTRIBUTING.md)
1.0
Adding validation scripts to the repository - ## Description of the issue We can enrich the validation scripts with existing ones. ## Ideas of solution Todos I can think of: - [ ] Migrating the code - [ ] Readme - [ ] Check if it works in a new environment (second person) ## Workflow checklist - [x] I am aware of the workflow in [CONTRIBUTING.md](https://github.com/OpenEnergyPlatform/open-MaStR/blob/production/CONTRIBUTING.md)
process
adding validation scripts to the repository description of the issue we can enrich the validation scripts with existing ones ideas of solution todos i can think of migrating the code readme check if it works in a new environment second person workflow checklist i am aware of the workflow in
1
98,552
11,089,690,442
IssuesEvent
2019-12-14 20:20:37
SpotlightKid/jack-matchmaker
https://api.github.com/repos/SpotlightKid/jack-matchmaker
opened
Add documentation for systemd setup
documentation
* How to install system service * How to configure as system service * How to configure as user service * Environment file settings * Pattern file
1.0
Add documentation for systemd setup - * How to install system service * How to configure as system service * How to configure as user service * Environment file settings * Pattern file
non_process
add documentation for systemd setup how to install system service how to configure as system service how to configure as user service environment file settings pattern file
0
190,764
14,577,407,051
IssuesEvent
2020-12-18 01:58:36
kalexmills/github-vet-tests-dec2020
https://api.github.com/repos/kalexmills/github-vet-tests-dec2020
closed
mraksoll4/lnd: nursery_store_test.go; 3 LoC
fresh test tiny
Found a possible issue in [mraksoll4/lnd](https://www.github.com/mraksoll4/lnd) at [nursery_store_test.go](https://github.com/mraksoll4/lnd/blob/e495a1057c2a4b9e3df37f2bac991cedcd64c89a/nursery_store_test.go#L159-L161) Below is the message reported by the analyzer for this snippet of code. Beware that the analyzer only reports the first issue it finds, so please do not limit your consideration to the contents of the below message. > function call which takes a reference to htlcOutput at line 160 may start a goroutine [Click here to see the code in its original context.](https://github.com/mraksoll4/lnd/blob/e495a1057c2a4b9e3df37f2bac991cedcd64c89a/nursery_store_test.go#L159-L161) <details> <summary>Click here to show the 3 line(s) of Go which triggered the analyzer.</summary> ```go for _, htlcOutput := range test.htlcOutputs { assertCribAtExpiryHeight(t, ns, &htlcOutput) } ``` </details> Leave a reaction on this issue to contribute to the project by classifying this instance as a **Bug** :-1:, **Mitigated** :+1:, or **Desirable Behavior** :rocket: See the descriptions of the classifications [here](https://github.com/github-vet/rangeclosure-findings#how-can-i-help) for more information. commit ID: e495a1057c2a4b9e3df37f2bac991cedcd64c89a
1.0
mraksoll4/lnd: nursery_store_test.go; 3 LoC - Found a possible issue in [mraksoll4/lnd](https://www.github.com/mraksoll4/lnd) at [nursery_store_test.go](https://github.com/mraksoll4/lnd/blob/e495a1057c2a4b9e3df37f2bac991cedcd64c89a/nursery_store_test.go#L159-L161) Below is the message reported by the analyzer for this snippet of code. Beware that the analyzer only reports the first issue it finds, so please do not limit your consideration to the contents of the below message. > function call which takes a reference to htlcOutput at line 160 may start a goroutine [Click here to see the code in its original context.](https://github.com/mraksoll4/lnd/blob/e495a1057c2a4b9e3df37f2bac991cedcd64c89a/nursery_store_test.go#L159-L161) <details> <summary>Click here to show the 3 line(s) of Go which triggered the analyzer.</summary> ```go for _, htlcOutput := range test.htlcOutputs { assertCribAtExpiryHeight(t, ns, &htlcOutput) } ``` </details> Leave a reaction on this issue to contribute to the project by classifying this instance as a **Bug** :-1:, **Mitigated** :+1:, or **Desirable Behavior** :rocket: See the descriptions of the classifications [here](https://github.com/github-vet/rangeclosure-findings#how-can-i-help) for more information. commit ID: e495a1057c2a4b9e3df37f2bac991cedcd64c89a
non_process
lnd nursery store test go loc found a possible issue in at below is the message reported by the analyzer for this snippet of code beware that the analyzer only reports the first issue it finds so please do not limit your consideration to the contents of the below message function call which takes a reference to htlcoutput at line may start a goroutine click here to show the line s of go which triggered the analyzer go for htlcoutput range test htlcoutputs assertcribatexpiryheight t ns htlcoutput leave a reaction on this issue to contribute to the project by classifying this instance as a bug mitigated or desirable behavior rocket see the descriptions of the classifications for more information commit id
0
3,203
6,262,551,033
IssuesEvent
2017-07-15 11:54:30
coala/teams
https://api.github.com/repos/coala/teams
closed
Aspects team member application: Asnel Christian
process/approved
# Bio I am Asnel Christian Ngoulla, a computer science undergraduate student from Cameroon. I love python, java, machine learning and computer graphics. I am not funny... but i like hanging around with people who are ( ;) like @adtac, @Adrianzatreanu ... :sweat_smile: ). # coala Contributions so far I have contributed some codes to coala, coala-bears and coala-utils, and must of what i did had to do with settings, bears and Requirements... # Road to the Future I would really love to see the aspects being implemented and used, and what i plan to do is help with their implementation. # Specific Responsibilities Writing tests.
1.0
Aspects team member application: Asnel Christian - # Bio I am Asnel Christian Ngoulla, a computer science undergraduate student from Cameroon. I love python, java, machine learning and computer graphics. I am not funny... but i like hanging around with people who are ( ;) like @adtac, @Adrianzatreanu ... :sweat_smile: ). # coala Contributions so far I have contributed some codes to coala, coala-bears and coala-utils, and must of what i did had to do with settings, bears and Requirements... # Road to the Future I would really love to see the aspects being implemented and used, and what i plan to do is help with their implementation. # Specific Responsibilities Writing tests.
process
aspects team member application asnel christian bio i am asnel christian ngoulla a computer science undergraduate student from cameroon i love python java machine learning and computer graphics i am not funny but i like hanging around with people who are like adtac adrianzatreanu sweat smile coala contributions so far i have contributed some codes to coala coala bears and coala utils and must of what i did had to do with settings bears and requirements road to the future i would really love to see the aspects being implemented and used and what i plan to do is help with their implementation specific responsibilities writing tests
1
9,367
12,372,659,620
IssuesEvent
2020-05-18 20:49:59
googleapis/google-cloud-go
https://api.github.com/repos/googleapis/google-cloud-go
closed
monitoring: re-enable generation after breaking change comes through
api: monitoring type: process
There is a breaking change coming to this library. We will need to": - stop generation - Mark v1(of apiv3) as deprecated and re-enable generation under v2 after breaking change is merged
1.0
monitoring: re-enable generation after breaking change comes through - There is a breaking change coming to this library. We will need to": - stop generation - Mark v1(of apiv3) as deprecated and re-enable generation under v2 after breaking change is merged
process
monitoring re enable generation after breaking change comes through there is a breaking change coming to this library we will need to stop generation mark of as deprecated and re enable generation under after breaking change is merged
1
45,862
9,820,759,176
IssuesEvent
2019-06-14 04:16:43
WheezePuppet/specstar
https://api.github.com/repos/WheezePuppet/specstar
closed
Output number/size of graph components
analysis code phase 1
When param_sweep.jl is executed, include among the output written to files the sizes (in # of agents) of each graph component. (Probably in same file as produced by #39.)
1.0
Output number/size of graph components - When param_sweep.jl is executed, include among the output written to files the sizes (in # of agents) of each graph component. (Probably in same file as produced by #39.)
non_process
output number size of graph components when param sweep jl is executed include among the output written to files the sizes in of agents of each graph component probably in same file as produced by
0
8,352
11,502,558,749
IssuesEvent
2020-02-12 19:20:31
fraction/oasis
https://api.github.com/repos/fraction/oasis
closed
Convert roadmap into issues
process question
**What's the problem you want solved?** We have a great roadmap but there aren't Github issues for the outstanding tasks. **Is there a solution you'd like to recommend?** Convert remaining points on roadmap into issues to track progress and document roadblocks and speed bumps. Shall we rename to cycle map lol? I hate :car:
1.0
Convert roadmap into issues - **What's the problem you want solved?** We have a great roadmap but there aren't Github issues for the outstanding tasks. **Is there a solution you'd like to recommend?** Convert remaining points on roadmap into issues to track progress and document roadblocks and speed bumps. Shall we rename to cycle map lol? I hate :car:
process
convert roadmap into issues what s the problem you want solved we have a great roadmap but there aren t github issues for the outstanding tasks is there a solution you d like to recommend convert remaining points on roadmap into issues to track progress and document roadblocks and speed bumps shall we rename to cycle map lol i hate car
1
132,098
18,266,107,377
IssuesEvent
2021-10-04 08:38:48
artsking/linux-3.0.35_CVE-2020-15436_withPatch
https://api.github.com/repos/artsking/linux-3.0.35_CVE-2020-15436_withPatch
closed
CVE-2020-10769 (Medium) detected in linux-stable-rtv3.8.6 - autoclosed
security vulnerability
## CVE-2020-10769 - Medium Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>linux-stable-rtv3.8.6</b></p></summary> <p> <p>Julia Cartwright's fork of linux-stable-rt.git</p> <p>Library home page: <a href=https://git.kernel.org/pub/scm/linux/kernel/git/julia/linux-stable-rt.git>https://git.kernel.org/pub/scm/linux/kernel/git/julia/linux-stable-rt.git</a></p> <p>Found in HEAD commit: <a href="https://github.com/artsking/linux-3.0.35_CVE-2020-15436_withPatch/commit/594a70cb9871ddd73cf61197bb1a2a1b1777a7ae">594a70cb9871ddd73cf61197bb1a2a1b1777a7ae</a></p> <p>Found in base branch: <b>master</b></p></p> </details> </p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Source Files (3)</summary> <p></p> <p> <img src='https://s3.amazonaws.com/wss-public/bitbucketImages/xRedImage.png' width=19 height=20> <b>/crypto/authenc.c</b> <img src='https://s3.amazonaws.com/wss-public/bitbucketImages/xRedImage.png' width=19 height=20> <b>/crypto/authenc.c</b> <img src='https://s3.amazonaws.com/wss-public/bitbucketImages/xRedImage.png' width=19 height=20> <b>/crypto/authenc.c</b> </p> </details> <p></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary> <p> A buffer over-read flaw was found in RH kernel versions before 5.0 in crypto_authenc_extractkeys in crypto/authenc.c in the IPsec Cryptographic algorithm's module, authenc. When a payload longer than 4 bytes, and is not following 4-byte alignment boundary guidelines, it causes a buffer over-read threat, leading to a system crash. This flaw allows a local attacker with user privileges to cause a denial of service. 
<p>Publish Date: 2020-06-26 <p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-10769>CVE-2020-10769</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>5.5</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Local - Attack Complexity: Low - Privileges Required: Low - User Interaction: None - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: None - Integrity Impact: None - Availability Impact: High </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2020-10769">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2020-10769</a></p> <p>Release Date: 2020-06-26</p> <p>Fix Resolution: v5.0-rc3</p> </p> </details> <p></p> *** Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
True
CVE-2020-10769 (Medium) detected in linux-stable-rtv3.8.6 - autoclosed - ## CVE-2020-10769 - Medium Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>linux-stable-rtv3.8.6</b></p></summary> <p> <p>Julia Cartwright's fork of linux-stable-rt.git</p> <p>Library home page: <a href=https://git.kernel.org/pub/scm/linux/kernel/git/julia/linux-stable-rt.git>https://git.kernel.org/pub/scm/linux/kernel/git/julia/linux-stable-rt.git</a></p> <p>Found in HEAD commit: <a href="https://github.com/artsking/linux-3.0.35_CVE-2020-15436_withPatch/commit/594a70cb9871ddd73cf61197bb1a2a1b1777a7ae">594a70cb9871ddd73cf61197bb1a2a1b1777a7ae</a></p> <p>Found in base branch: <b>master</b></p></p> </details> </p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Source Files (3)</summary> <p></p> <p> <img src='https://s3.amazonaws.com/wss-public/bitbucketImages/xRedImage.png' width=19 height=20> <b>/crypto/authenc.c</b> <img src='https://s3.amazonaws.com/wss-public/bitbucketImages/xRedImage.png' width=19 height=20> <b>/crypto/authenc.c</b> <img src='https://s3.amazonaws.com/wss-public/bitbucketImages/xRedImage.png' width=19 height=20> <b>/crypto/authenc.c</b> </p> </details> <p></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary> <p> A buffer over-read flaw was found in RH kernel versions before 5.0 in crypto_authenc_extractkeys in crypto/authenc.c in the IPsec Cryptographic algorithm's module, authenc. When a payload longer than 4 bytes, and is not following 4-byte alignment boundary guidelines, it causes a buffer over-read threat, leading to a system crash. This flaw allows a local attacker with user privileges to cause a denial of service. 
<p>Publish Date: 2020-06-26 <p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-10769>CVE-2020-10769</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>5.5</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Local - Attack Complexity: Low - Privileges Required: Low - User Interaction: None - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: None - Integrity Impact: None - Availability Impact: High </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2020-10769">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2020-10769</a></p> <p>Release Date: 2020-06-26</p> <p>Fix Resolution: v5.0-rc3</p> </p> </details> <p></p> *** Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
non_process
cve medium detected in linux stable autoclosed cve medium severity vulnerability vulnerable library linux stable julia cartwright s fork of linux stable rt git library home page a href found in head commit a href found in base branch master vulnerable source files crypto authenc c crypto authenc c crypto authenc c vulnerability details a buffer over read flaw was found in rh kernel versions before in crypto authenc extractkeys in crypto authenc c in the ipsec cryptographic algorithm s module authenc when a payload longer than bytes and is not following byte alignment boundary guidelines it causes a buffer over read threat leading to a system crash this flaw allows a local attacker with user privileges to cause a denial of service publish date url a href cvss score details base score metrics exploitability metrics attack vector local attack complexity low privileges required low user interaction none scope unchanged impact metrics confidentiality impact none integrity impact none availability impact high for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution step up your open source security game with whitesource
0
112,380
24,260,380,298
IssuesEvent
2022-09-27 21:58:10
microsoft/pxt-arcade
https://api.github.com/repos/microsoft/pxt-arcade
closed
Whack the mole skillmap has grey block in the hints
hour of code
Get Animated tutorial->Step 6->hint ![image](https://user-images.githubusercontent.com/6107272/192617129-4108d101-630f-4a5c-ba3a-ad030e2b6b0d.png)
1.0
Whack the mole skillmap has grey block in the hints - Get Animated tutorial->Step 6->hint ![image](https://user-images.githubusercontent.com/6107272/192617129-4108d101-630f-4a5c-ba3a-ad030e2b6b0d.png)
non_process
whack the mole skillmap has grey block in the hints get animated tutorial step hint
0
220,042
17,146,198,932
IssuesEvent
2021-07-13 14:50:20
elastic/github-actions
https://api.github.com/repos/elastic/github-actions
closed
Project assigner cannot access projects belonging to organization
wf_test
I tried setting up the project assigner GitHub action in the `elastic/elasticsearch` repo to assign issues to [this project](https://github.com/orgs/elastic/projects/245) but I ended up getting the following error: ``` Creating a new card for open Issue #567714485 in project [Elasticsearch Build Engineering] column 8061652 mathing label [:Core/Infra/Build], labeled by mark-vieira ##[error]Error adding Issue #567714485 to project Elasticsearch Build Engineering column 8061652: Resource not accessible by integration ``` I think the issue is that the project board in question belongs to the `elastic` or, not the `elastic/elasticsearch` repo. The reason for this is because the board is linked to multiple repos as it's used to coordinate work across more than just a single repo. It would be nice for the GH action to support such a scenario if at all possible.
1.0
Project assigner cannot access projects belonging to organization - I tried setting up the project assigner GitHub action in the `elastic/elasticsearch` repo to assign issues to [this project](https://github.com/orgs/elastic/projects/245) but I ended up getting the following error: ``` Creating a new card for open Issue #567714485 in project [Elasticsearch Build Engineering] column 8061652 mathing label [:Core/Infra/Build], labeled by mark-vieira ##[error]Error adding Issue #567714485 to project Elasticsearch Build Engineering column 8061652: Resource not accessible by integration ``` I think the issue is that the project board in question belongs to the `elastic` or, not the `elastic/elasticsearch` repo. The reason for this is because the board is linked to multiple repos as it's used to coordinate work across more than just a single repo. It would be nice for the GH action to support such a scenario if at all possible.
non_process
project assigner cannot access projects belonging to organization i tried setting up the project assigner github action in the elastic elasticsearch repo to assign issues to but i ended up getting the following error creating a new card for open issue in project column mathing label labeled by mark vieira error adding issue to project elasticsearch build engineering column resource not accessible by integration i think the issue is that the project board in question belongs to the elastic or not the elastic elasticsearch repo the reason for this is because the board is linked to multiple repos as it s used to coordinate work across more than just a single repo it would be nice for the gh action to support such a scenario if at all possible
0
273,147
29,800,498,305
IssuesEvent
2023-06-16 07:44:05
billmcchesney1/foxtrot
https://api.github.com/repos/billmcchesney1/foxtrot
closed
CVE-2020-27223 (Medium) detected in jetty-http-9.4.18.v20190429.jar, jetty-http-9.4.11.v20180605.jar - autoclosed
Mend: dependency security vulnerability
## CVE-2020-27223 - Medium Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Libraries - <b>jetty-http-9.4.18.v20190429.jar</b>, <b>jetty-http-9.4.11.v20180605.jar</b></p></summary> <p> <details><summary><b>jetty-http-9.4.18.v20190429.jar</b></p></summary> <p>The Eclipse Jetty Project</p> <p>Library home page: <a href="http://www.eclipse.org/jetty">http://www.eclipse.org/jetty</a></p> <p>Path to dependency file: /foxtrot-translator/pom.xml</p> <p>Path to vulnerable library: /home/wss-scanner/.m2/repository/org/eclipse/jetty/jetty-http/9.4.18.v20190429/jetty-http-9.4.18.v20190429.jar,/home/wss-scanner/.m2/repository/org/eclipse/jetty/jetty-http/9.4.18.v20190429/jetty-http-9.4.18.v20190429.jar,/home/wss-scanner/.m2/repository/org/eclipse/jetty/jetty-http/9.4.18.v20190429/jetty-http-9.4.18.v20190429.jar,/home/wss-scanner/.m2/repository/org/eclipse/jetty/jetty-http/9.4.18.v20190429/jetty-http-9.4.18.v20190429.jar</p> <p> Dependency Hierarchy: - dropwizard-core-1.3.13.jar (Root Library) - dropwizard-jetty-1.3.13.jar - :x: **jetty-http-9.4.18.v20190429.jar** (Vulnerable Library) </details> <details><summary><b>jetty-http-9.4.11.v20180605.jar</b></p></summary> <p>The Eclipse Jetty Project</p> <p>Library home page: <a href="http://www.eclipse.org/jetty">http://www.eclipse.org/jetty</a></p> <p>Path to dependency file: /foxtrot-common/pom.xml</p> <p>Path to vulnerable library: /home/wss-scanner/.m2/repository/org/eclipse/jetty/jetty-http/9.4.11.v20180605/jetty-http-9.4.11.v20180605.jar</p> <p> Dependency Hierarchy: - dropwizard-swagger-1.3.7-1.jar (Root Library) - dropwizard-core-1.3.7.jar - dropwizard-jetty-1.3.7.jar - :x: **jetty-http-9.4.11.v20180605.jar** (Vulnerable Library) </details> <p>Found in HEAD commit: <a href="https://github.com/billmcchesney1/foxtrot/commit/ffb8a6014463ce8aac1bf6e7dc9a23fc4a2a8adc">ffb8a6014463ce8aac1bf6e7dc9a23fc4a2a8adc</a></p> 
<p>Found in base branch: <b>master</b></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png?' width=19 height=20> Vulnerability Details</summary> <p> In Eclipse Jetty 9.4.6.v20170531 to 9.4.36.v20210114 (inclusive), 10.0.0, and 11.0.0 when Jetty handles a request containing multiple Accept headers with a large number of “quality” (i.e. q) parameters, the server may enter a denial of service (DoS) state due to high CPU usage processing those quality values, resulting in minutes of CPU time exhausted processing those quality values. <p>Publish Date: 2021-02-26 <p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2020-27223>CVE-2020-27223</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>5.3</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: Low - Privileges Required: None - User Interaction: None - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: None - Integrity Impact: None - Availability Impact: Low </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. 
</p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://github.com/eclipse/jetty.project/security/advisories/GHSA-m394-8rww-3jr7">https://github.com/eclipse/jetty.project/security/advisories/GHSA-m394-8rww-3jr7</a></p> <p>Release Date: 2021-02-26</p> <p>Fix Resolution (org.eclipse.jetty:jetty-http): 9.4.37.v20210219</p> <p>Direct dependency fix Resolution (io.dropwizard:dropwizard-core): 2.0.0-rc0+test8</p><p>Fix Resolution (org.eclipse.jetty:jetty-http): 9.4.37.v20210219</p> <p>Direct dependency fix Resolution (com.smoketurner:dropwizard-swagger): 2.1.4-1</p> </p> </details> <p></p> *** <!-- REMEDIATE-OPEN-PR-START --> - [ ] Check this box to open an automated fix PR <!-- REMEDIATE-OPEN-PR-END -->
True
CVE-2020-27223 (Medium) detected in jetty-http-9.4.18.v20190429.jar, jetty-http-9.4.11.v20180605.jar - autoclosed - ## CVE-2020-27223 - Medium Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Libraries - <b>jetty-http-9.4.18.v20190429.jar</b>, <b>jetty-http-9.4.11.v20180605.jar</b></p></summary> <p> <details><summary><b>jetty-http-9.4.18.v20190429.jar</b></p></summary> <p>The Eclipse Jetty Project</p> <p>Library home page: <a href="http://www.eclipse.org/jetty">http://www.eclipse.org/jetty</a></p> <p>Path to dependency file: /foxtrot-translator/pom.xml</p> <p>Path to vulnerable library: /home/wss-scanner/.m2/repository/org/eclipse/jetty/jetty-http/9.4.18.v20190429/jetty-http-9.4.18.v20190429.jar,/home/wss-scanner/.m2/repository/org/eclipse/jetty/jetty-http/9.4.18.v20190429/jetty-http-9.4.18.v20190429.jar,/home/wss-scanner/.m2/repository/org/eclipse/jetty/jetty-http/9.4.18.v20190429/jetty-http-9.4.18.v20190429.jar,/home/wss-scanner/.m2/repository/org/eclipse/jetty/jetty-http/9.4.18.v20190429/jetty-http-9.4.18.v20190429.jar</p> <p> Dependency Hierarchy: - dropwizard-core-1.3.13.jar (Root Library) - dropwizard-jetty-1.3.13.jar - :x: **jetty-http-9.4.18.v20190429.jar** (Vulnerable Library) </details> <details><summary><b>jetty-http-9.4.11.v20180605.jar</b></p></summary> <p>The Eclipse Jetty Project</p> <p>Library home page: <a href="http://www.eclipse.org/jetty">http://www.eclipse.org/jetty</a></p> <p>Path to dependency file: /foxtrot-common/pom.xml</p> <p>Path to vulnerable library: /home/wss-scanner/.m2/repository/org/eclipse/jetty/jetty-http/9.4.11.v20180605/jetty-http-9.4.11.v20180605.jar</p> <p> Dependency Hierarchy: - dropwizard-swagger-1.3.7-1.jar (Root Library) - dropwizard-core-1.3.7.jar - dropwizard-jetty-1.3.7.jar - :x: **jetty-http-9.4.11.v20180605.jar** (Vulnerable Library) </details> <p>Found in HEAD commit: <a 
href="https://github.com/billmcchesney1/foxtrot/commit/ffb8a6014463ce8aac1bf6e7dc9a23fc4a2a8adc">ffb8a6014463ce8aac1bf6e7dc9a23fc4a2a8adc</a></p> <p>Found in base branch: <b>master</b></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png?' width=19 height=20> Vulnerability Details</summary> <p> In Eclipse Jetty 9.4.6.v20170531 to 9.4.36.v20210114 (inclusive), 10.0.0, and 11.0.0 when Jetty handles a request containing multiple Accept headers with a large number of “quality” (i.e. q) parameters, the server may enter a denial of service (DoS) state due to high CPU usage processing those quality values, resulting in minutes of CPU time exhausted processing those quality values. <p>Publish Date: 2021-02-26 <p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2020-27223>CVE-2020-27223</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>5.3</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: Low - Privileges Required: None - User Interaction: None - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: None - Integrity Impact: None - Availability Impact: Low </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. 
</p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://github.com/eclipse/jetty.project/security/advisories/GHSA-m394-8rww-3jr7">https://github.com/eclipse/jetty.project/security/advisories/GHSA-m394-8rww-3jr7</a></p> <p>Release Date: 2021-02-26</p> <p>Fix Resolution (org.eclipse.jetty:jetty-http): 9.4.37.v20210219</p> <p>Direct dependency fix Resolution (io.dropwizard:dropwizard-core): 2.0.0-rc0+test8</p><p>Fix Resolution (org.eclipse.jetty:jetty-http): 9.4.37.v20210219</p> <p>Direct dependency fix Resolution (com.smoketurner:dropwizard-swagger): 2.1.4-1</p> </p> </details> <p></p> *** <!-- REMEDIATE-OPEN-PR-START --> - [ ] Check this box to open an automated fix PR <!-- REMEDIATE-OPEN-PR-END -->
non_process
cve medium detected in jetty http jar jetty http jar autoclosed cve medium severity vulnerability vulnerable libraries jetty http jar jetty http jar jetty http jar the eclipse jetty project library home page a href path to dependency file foxtrot translator pom xml path to vulnerable library home wss scanner repository org eclipse jetty jetty http jetty http jar home wss scanner repository org eclipse jetty jetty http jetty http jar home wss scanner repository org eclipse jetty jetty http jetty http jar home wss scanner repository org eclipse jetty jetty http jetty http jar dependency hierarchy dropwizard core jar root library dropwizard jetty jar x jetty http jar vulnerable library jetty http jar the eclipse jetty project library home page a href path to dependency file foxtrot common pom xml path to vulnerable library home wss scanner repository org eclipse jetty jetty http jetty http jar dependency hierarchy dropwizard swagger jar root library dropwizard core jar dropwizard jetty jar x jetty http jar vulnerable library found in head commit a href found in base branch master vulnerability details in eclipse jetty to inclusive and when jetty handles a request containing multiple accept headers with a large number of “quality” i e q parameters the server may enter a denial of service dos state due to high cpu usage processing those quality values resulting in minutes of cpu time exhausted processing those quality values publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction none scope unchanged impact metrics confidentiality impact none integrity impact none availability impact low for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution org eclipse jetty jetty http direct dependency fix resolution io dropwizard dropwizard core fix resolution org eclipse jetty jetty http direct dependency 
fix resolution com smoketurner dropwizard swagger check this box to open an automated fix pr
0
341,996
24,724,753,222
IssuesEvent
2022-10-20 13:21:10
exasol/script-languages-developer-sandbox
https://api.github.com/repos/exasol/script-languages-developer-sandbox
closed
Mention the AMI copy function in tutorial and release notes
documentation
## Background The AMI is only available in the AWS region where the release process runs. We need to mention this fact in the tutorial and the release notes. Also we should link to the AWS docu which describes how to copy an AMI to another regions.#66 ## Acceptance Criteria 1. Mention in the tutorial and release notes the fact that the AMI is only available in `eu-central-1` 2. Link to https://docs.aws.amazon.com/AWSEC2/latest/UserGuide/CopyingAMIs.html
1.0
Mention the AMI copy function in tutorial and release notes - ## Background The AMI is only available in the AWS region where the release process runs. We need to mention this fact in the tutorial and the release notes. Also we should link to the AWS docu which describes how to copy an AMI to another regions.#66 ## Acceptance Criteria 1. Mention in the tutorial and release notes the fact that the AMI is only available in `eu-central-1` 2. Link to https://docs.aws.amazon.com/AWSEC2/latest/UserGuide/CopyingAMIs.html
non_process
mention the ami copy function in tutorial and release notes background the ami is only available in the aws region where the release process runs we need to mention this fact in the tutorial and the release notes also we should link to the aws docu which describes how to copy an ami to another regions acceptance criteria mention in the tutorial and release notes the fact that the ami is only available in eu central link to
0
15,732
19,907,113,736
IssuesEvent
2022-01-25 13:54:35
deepset-ai/haystack
https://api.github.com/repos/deepset-ai/haystack
closed
Adding UnlabeledTextPreprocessor
type:feature topic:preprocessing journey:advanced
To be able to utilize the existing `distil_intermediate_layers_from` method implemented in the `FARMReader` class for pretraining and not only for fine tuning, we need to have a processor that can read a dataset without labels. For this, it is necessary to implement a new `Processor`.
1.0
Adding UnlabeledTextPreprocessor - To be able to utilize the existing `distil_intermediate_layers_from` method implemented in the `FARMReader` class for pretraining and not only for fine tuning, we need to have a processor that can read a dataset without labels. For this, it is necessary to implement a new `Processor`.
process
adding unlabeledtextpreprocessor to be able to utilize the existing distil intermediate layers from method implemented in the farmreader class for pretraining and not only for fine tuning we need to have a processor that can read a dataset without labels for this it is necessary to implement a new processor
1
14,754
18,024,279,149
IssuesEvent
2021-09-17 00:56:29
bazelbuild/bazel
https://api.github.com/repos/bazelbuild/bazel
closed
test_filter doesn't work for Python
type: support / not a bug (process) untriaged team-Rules-Python
### Description of the problem / feature request: > test_filter doesn't filter out any test, it always runs all the tests in the test file. > However, it works in blaze. > I tried to run the exact same command in bazel and blaze. It works in blaze but not in bazel. ### Bugs: what's the simplest, easiest way to reproduce this bug? Please provide a minimal example if possible. > It should be reproducible with any Python project using bazel. > I was running tests for https://github.com/keras-team/keras/ > The command I use is ``` bazel test keras:backend_test --test_filter=BackendCrossEntropyLossesTest.test_binary_crossentropy_from_logits_no_warnings_test_mode_eager --test_sharding_strategy=disabled ``` > To see if it runs other tests in the file, you can just add `assert False` to any other test that not suppose to run, and see if it fails or not. ### What operating system are you running Bazel on? > Docker python3.9 ### What's the output of `bazel info release`? > ``` INFO: Options provided by the client: Inherited 'common' options: --isatty=1 --terminal_columns=149 INFO: Reading rc options for 'info' from /workspaces/keras/.bazelrc: Inherited 'build' options: --apple_platform_type=macos --define open_source_build=true --define=use_fast_cpp_protos=false --define=tensorflow_enable_mlir_generated_gpu_kernels=0 --define=allow_oversize_protos=true --spawn_strategy=standalone -c opt --announce_rc --define=grpc_no_ares=true --config=short_logs --config=v2 INFO: Found applicable config definition build:short_logs in file /workspaces/keras/.bazelrc: --output_filter=DONT_MATCH_ANYTHING INFO: Found applicable config definition build:v2 in file /workspaces/keras/.bazelrc: --define=tf_api_version=2 --action_env=TF2_BEHAVIOR=1 release 4.1.0 ``` ### What's the output of `git remote get-url origin ; git rev-parse master ; git rev-parse HEAD` ? ``` git@github.com:haifeng-jin/keras.git 1b9f7e5da8a49882c560b1aaf67d1a77782080d6 1b9f7e5da8a49882c560b1aaf67d1a77782080d6 ```
1.0
test_filter doesn't work for Python - ### Description of the problem / feature request: > test_filter doesn't filter out any test, it always runs all the tests in the test file. > However, it works in blaze. > I tried to run the exact same command in bazel and blaze. It works in blaze but not in bazel. ### Bugs: what's the simplest, easiest way to reproduce this bug? Please provide a minimal example if possible. > It should be reproducible with any Python project using bazel. > I was running tests for https://github.com/keras-team/keras/ > The command I use is ``` bazel test keras:backend_test --test_filter=BackendCrossEntropyLossesTest.test_binary_crossentropy_from_logits_no_warnings_test_mode_eager --test_sharding_strategy=disabled ``` > To see if it runs other tests in the file, you can just add `assert False` to any other test that not suppose to run, and see if it fails or not. ### What operating system are you running Bazel on? > Docker python3.9 ### What's the output of `bazel info release`? > ``` INFO: Options provided by the client: Inherited 'common' options: --isatty=1 --terminal_columns=149 INFO: Reading rc options for 'info' from /workspaces/keras/.bazelrc: Inherited 'build' options: --apple_platform_type=macos --define open_source_build=true --define=use_fast_cpp_protos=false --define=tensorflow_enable_mlir_generated_gpu_kernels=0 --define=allow_oversize_protos=true --spawn_strategy=standalone -c opt --announce_rc --define=grpc_no_ares=true --config=short_logs --config=v2 INFO: Found applicable config definition build:short_logs in file /workspaces/keras/.bazelrc: --output_filter=DONT_MATCH_ANYTHING INFO: Found applicable config definition build:v2 in file /workspaces/keras/.bazelrc: --define=tf_api_version=2 --action_env=TF2_BEHAVIOR=1 release 4.1.0 ``` ### What's the output of `git remote get-url origin ; git rev-parse master ; git rev-parse HEAD` ? 
``` git@github.com:haifeng-jin/keras.git 1b9f7e5da8a49882c560b1aaf67d1a77782080d6 1b9f7e5da8a49882c560b1aaf67d1a77782080d6 ```
process
test filter doesn t work for python description of the problem feature request test filter doesn t filter out any test it always runs all the tests in the test file however it works in blaze i tried to run the exact same command in bazel and blaze it works in blaze but not in bazel bugs what s the simplest easiest way to reproduce this bug please provide a minimal example if possible it should be reproducible with any python project using bazel i was running tests for the command i use is bazel test keras backend test test filter backendcrossentropylossestest test binary crossentropy from logits no warnings test mode eager test sharding strategy disabled to see if it runs other tests in the file you can just add assert false to any other test that not suppose to run and see if it fails or not what operating system are you running bazel on docker what s the output of bazel info release info options provided by the client inherited common options isatty terminal columns info reading rc options for info from workspaces keras bazelrc inherited build options apple platform type macos define open source build true define use fast cpp protos false define tensorflow enable mlir generated gpu kernels define allow oversize protos true spawn strategy standalone c opt announce rc define grpc no ares true config short logs config info found applicable config definition build short logs in file workspaces keras bazelrc output filter dont match anything info found applicable config definition build in file workspaces keras bazelrc define tf api version action env behavior release what s the output of git remote get url origin git rev parse master git rev parse head git github com haifeng jin keras git
1
72,655
19,401,846,568
IssuesEvent
2021-12-19 10:21:56
neovim/neovim
https://api.github.com/repos/neovim/neovim
closed
Linux - crash after startup: PANIC: unprotected error in call to Lua API
bug build platform:linux dependencies
### Neovim version (nvim -v) NVIM v0.7.0-dev+741-gff1b0f632 ### Vim (not Nvim) behaves the same? no ### Operating system/version Arch Linux - 5.15.8-arch1-1 #1 SMP PREEMPT Tue, 14 Dec 2021 12:28:02 +0000 x86_64 GNU/Linux ### Terminal name/version alacritty 0.10.0-dev (58985a4d) ### $TERM environment variable xterm-256color ### Installation build from repo ### How to reproduce the issue `nvim -u none` <press any key> ### Expected behavior Vim starts normally and doesn't crash ### Actual behavior Vim crashes with the error `PANIC: unprotected error in call to Lua API (bad argument #3 to '?' (function expected, got nil))`
1.0
Linux - crash after startup: PANIC: unprotected error in call to Lua API - ### Neovim version (nvim -v) NVIM v0.7.0-dev+741-gff1b0f632 ### Vim (not Nvim) behaves the same? no ### Operating system/version Arch Linux - 5.15.8-arch1-1 #1 SMP PREEMPT Tue, 14 Dec 2021 12:28:02 +0000 x86_64 GNU/Linux ### Terminal name/version alacritty 0.10.0-dev (58985a4d) ### $TERM environment variable xterm-256color ### Installation build from repo ### How to reproduce the issue `nvim -u none` <press any key> ### Expected behavior Vim starts normally and doesn't crash ### Actual behavior Vim crashes with the error `PANIC: unprotected error in call to Lua API (bad argument #3 to '?' (function expected, got nil))`
non_process
linux crash after startup panic unprotected error in call to lua api neovim version nvim v nvim dev vim not nvim behaves the same no operating system version arch linux smp preempt tue dec gnu linux terminal name version alacritty dev term environment variable xterm installation build from repo how to reproduce the issue nvim u none expected behavior vim starts normally and doesn t crash actual behavior vim crashes with the error panic unprotected error in call to lua api bad argument to function expected got nil
0
36,698
9,873,023,731
IssuesEvent
2019-06-22 10:28:37
kubernetes/minikube
https://api.github.com/repos/kubernetes/minikube
opened
Latest release kvm2 driver still linking to newer libvirt
area/build-release co/kvm2
Seems that the driver was **not** built using the Docker image: ``` console $ wget -q https://github.com/kubernetes/minikube/releases/download/v1.1.1/docker-machine-driver-kvm2 $ ldd docker-machine-driver-kvm2 ./docker-machine-driver-kvm2: /usr/lib/x86_64-linux-gnu/libvirt-lxc.so.0: version `LIBVIRT_LXC_2.0.0' not found (required by ./docker-machine-driver-kvm2) ./docker-machine-driver-kvm2: /usr/lib/x86_64-linux-gnu/libvirt.so.0: version `LIBVIRT_2.2.0' not found (required by ./docker-machine-driver-kvm2) ./docker-machine-driver-kvm2: /usr/lib/x86_64-linux-gnu/libvirt.so.0: version `LIBVIRT_3.0.0' not found (required by ./docker-machine-driver-kvm2) ./docker-machine-driver-kvm2: /usr/lib/x86_64-linux-gnu/libvirt.so.0: version `LIBVIRT_1.3.3' not found (required by ./docker-machine-driver-kvm2) ./docker-machine-driver-kvm2: /usr/lib/x86_64-linux-gnu/libvirt.so.0: version `LIBVIRT_2.0.0' not found (required by ./docker-machine-driver-kvm2) ``` It is supposed to still be using libvirt 1.3.1, for compatibility. `FROM gcr.io/gcp-runtimes/ubuntu_16_0_4`
1.0
Latest release kvm2 driver still linking to newer libvirt - Seems that the driver was **not** built using the Docker image: ``` console $ wget -q https://github.com/kubernetes/minikube/releases/download/v1.1.1/docker-machine-driver-kvm2 $ ldd docker-machine-driver-kvm2 ./docker-machine-driver-kvm2: /usr/lib/x86_64-linux-gnu/libvirt-lxc.so.0: version `LIBVIRT_LXC_2.0.0' not found (required by ./docker-machine-driver-kvm2) ./docker-machine-driver-kvm2: /usr/lib/x86_64-linux-gnu/libvirt.so.0: version `LIBVIRT_2.2.0' not found (required by ./docker-machine-driver-kvm2) ./docker-machine-driver-kvm2: /usr/lib/x86_64-linux-gnu/libvirt.so.0: version `LIBVIRT_3.0.0' not found (required by ./docker-machine-driver-kvm2) ./docker-machine-driver-kvm2: /usr/lib/x86_64-linux-gnu/libvirt.so.0: version `LIBVIRT_1.3.3' not found (required by ./docker-machine-driver-kvm2) ./docker-machine-driver-kvm2: /usr/lib/x86_64-linux-gnu/libvirt.so.0: version `LIBVIRT_2.0.0' not found (required by ./docker-machine-driver-kvm2) ``` It is supposed to still be using libvirt 1.3.1, for compatibility. `FROM gcr.io/gcp-runtimes/ubuntu_16_0_4`
non_process
latest release driver still linking to newer libvirt seems that the driver was not built using the docker image console wget q ldd docker machine driver docker machine driver usr lib linux gnu libvirt lxc so version libvirt lxc not found required by docker machine driver docker machine driver usr lib linux gnu libvirt so version libvirt not found required by docker machine driver docker machine driver usr lib linux gnu libvirt so version libvirt not found required by docker machine driver docker machine driver usr lib linux gnu libvirt so version libvirt not found required by docker machine driver docker machine driver usr lib linux gnu libvirt so version libvirt not found required by docker machine driver it is supposed to still be using libvirt for compatibility from gcr io gcp runtimes ubuntu
0
16,274
20,871,820,813
IssuesEvent
2022-03-22 12:41:22
qgis/QGIS
https://api.github.com/repos/qgis/QGIS
closed
QGIS crashes when I open Excel files using the openpyxl module in a Processing script
Feedback Processing Bug
IMPORTANT: QGIS exits when I open an Excel file using the openpyxl module in the "Processing Script framework". I just open an Excel file. My Code def processAlgorithm(self, parameters, context, feedback): """ Here is where the processing itself takes place. """ pExlFile = openpyxl.load_workbook("D:\Config.xlsx") pSheet = pExlFile["Sheet"] pStr = pSheet['B3'] QGIS unexpectedly ended. Crash ID: 8ed7476273d654ba53319b620c01d1336f6ba6f3 Stack Trace QObject::thread : QgsProject::writeProject : _CallSettingFrame handlers.asm:50 __CxxCallCatchBlock frame.cpp:1322 RcConsolidateFrames : QgsProcessingAlgorithm::runPrepared : QgsProcessingAlgRunnerTask::run : PyInit__core : QgsTask::start : QThreadPoolPrivate::reset : QThread::start : BaseThreadInitThunk : RtlUserThreadStart :
1.0
QGIS crashes when I open Excel files using the openpyxl module in a Processing script - IMPORTANT: QGIS exits when I open an Excel file using the openpyxl module in the "Processing Script framework". I just open an Excel file. My Code def processAlgorithm(self, parameters, context, feedback): """ Here is where the processing itself takes place. """ pExlFile = openpyxl.load_workbook("D:\Config.xlsx") pSheet = pExlFile["Sheet"] pStr = pSheet['B3'] QGIS unexpectedly ended. Crash ID: 8ed7476273d654ba53319b620c01d1336f6ba6f3 Stack Trace QObject::thread : QgsProject::writeProject : _CallSettingFrame handlers.asm:50 __CxxCallCatchBlock frame.cpp:1322 RcConsolidateFrames : QgsProcessingAlgorithm::runPrepared : QgsProcessingAlgRunnerTask::run : PyInit__core : QgsTask::start : QThreadPoolPrivate::reset : QThread::start : BaseThreadInitThunk : RtlUserThreadStart :
process
qgis crashes when i open excel files using the openpyxl module in a processing script important qgis exits when i open an excel file using the openpyxl module in the processing script framework i just open an excel file my code def processalgorithm self parameters context feedback here is where the processing itself takes place pexlfile openpyxl load workbook d config xlsx psheet pexlfile pstr psheet qgis unexpectedly ended crash id stack trace qobject thread qgsproject writeproject callsettingframe handlers asm cxxcallcatchblock frame cpp rcconsolidateframes qgsprocessingalgorithm runprepared qgsprocessingalgrunnertask run pyinit core qgstask start qthreadpoolprivate reset qthread start basethreadinitthunk rtluserthreadstart
1
9,439
24,590,241,083
IssuesEvent
2022-10-14 00:58:59
kubernetes/kubernetes
https://api.github.com/repos/kubernetes/kubernetes
closed
Write e2e test for replaceCoreV1NamespacedServiceAccount - +1 Endpoint
sig/testing sig/architecture area/conformance triage/accepted
# Progress <code>[4/6]</code> - [X] APISnoop org-flow: [CoreV1ServiceAccountReplaceTest.org](https://github.com/apisnoop/ticket-writing/blob/master/CoreV1ServiceAccountReplaceTest.org) - [X] Test approval issue: [#](https://issues.k8s.io/) - [X] Test PR: [#](https://pr.k8s.io/) - [x] Two weeks soak start date: [testgrid-link](https://testgrid.k8s.io/sig-release-master-blocking#gce-cos-master-default&width=5&graph-metrics=test-duration-minutes&include-filter-by-regex=should%20update%20a%20ServiceAccount) 5 Oct 2022 - [ ] Two weeks soak end date: 19 Oct 2022 - [ ] Test promotion PR: [#](https://pr.k8s.io/) # Identifying an untested feature Using APISnoop According to this APISnoop query, there is still one remaining ServiceAccount endpoint which is untested. ```sql-mode SELECT endpoint, path, kind FROM testing.untested_stable_endpoint where eligible is true and endpoint ilike '%ServiceAccount%' order by kind, endpoint desc limit 10; ``` ```example endpoint | path | kind ---------------------------------------+-------------------------------------------------------+---------------- replaceCoreV1NamespacedServiceAccount | /api/v1/namespaces/{namespace}/serviceaccounts/{name} | ServiceAccount (1 row) ``` # API Reference and feature documentation - [Kubernetes API Reference Docs](https://kubernetes.io/docs/reference/kubernetes-api/) - [Kubernetes API / Authentication Resources / ServiceAccount](https://kubernetes.io/docs/reference/kubernetes-api/authentication-resources/service-account-v1/) - [client-go - ServiceAccount](https://github.com/kubernetes/client-go/blob/master/kubernetes/typed/core/v1/serviceaccount.go) # Test outline ``` Feature: Test replace ServiceAccount api endpoint ``` - replaceCoreV1NamespacedServiceAccount ``` Scenario: confirm that the replace action will apply to a ServiceAccount Given the e2e test has created a ServiceAccount And AutomountServiceAccountToken setting is updated to true from false When the test updates the ServiceAccount Then the 
requested action is accepted without any error And the AutomountServiceAccountToken is found to be true ``` # E2E Test Using a number of existing e2e test practices a new [ginkgo test](https://github.com/ii/kubernetes/blob/create-service-account-replace-test/test/e2e/auth/service_accounts.go#L803-L833) has been created for one ServiceAccount endpoint. The e2e logs for this test are listed below. ``` [It] should update a ServiceAccount /home/ii/go/src/k8s.io/kubernetes/test/e2e/auth/service_accounts.go:803 STEP: Creating ServiceAccount "e2e-sa-kdl2c" 10/03/22 12:36:45.522 Oct 3 12:36:45.527: INFO: AutomountServiceAccountToken: false STEP: Updating ServiceAccount "e2e-sa-kdl2c" 10/03/22 12:36:45.527 Oct 3 12:36:45.533: INFO: AutomountServiceAccountToken: true ``` # Verifying increase in coverage with APISnoop This query shows which ServiceAccount endpoints are hit within a short period of running this e2e test ```sql-mode select distinct substring(endpoint from '\w+') AS endpoint, right(useragent,30) AS useragent from testing.audit_event where endpoint ilike '%ServiceAccount%' and release_date::BIGINT > round(((EXTRACT(EPOCH FROM NOW()))::numeric)*1000,0) - 60000 and useragent like 'e2e%should%' order by endpoint limit 10; ``` ```example endpoint | useragent ---------------------------------------+-------------------------------- createCoreV1NamespacedServiceAccount | should update a ServiceAccount listCoreV1NamespacedServiceAccount | should update a ServiceAccount readCoreV1NamespacedServiceAccount | should update a ServiceAccount replaceCoreV1NamespacedServiceAccount | should update a ServiceAccount (4 rows) ``` https://testgrid.k8s.io/sig-release-master-blocking#gce-cos-master-default&width=5&graph-metrics=test-duration-minutes&include-filter-by-regex=should%20update%20a%20ServiceAccount # Final notes If a test with these calls gets merged, **test coverage will go up by 1 point** This test is also created with the goal of conformance promotion. 
--- /sig testing /sig architecture /area conformance
1.0
Write e2e test for replaceCoreV1NamespacedServiceAccount - +1 Endpoint - # Progress <code>[4/6]</code> - [X] APISnoop org-flow: [CoreV1ServiceAccountReplaceTest.org](https://github.com/apisnoop/ticket-writing/blob/master/CoreV1ServiceAccountReplaceTest.org) - [X] Test approval issue: [#](https://issues.k8s.io/) - [X] Test PR: [#](https://pr.k8s.io/) - [x] Two weeks soak start date: [testgrid-link](https://testgrid.k8s.io/sig-release-master-blocking#gce-cos-master-default&width=5&graph-metrics=test-duration-minutes&include-filter-by-regex=should%20update%20a%20ServiceAccount) 5 Oct 2022 - [ ] Two weeks soak end date: 19 Oct 2022 - [ ] Test promotion PR: [#](https://pr.k8s.io/) # Identifying an untested feature Using APISnoop According to this APISnoop query, there is still one remaining ServiceAccount endpoint which is untested. ```sql-mode SELECT endpoint, path, kind FROM testing.untested_stable_endpoint where eligible is true and endpoint ilike '%ServiceAccount%' order by kind, endpoint desc limit 10; ``` ```example endpoint | path | kind ---------------------------------------+-------------------------------------------------------+---------------- replaceCoreV1NamespacedServiceAccount | /api/v1/namespaces/{namespace}/serviceaccounts/{name} | ServiceAccount (1 row) ``` # API Reference and feature documentation - [Kubernetes API Reference Docs](https://kubernetes.io/docs/reference/kubernetes-api/) - [Kubernetes API / Authentication Resources / ServiceAccount](https://kubernetes.io/docs/reference/kubernetes-api/authentication-resources/service-account-v1/) - [client-go - ServiceAccount](https://github.com/kubernetes/client-go/blob/master/kubernetes/typed/core/v1/serviceaccount.go) # Test outline ``` Feature: Test replace ServiceAccount api endpoint ``` - replaceCoreV1NamespacedServiceAccount ``` Scenario: confirm that the replace action will apply to a ServiceAccount Given the e2e test has created a ServiceAccount And AutomountServiceAccountToken setting is 
updated to true from false When the test updates the ServiceAccount Then the requested action is accepted without any error And the AutomountServiceAccountToken is found to be true ``` # E2E Test Using a number of existing e2e test practices a new [ginkgo test](https://github.com/ii/kubernetes/blob/create-service-account-replace-test/test/e2e/auth/service_accounts.go#L803-L833) has been created for one ServiceAccount endpoint. The e2e logs for this test are listed below. ``` [It] should update a ServiceAccount /home/ii/go/src/k8s.io/kubernetes/test/e2e/auth/service_accounts.go:803 STEP: Creating ServiceAccount "e2e-sa-kdl2c" 10/03/22 12:36:45.522 Oct 3 12:36:45.527: INFO: AutomountServiceAccountToken: false STEP: Updating ServiceAccount "e2e-sa-kdl2c" 10/03/22 12:36:45.527 Oct 3 12:36:45.533: INFO: AutomountServiceAccountToken: true ``` # Verifying increase in coverage with APISnoop This query shows which ServiceAccount endpoints are hit within a short period of running this e2e test ```sql-mode select distinct substring(endpoint from '\w+') AS endpoint, right(useragent,30) AS useragent from testing.audit_event where endpoint ilike '%ServiceAccount%' and release_date::BIGINT > round(((EXTRACT(EPOCH FROM NOW()))::numeric)*1000,0) - 60000 and useragent like 'e2e%should%' order by endpoint limit 10; ``` ```example endpoint | useragent ---------------------------------------+-------------------------------- createCoreV1NamespacedServiceAccount | should update a ServiceAccount listCoreV1NamespacedServiceAccount | should update a ServiceAccount readCoreV1NamespacedServiceAccount | should update a ServiceAccount replaceCoreV1NamespacedServiceAccount | should update a ServiceAccount (4 rows) ``` https://testgrid.k8s.io/sig-release-master-blocking#gce-cos-master-default&width=5&graph-metrics=test-duration-minutes&include-filter-by-regex=should%20update%20a%20ServiceAccount # Final notes If a test with these calls gets merged, **test coverage will go up by 1 point** This 
test is also created with the goal of conformance promotion. --- /sig testing /sig architecture /area conformance
non_process
write test for endpoint progress apisnoop org flow test approval issue test pr two weeks soak start date oct two weeks soak end date oct test promotion pr identifying an untested feature using apisnoop according to this apisnoop query there is still one remaining serviceaccount endpoint which is untested sql mode select endpoint path kind from testing untested stable endpoint where eligible is true and endpoint ilike serviceaccount order by kind endpoint desc limit example endpoint path kind api namespaces namespace serviceaccounts name serviceaccount row api reference and feature documentation test outline feature test replace serviceaccount api endpoint scenario confirm that the replace action will apply to a serviceaccount given the test has created a serviceaccount and automountserviceaccounttoken setting is updated to true from false when the test updates the serviceaccount then the requested action is accepted without any error and the automountserviceaccounttoken is found to be true test using a number of existing test practices a new has been created for one serviceaccount endpoint the logs for this test are listed below should update a serviceaccount home ii go src io kubernetes test auth service accounts go step creating serviceaccount sa oct info automountserviceaccounttoken false step updating serviceaccount sa oct info automountserviceaccounttoken true verifying increase in coverage with apisnoop this query shows which serviceaccount endpoints are hit within a short period of running this test sql mode select distinct substring endpoint from w as endpoint right useragent as useragent from testing audit event where endpoint ilike serviceaccount and release date bigint round extract epoch from now numeric and useragent like should order by endpoint limit example endpoint useragent should update a serviceaccount should update a serviceaccount should update a serviceaccount should update a serviceaccount rows final notes if a test with these calls gets 
merged test coverage will go up by point this test is also created with the goal of conformance promotion sig testing sig architecture area conformance
0
2,372
5,172,525,308
IssuesEvent
2017-01-18 13:49:34
openvstorage/framework
https://api.github.com/repos/openvstorage/framework
closed
Do we need a longer retry period for service restarts?
process_wontfix type_question
What is the time window the current upstart/systemd configuration will retry to restart a service? I am here referring to the alba proxy issue in https://github.com/openvstorage/alba/issues/384 but it probably counts for others as well
1.0
Do we need a longer retry period for service restarts? - What is the time window the current upstart/systemd configuration will retry to restart a service? I am here referring to the alba proxy issue in https://github.com/openvstorage/alba/issues/384 but it probably counts for others as well
process
do we need a longer retry period for service restarts what is the time window the current upstart systemd configuration will retry to restart a service i am here referring to the alba proxy issue in but it probably counts for others as well
1
3,985
6,916,573,206
IssuesEvent
2017-11-29 03:22:35
dotnet/corefx
https://api.github.com/repos/dotnet/corefx
closed
ProcessStartInfo doesn't pass empty string to the target process
area-System.Diagnostics.Process
The following code passes "test1" and an empty string as arguments to the netcoretest application. It works well on Windows. On Ubuntu (16.10 and 17.04), it just passes "test1" to netcoretest, and ignores the empty string. Could anyone help to take a look? ``` ProcessStartInfo psi = new ProcessStartInfo("/usr/bin/dotnet"); psi.Arguments = "netcoretest.dll test1 \"\""; Process p1 = Process.Start(psi); p1.OutputDataReceived += (s, eventArg) => { Console.WriteLine("P1 {0}", eventArg.Data.ToString()); }; ```
1.0
ProcessStartInfo doesn't pass empty string to the target process - The following code passes "test1" and an empty string as arguments to the netcoretest application. It works well on Windows. On Ubuntu (16.10 and 17.04), it just passes "test1" to netcoretest, and ignores the empty string. Could anyone help to take a look? ``` ProcessStartInfo psi = new ProcessStartInfo("/usr/bin/dotnet"); psi.Arguments = "netcoretest.dll test1 \"\""; Process p1 = Process.Start(psi); p1.OutputDataReceived += (s, eventArg) => { Console.WriteLine("P1 {0}", eventArg.Data.ToString()); }; ```
process
processstartinfo doesn t pass empty string to the target process the following code passes and an empty string as arguments to the netcoretest application it works well on windows on ubuntu and it just passes to netcoretest and ignores the empty string could anyone help to take a look processstartinfo psi new processstartinfo usr bin dotnet psi arguments netcoretest dll process process start psi outputdatareceived s eventarg console writeline eventarg data tostring
1
20,928
31,713,576,092
IssuesEvent
2023-09-09 15:24:14
Layers-of-Railways/Railway
https://api.github.com/repos/Layers-of-Railways/Railway
closed
Game crashes when using radial menu with Steam_Rails-1.5.0+fabric-mc1.19.2-build.14
version: 1.19 type: bug flag: crash area: compatibility
### Describe the Bug When opening the radial menu to select a bogey type, when the cursor is hovered over any of the options in the menu the game crashes. The game will not crash until a menu item is hovered over however. [crash-2023-08-28_15.01.06-client.txt](https://github.com/Layers-of-Railways/Railway/files/12457593/crash-2023-08-28_15.01.06-client.txt) [latest.log](https://github.com/Layers-of-Railways/Railway/files/12457594/latest.log) ### Reproduction Steps 1. Hold a train casing and open the radial menu 2. Move mouse to hover over the menu ### Expected Result The game should not crash ### Screenshots and Videos _No response_ ### Crash Report or Log https://pastebin.com/6JqZbYDz ### Operating System Windows 10 ### Mod Version 1.5.0 ### Create Mod Version 0.5.1c ### Minecraft Version 1.19.2 ### ModLoader and Version Fabric 0.14.22 ### Other Mods accurateblockplacement: Accurate Block Placement 1.0.15 advdebug: Advancements Debug 2.3.0 applecrates: Apple Crates 2.8.5 appleskin: AppleSkin 2.4.1+mc1.19 architects_palette: Architect's Palette Fabric 3.0.0 architectury: Architectury 6.5.85 attributefix: AttributeFix 17.2.6 audioplayer: AudioPlayer 1.19.2-1.4.5 bcc: BetterCompatibilityChecker 2.0.2-build.16+mc1.19.1 bclib: BCLib 2.1.7 beaconoverhaul: Beacon Overhaul 1.7.3+1.19.2 beautify: Beautify 1.1.1+fabric-1.19.2 bee_info: More Bee info 1.1.2 beekeeperhut: Friends&Foes - Beekeeper Hut 1.2.0 betterend: Better End 2.1.6 betterf3: BetterF3 4.0.0 bettermounthud: Better Mount HUD 1.2.0 betterpingdisplay: Better Ping Display 1.1.1 betterthirdperson: Better Third Person 1.9.0 bettertridents: Better Tridents 4.0.2 biomemakeover: Biome Makeover 1.19.2-1.6.4 block_limit_fix: Block Limit Fix 1.0.3-fabric blockus: Blockus 2.5.10+1.19.2 blur: Blur (Fabric) 2.6.0 bookshelf: Bookshelf 16.3.20 brazier: Brazier 5.0.0 brewinandchewin: Brewin And Chewin fabric-2.1.5+1.19.2 bwncr: Bad Wither No Cookie Reloaded 3.14.1 byg: Oh The Biomes You'll Go 2.0.0.13 cccbridge: CC:C Bridge 
v1.5.1-fabric charm: Charm 4.4.4 charmofundying: Charm of Undying 6.2.0+1.19.2 chefsdelight: Chef's Delight 1.0.3-fabric-1.19.2 chipped: Chipped 2.1.4 citresewn: CIT Resewn 1.1.2+1.19.2 cleancut: CleanCut 1.19.2-5.1-fabric cloth-config: Cloth Config v8 8.2.88 clumps: Clumps 9.0.0+14 collective: Collective 6.53 colorfulazaleas: Colorful Azaleas 2.3.1 comforts: Comforts 6.0.4+1.19.2 computercraft: CC: Restitched 1.101.2 configured: Configured 2.0.0 connectiblechains: Connectible Chains 2.1.4+1.19.2 connectivity: Connectivity Mod 1.19.2-4.2 controlling: Controlling For Fabric 10.0+7 cosmetic-armor: Cosmetic Armor 1.4.2 crafttweaker: CraftTweaker 10.1.45 crawl: Crawl 0.11.1 create: Create 0.5.1-c-build.1160+mc1.19.2 create_enchantment_industry: Create Enchantment Industry 1.0.1 create_so: Create: Sandpaper Overhaul 1.5+fabric-1.19.2 createaddition: Create Crafts & Additions 20230723a createbigcannons: Create Big Cannons 0.5.2-nightly-77aa315 createdeco: Create Deco 1.3.3-1.19.2 creategoggles: Create Goggles 0.5.5.c creativecore: CreativeCore (Fabric) 2.9.3 ctm: ConnectedTexturesMod for Fabric 1.0.1+1.19 culturaldelights: Cultural Delights Fabric 0.14.9+1.19.2 customizableelytra: Customizable Elytra 1.6.4-1.19 decorative_blocks: Decorative Blocks 3.0.0 deeperdarker: Deeper and Darker 1.1.6 delightfulcreators: Delightful Creators 1.1.7 discontinuous_beacon_beams: Discontinuous Beacon Beams 1.1.7 doapi: Lets Do Api 1.1.0 dummmmmmy: MmmMmmMmmMmm 1.19.2-1.7.1 easyanvils: Easy Anvils 4.0.11 easymagic: Easy Magic 4.3.3 easyshulkerboxes: Easy Shulker Boxes 4.4.1 ecologics: Ecologics 2.1.11 effective: Effective 1.4 enchantedshulkers: Enchanted Shulkers 1.0.4 enchdesc: EnchantmentDescriptions 13.0.14 enchlevel-langpatch: Enchantment Level Language Patch 2.0.2 entity_texture_features: Entity Texture Features 4.3.3 essential: Essential 13437+deploy-staging+gd2fc09762 essential-container: essential-container 1.0.0 everycomp: Every Compat 1.19.2-2.5.1 expandability: ExpandAbility 
7.0.0 expandeddelight: Expanded Delight 0.2.5.2 expandedstorage: Expanded Storage 8.3.4 fabric-api: Fabric API 0.76.0+1.19.2 fabric-language-kotlin: Fabric Language Kotlin 1.9.5+kotlin.1.8.22 fabric-ofcapes: OF Capes 2.0.0 fabricloader: Fabric Loader 0.14.22 fallingleaves: Falling Leaves 1.13.0+1.19.2 farmersdelight: Farmer's Delight 1.19.2-1.3.10 farsight: Farsight Mod 1.19-2.4 faux-custom-entity-data: Faux-Custom-Entity-Data 2.0.2 ferritecore: FerriteCore 5.0.3 forgeconfigapiport: Forge Config API Port 4.2.11 frame: Frame 0.26.1+1.19-fabric friendsandfoes: Friends&Foes 1.8.2 galosphere: Galosphere 1.19.2-1.2.2 geckolib3: Geckolib 3.1.40 goodall: Goodall 1.2.0 handcrafted: Handcrafted 2.0.6 iceberg: Iceberg 1.0.46 immersive_weathering: Immersive Weathering 1.19.2-1.2.9 indium: Indium 1.0.9+mc1.19.2 infinitybuttons: Infinity Buttons 3.1.0-mc1.19.2 ingredient-extension-api: Ingredient Extension API 3.0.6 jade: Jade 8.7.3 jadeaddons: Jade Addons 3.2.0 java: OpenJDK 64-Bit Server VM 17 konkrete: Konkrete 1.6.1 kubejs: KubeJS 1902.6.0-build.142 lambdynlights: LambDynamicLights 2.2.0+1.19.2 langpatch-conf4: Enchantment Level Language Patch Config 4.0.0 lazydfu: LazyDFU 0.1.3 led: Light Emitting Diode 1.3.0 libipn: libIPN 3.0.2 lithium: Lithium 0.11.1 luggage: Luggage 1.19-1.5.2 mcwdoors: Macaw's Doors 1.0.8 mcwtrpdoors: Macaw's Trapdoors 1.1.1 mcwwindows: Macaw's Windows 2.1.2 meadow: Meadow 1.2.0 measurements: Measurements 1.3.1 megaparrot: Megaparrot 1.0.7-1.19.2 memoryusagescreen: Memory Usage Screen 1.6 minecraft: Minecraft 1.19.2 minegate-moreblocks: MineGate · MoreBlocks 1.2.5 modelfix: Model Gap Fix 1.8 modmenu: Mod Menu 4.1.2 moonlight: Moonlight 1.19.2-2.2.44 mousetweaks: Mouse Tweaks 2.22 mousewheelie: Mouse Wheelie 1.10.7+mc1.19.2 naturescompass: Nature's Compass 1.19.2-2.1.0-fabric nethersdelight: Nether's Delight 1.0.1 no_more_purple: No More Purple 1.0.1 notenoughcrashes: Not Enough Crashes 4.2.1+1.19.2 onsoulfire: On Soul Fire 1.19-2 
org_jetbrains_annotations: annotations 13.0 owo: oωo 0.8.5+1.19 patchouli: Patchouli 1.19.2-77-FABRIC pathunderfencegates: Path Under Fence Gates 1.3.0 plushies: Plushie Mod 1.2 polymorph: Polymorph 0.46.1+1.19.2 ponderjs: PonderJS 1.1.11 puzzleslib: Puzzles Lib 4.4.0 quickshulker: Quick Shulker 1.3.9-1.19 railways: Create: Steam 'n' Rails 1.5.0+fabric-mc1.19.2-build.14 reeses-sodium-options: Reese's Sodium Options 1.4.9+mc1.19.2-build.67 resourcefullib: Resourceful Lib 1.1.24 respitecreators: Respite Creators 1.2.0 rhino: Rhino 1902.2.2-build.268 rottencreatures: Rotten Creatures 1.0.1 roughly_enough_loot_tables: Roughly Enough Loot Tables 1.19-1.0 roughly_enough_trades: Roughly Enough Trades 1.19-1.0 roughlyenoughitems: Roughly Enough Items 9.1.615 roughlyenoughprofessions: Roughly Enough Professions 1.1.4 roughlyenoughresources: Roughly Enough Resources 2.6.0 satin: Satin 1.9.0 showmeyourskin: Show Me Your Skin! 1.6.3+1.19.2 shulkerboxtooltip: Shulker Box Tooltip 3.2.2+1.19.2 skinlayers: 3d Skin Layers 1.5.2-mc1.19.1 skylorlib: SkyLib 1.5.1 sliceanddice: Create Slice & Dice 2.3.1 smoothboot: Smooth Boot 1.19-1.7.1 smoothchunk: Smooth chunk save Mod 1.19.1-2.0 sodium: Sodium 0.4.4+build.18 sodium-extra: Sodium Extra 0.4.16+mc1.19.2-build.90 starlight: Starlight 1.1.1+fabric.ae22326 statement: Statement 4.2.5+1.14.4-1.19.3 steel: Steel 1.3.1+1.19.2 strawstatues: Straw Statues 4.0.10 structory: Structory 1.0.1 supplementaries: Supplementaries 1.19.2-2.3.17 swampier_swamps: Swampier Swamps 1.19-1.2.1 terrablender: TerraBlender 2.0.1.136 thonkutil: ThonkUtil 2.15.4+1.19 tipthescales: Tip The Scales 6.0.10 toolstats: ToolStats 12.0.2 trade_cycling: Trade Cycling 1.19.2-1.0.5 tradingpost: Trading Post 4.2.0 transparent: Transparent 5.1.2 trinkets: Trinkets 3.4.2 twigs: Twigs 3.0.0 universalcraft: UniversalCraft 277 v_slab_compat: v_slab_compat 1.19.2-1.4 vigilance: Vigilance 284 villagersplus: Villagers Plus 1.9 vinery: Vinery 1.3.9 visuality: Visuality 0.5.6 
voicechat: Simple Voice Chat 1.19.2-2.4.10 wiredredstone: Wired Redstone 0.4.19+1.19.2 wondrouswilds: Wondrous Wilds 1.19.2-1.1.6 worldedit: WorldEdit 7.2.12+6240-87f4ae1 xaerominimap: Xaero's Minimap 23.4.4 xaeroworldmap: Xaero's World Map 1.30.3 xercapaint: Joy of Painting fabric-1.19.2-1.0.2 xlpackets: XLPackets 1.19.2-4 yet-another-config-lib: YetAnotherConfigLib 2.2.0-for-1.19.2 ### Additional Context In the Mod Version drop down menu, 1.5.0 is not an option. I used that version of Railway to reproduce the bug, not 1.4.3. I have a download for the modpack here: https://www.mediafire.com/file/3k4fzf0syqhag00/Clockwork_S2_Modpack_v1.16.zip/file Create and Steam and Rails will both have to be updated to 1.5.1c and 1.5.0 respectively in that download however.
True
Game crashes when using radial menu with Steam_Rails-1.5.0+fabric-mc1.19.2-build.14 - ### Describe the Bug When opening the radial menu to select a bogey type, when the cursor is hovered over any of the options in the menu the game crashes. The game will not crash until a menu item is hovered over however. [crash-2023-08-28_15.01.06-client.txt](https://github.com/Layers-of-Railways/Railway/files/12457593/crash-2023-08-28_15.01.06-client.txt) [latest.log](https://github.com/Layers-of-Railways/Railway/files/12457594/latest.log) ### Reproduction Steps 1. Hold a train casing and open the radial menu 2. Move mouse to hover over the menu ### Expected Result The game should not crash ### Screenshots and Videos _No response_ ### Crash Report or Log https://pastebin.com/6JqZbYDz ### Operating System Windows 10 ### Mod Version 1.5.0 ### Create Mod Version 0.5.1c ### Minecraft Version 1.19.2 ### ModLoader and Version Fabric 0.14.22 ### Other Mods accurateblockplacement: Accurate Block Placement 1.0.15 advdebug: Advancements Debug 2.3.0 applecrates: Apple Crates 2.8.5 appleskin: AppleSkin 2.4.1+mc1.19 architects_palette: Architect's Palette Fabric 3.0.0 architectury: Architectury 6.5.85 attributefix: AttributeFix 17.2.6 audioplayer: AudioPlayer 1.19.2-1.4.5 bcc: BetterCompatibilityChecker 2.0.2-build.16+mc1.19.1 bclib: BCLib 2.1.7 beaconoverhaul: Beacon Overhaul 1.7.3+1.19.2 beautify: Beautify 1.1.1+fabric-1.19.2 bee_info: More Bee info 1.1.2 beekeeperhut: Friends&Foes - Beekeeper Hut 1.2.0 betterend: Better End 2.1.6 betterf3: BetterF3 4.0.0 bettermounthud: Better Mount HUD 1.2.0 betterpingdisplay: Better Ping Display 1.1.1 betterthirdperson: Better Third Person 1.9.0 bettertridents: Better Tridents 4.0.2 biomemakeover: Biome Makeover 1.19.2-1.6.4 block_limit_fix: Block Limit Fix 1.0.3-fabric blockus: Blockus 2.5.10+1.19.2 blur: Blur (Fabric) 2.6.0 bookshelf: Bookshelf 16.3.20 brazier: Brazier 5.0.0 brewinandchewin: Brewin And Chewin fabric-2.1.5+1.19.2 bwncr: Bad Wither No 
Cookie Reloaded 3.14.1 byg: Oh The Biomes You'll Go 2.0.0.13 cccbridge: CC:C Bridge v1.5.1-fabric charm: Charm 4.4.4 charmofundying: Charm of Undying 6.2.0+1.19.2 chefsdelight: Chef's Delight 1.0.3-fabric-1.19.2 chipped: Chipped 2.1.4 citresewn: CIT Resewn 1.1.2+1.19.2 cleancut: CleanCut 1.19.2-5.1-fabric cloth-config: Cloth Config v8 8.2.88 clumps: Clumps 9.0.0+14 collective: Collective 6.53 colorfulazaleas: Colorful Azaleas 2.3.1 comforts: Comforts 6.0.4+1.19.2 computercraft: CC: Restitched 1.101.2 configured: Configured 2.0.0 connectiblechains: Connectible Chains 2.1.4+1.19.2 connectivity: Connectivity Mod 1.19.2-4.2 controlling: Controlling For Fabric 10.0+7 cosmetic-armor: Cosmetic Armor 1.4.2 crafttweaker: CraftTweaker 10.1.45 crawl: Crawl 0.11.1 create: Create 0.5.1-c-build.1160+mc1.19.2 create_enchantment_industry: Create Enchantment Industry 1.0.1 create_so: Create: Sandpaper Overhaul 1.5+fabric-1.19.2 createaddition: Create Crafts & Additions 20230723a createbigcannons: Create Big Cannons 0.5.2-nightly-77aa315 createdeco: Create Deco 1.3.3-1.19.2 creategoggles: Create Goggles 0.5.5.c creativecore: CreativeCore (Fabric) 2.9.3 ctm: ConnectedTexturesMod for Fabric 1.0.1+1.19 culturaldelights: Cultural Delights Fabric 0.14.9+1.19.2 customizableelytra: Customizable Elytra 1.6.4-1.19 decorative_blocks: Decorative Blocks 3.0.0 deeperdarker: Deeper and Darker 1.1.6 delightfulcreators: Delightful Creators 1.1.7 discontinuous_beacon_beams: Discontinuous Beacon Beams 1.1.7 doapi: Lets Do Api 1.1.0 dummmmmmy: MmmMmmMmmMmm 1.19.2-1.7.1 easyanvils: Easy Anvils 4.0.11 easymagic: Easy Magic 4.3.3 easyshulkerboxes: Easy Shulker Boxes 4.4.1 ecologics: Ecologics 2.1.11 effective: Effective 1.4 enchantedshulkers: Enchanted Shulkers 1.0.4 enchdesc: EnchantmentDescriptions 13.0.14 enchlevel-langpatch: Enchantment Level Language Patch 2.0.2 entity_texture_features: Entity Texture Features 4.3.3 essential: Essential 13437+deploy-staging+gd2fc09762 essential-container: 
essential-container 1.0.0 everycomp: Every Compat 1.19.2-2.5.1 expandability: ExpandAbility 7.0.0 expandeddelight: Expanded Delight 0.2.5.2 expandedstorage: Expanded Storage 8.3.4 fabric-api: Fabric API 0.76.0+1.19.2 fabric-language-kotlin: Fabric Language Kotlin 1.9.5+kotlin.1.8.22 fabric-ofcapes: OF Capes 2.0.0 fabricloader: Fabric Loader 0.14.22 fallingleaves: Falling Leaves 1.13.0+1.19.2 farmersdelight: Farmer's Delight 1.19.2-1.3.10 farsight: Farsight Mod 1.19-2.4 faux-custom-entity-data: Faux-Custom-Entity-Data 2.0.2 ferritecore: FerriteCore 5.0.3 forgeconfigapiport: Forge Config API Port 4.2.11 frame: Frame 0.26.1+1.19-fabric friendsandfoes: Friends&Foes 1.8.2 galosphere: Galosphere 1.19.2-1.2.2 geckolib3: Geckolib 3.1.40 goodall: Goodall 1.2.0 handcrafted: Handcrafted 2.0.6 iceberg: Iceberg 1.0.46 immersive_weathering: Immersive Weathering 1.19.2-1.2.9 indium: Indium 1.0.9+mc1.19.2 infinitybuttons: Infinity Buttons 3.1.0-mc1.19.2 ingredient-extension-api: Ingredient Extension API 3.0.6 jade: Jade 8.7.3 jadeaddons: Jade Addons 3.2.0 java: OpenJDK 64-Bit Server VM 17 konkrete: Konkrete 1.6.1 kubejs: KubeJS 1902.6.0-build.142 lambdynlights: LambDynamicLights 2.2.0+1.19.2 langpatch-conf4: Enchantment Level Language Patch Config 4.0.0 lazydfu: LazyDFU 0.1.3 led: Light Emitting Diode 1.3.0 libipn: libIPN 3.0.2 lithium: Lithium 0.11.1 luggage: Luggage 1.19-1.5.2 mcwdoors: Macaw's Doors 1.0.8 mcwtrpdoors: Macaw's Trapdoors 1.1.1 mcwwindows: Macaw's Windows 2.1.2 meadow: Meadow 1.2.0 measurements: Measurements 1.3.1 megaparrot: Megaparrot 1.0.7-1.19.2 memoryusagescreen: Memory Usage Screen 1.6 minecraft: Minecraft 1.19.2 minegate-moreblocks: MineGate · MoreBlocks 1.2.5 modelfix: Model Gap Fix 1.8 modmenu: Mod Menu 4.1.2 moonlight: Moonlight 1.19.2-2.2.44 mousetweaks: Mouse Tweaks 2.22 mousewheelie: Mouse Wheelie 1.10.7+mc1.19.2 naturescompass: Nature's Compass 1.19.2-2.1.0-fabric nethersdelight: Nether's Delight 1.0.1 no_more_purple: No More Purple 1.0.1 
notenoughcrashes: Not Enough Crashes 4.2.1+1.19.2 onsoulfire: On Soul Fire 1.19-2 org_jetbrains_annotations: annotations 13.0 owo: oωo 0.8.5+1.19 patchouli: Patchouli 1.19.2-77-FABRIC pathunderfencegates: Path Under Fence Gates 1.3.0 plushies: Plushie Mod 1.2 polymorph: Polymorph 0.46.1+1.19.2 ponderjs: PonderJS 1.1.11 puzzleslib: Puzzles Lib 4.4.0 quickshulker: Quick Shulker 1.3.9-1.19 railways: Create: Steam 'n' Rails 1.5.0+fabric-mc1.19.2-build.14 reeses-sodium-options: Reese's Sodium Options 1.4.9+mc1.19.2-build.67 resourcefullib: Resourceful Lib 1.1.24 respitecreators: Respite Creators 1.2.0 rhino: Rhino 1902.2.2-build.268 rottencreatures: Rotten Creatures 1.0.1 roughly_enough_loot_tables: Roughly Enough Loot Tables 1.19-1.0 roughly_enough_trades: Roughly Enough Trades 1.19-1.0 roughlyenoughitems: Roughly Enough Items 9.1.615 roughlyenoughprofessions: Roughly Enough Professions 1.1.4 roughlyenoughresources: Roughly Enough Resources 2.6.0 satin: Satin 1.9.0 showmeyourskin: Show Me Your Skin! 
1.6.3+1.19.2 shulkerboxtooltip: Shulker Box Tooltip 3.2.2+1.19.2 skinlayers: 3d Skin Layers 1.5.2-mc1.19.1 skylorlib: SkyLib 1.5.1 sliceanddice: Create Slice & Dice 2.3.1 smoothboot: Smooth Boot 1.19-1.7.1 smoothchunk: Smooth chunk save Mod 1.19.1-2.0 sodium: Sodium 0.4.4+build.18 sodium-extra: Sodium Extra 0.4.16+mc1.19.2-build.90 starlight: Starlight 1.1.1+fabric.ae22326 statement: Statement 4.2.5+1.14.4-1.19.3 steel: Steel 1.3.1+1.19.2 strawstatues: Straw Statues 4.0.10 structory: Structory 1.0.1 supplementaries: Supplementaries 1.19.2-2.3.17 swampier_swamps: Swampier Swamps 1.19-1.2.1 terrablender: TerraBlender 2.0.1.136 thonkutil: ThonkUtil 2.15.4+1.19 tipthescales: Tip The Scales 6.0.10 toolstats: ToolStats 12.0.2 trade_cycling: Trade Cycling 1.19.2-1.0.5 tradingpost: Trading Post 4.2.0 transparent: Transparent 5.1.2 trinkets: Trinkets 3.4.2 twigs: Twigs 3.0.0 universalcraft: UniversalCraft 277 v_slab_compat: v_slab_compat 1.19.2-1.4 vigilance: Vigilance 284 villagersplus: Villagers Plus 1.9 vinery: Vinery 1.3.9 visuality: Visuality 0.5.6 voicechat: Simple Voice Chat 1.19.2-2.4.10 wiredredstone: Wired Redstone 0.4.19+1.19.2 wondrouswilds: Wondrous Wilds 1.19.2-1.1.6 worldedit: WorldEdit 7.2.12+6240-87f4ae1 xaerominimap: Xaero's Minimap 23.4.4 xaeroworldmap: Xaero's World Map 1.30.3 xercapaint: Joy of Painting fabric-1.19.2-1.0.2 xlpackets: XLPackets 1.19.2-4 yet-another-config-lib: YetAnotherConfigLib 2.2.0-for-1.19.2 ### Additional Context In the Mod Version drop down menu, 1.5.0 is not an option. I used that version of Railway to reproduce the bug, not 1.4.3. I have a download for the modpack here: https://www.mediafire.com/file/3k4fzf0syqhag00/Clockwork_S2_Modpack_v1.16.zip/file Create and Steam and Rails will both have to be updated to 1.5.1c and 1.5.0 respectively in that download however.
non_process
game crashes when using radial menu with steam rails fabric build describe the bug when opening the radial menu to select a bogey type when the cursor is hovered over any of the options in the menu the game crashes the game will not crash until a menu item is hovered over however reproduction steps hold a train casing and open the radial menu move mouse to hover over the menu expected result the game should not crash screenshots and videos no response crash report or log operating system windows mod version create mod version minecraft version modloader and version fabric other mods accurateblockplacement accurate block placement advdebug advancements debug applecrates apple crates appleskin appleskin architects palette architect s palette fabric architectury architectury attributefix attributefix audioplayer audioplayer bcc bettercompatibilitychecker build bclib bclib beaconoverhaul beacon overhaul beautify beautify fabric bee info more bee info beekeeperhut friends foes beekeeper hut betterend better end bettermounthud better mount hud betterpingdisplay better ping display betterthirdperson better third person bettertridents better tridents biomemakeover biome makeover block limit fix block limit fix fabric blockus blockus blur blur fabric bookshelf bookshelf brazier brazier brewinandchewin brewin and chewin fabric bwncr bad wither no cookie reloaded byg oh the biomes you ll go cccbridge cc c bridge fabric charm charm charmofundying charm of undying chefsdelight chef s delight fabric chipped chipped citresewn cit resewn cleancut cleancut fabric cloth config cloth config clumps clumps collective collective colorfulazaleas colorful azaleas comforts comforts computercraft cc restitched configured configured connectiblechains connectible chains connectivity connectivity mod controlling controlling for fabric cosmetic armor cosmetic armor crafttweaker crafttweaker crawl crawl create create c build create enchantment industry create enchantment industry create so 
create sandpaper overhaul fabric createaddition create crafts additions createbigcannons create big cannons nightly createdeco create deco creategoggles create goggles c creativecore creativecore fabric ctm connectedtexturesmod for fabric culturaldelights cultural delights fabric customizableelytra customizable elytra decorative blocks decorative blocks deeperdarker deeper and darker delightfulcreators delightful creators discontinuous beacon beams discontinuous beacon beams doapi lets do api dummmmmmy mmmmmmmmmmmm easyanvils easy anvils easymagic easy magic easyshulkerboxes easy shulker boxes ecologics ecologics effective effective enchantedshulkers enchanted shulkers enchdesc enchantmentdescriptions enchlevel langpatch enchantment level language patch entity texture features entity texture features essential essential deploy staging essential container essential container everycomp every compat expandability expandability expandeddelight expanded delight expandedstorage expanded storage fabric api fabric api fabric language kotlin fabric language kotlin kotlin fabric ofcapes of capes fabricloader fabric loader fallingleaves falling leaves farmersdelight farmer s delight farsight farsight mod faux custom entity data faux custom entity data ferritecore ferritecore forgeconfigapiport forge config api port frame frame fabric friendsandfoes friends foes galosphere galosphere geckolib goodall goodall handcrafted handcrafted iceberg iceberg immersive weathering immersive weathering indium indium infinitybuttons infinity buttons ingredient extension api ingredient extension api jade jade jadeaddons jade addons java openjdk bit server vm konkrete konkrete kubejs kubejs build lambdynlights lambdynamiclights langpatch enchantment level language patch config lazydfu lazydfu led light emitting diode libipn libipn lithium lithium luggage luggage mcwdoors macaw s doors mcwtrpdoors macaw s trapdoors mcwwindows macaw s windows meadow meadow measurements measurements megaparrot 
megaparrot memoryusagescreen memory usage screen minecraft minecraft minegate moreblocks minegate · moreblocks modelfix model gap fix modmenu mod menu moonlight moonlight mousetweaks mouse tweaks mousewheelie mouse wheelie naturescompass nature s compass fabric nethersdelight nether s delight no more purple no more purple notenoughcrashes not enough crashes onsoulfire on soul fire org jetbrains annotations annotations owo oωo patchouli patchouli fabric pathunderfencegates path under fence gates plushies plushie mod polymorph polymorph ponderjs ponderjs puzzleslib puzzles lib quickshulker quick shulker railways create steam n rails fabric build reeses sodium options reese s sodium options build resourcefullib resourceful lib respitecreators respite creators rhino rhino build rottencreatures rotten creatures roughly enough loot tables roughly enough loot tables roughly enough trades roughly enough trades roughlyenoughitems roughly enough items roughlyenoughprofessions roughly enough professions roughlyenoughresources roughly enough resources satin satin showmeyourskin show me your skin shulkerboxtooltip shulker box tooltip skinlayers skin layers skylorlib skylib sliceanddice create slice dice smoothboot smooth boot smoothchunk smooth chunk save mod sodium sodium build sodium extra sodium extra build starlight starlight fabric statement statement steel steel strawstatues straw statues structory structory supplementaries supplementaries swampier swamps swampier swamps terrablender terrablender thonkutil thonkutil tipthescales tip the scales toolstats toolstats trade cycling trade cycling tradingpost trading post transparent transparent trinkets trinkets twigs twigs universalcraft universalcraft v slab compat v slab compat vigilance vigilance villagersplus villagers plus vinery vinery visuality visuality voicechat simple voice chat wiredredstone wired redstone wondrouswilds wondrous wilds worldedit worldedit xaerominimap xaero s minimap xaeroworldmap xaero s world map 
xercapaint joy of painting fabric xlpackets xlpackets yet another config lib yetanotherconfiglib for additional context in the mod version drop down menu is not an option i used that version of railway to reproduce the bug not i have a download for the modpack here create and steam and rails will both have to be updated to and respectively in that download however
0
9,448
12,428,843,766
IssuesEvent
2020-05-25 07:14:34
ramiromachado/easyRESTToGQL
https://api.github.com/repos/ramiromachado/easyRESTToGQL
opened
Configure CI on github
Development process
As a project manager, I want to every commit to master run all the test developed so that I can be sure that the new code does not breaks any old tested code
1.0
Configure CI on github - As a project manager, I want to every commit to master run all the test developed so that I can be sure that the new code does not breaks any old tested code
process
configure ci on github as a project manager i want to every commit to master run all the test developed so that i can be sure that the new code does not breaks any old tested code
1
72,164
15,217,409,091
IssuesEvent
2021-02-17 16:34:47
wss-demo/WebGoat
https://api.github.com/repos/wss-demo/WebGoat
opened
CVE-2019-1010266 (Medium) detected in lodash-4.17.10.tgz
security vulnerability
## CVE-2019-1010266 - Medium Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>lodash-4.17.10.tgz</b></p></summary> <p>Lodash modular utilities.</p> <p>Library home page: <a href="https://registry.npmjs.org/lodash/-/lodash-4.17.10.tgz">https://registry.npmjs.org/lodash/-/lodash-4.17.10.tgz</a></p> <p>Path to dependency file: WebGoat/docs/package.json</p> <p>Path to vulnerable library: WebGoat/docs/node_modules/gulp-uglify/node_modules/lodash/package.json</p> <p> Dependency Hierarchy: - gulp-uglify-3.0.1.tgz (Root Library) - :x: **lodash-4.17.10.tgz** (Vulnerable Library) <p>Found in HEAD commit: <a href="https://github.com/wss-demo/WebGoat/commit/a5c426e635e25813df4ed419b0af2f0a10a9433d">a5c426e635e25813df4ed419b0af2f0a10a9433d</a></p> <p>Found in base branch: <b>develop</b></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary> <p> lodash prior to 4.17.11 is affected by: CWE-400: Uncontrolled Resource Consumption. The impact is: Denial of service. The component is: Date handler. The attack vector is: Attacker provides very long strings, which the library attempts to match using a regular expression. The fixed version is: 4.17.11. 
<p>Publish Date: 2019-07-17 <p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2019-1010266>CVE-2019-1010266</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>6.5</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: Low - Privileges Required: Low - User Interaction: None - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: None - Integrity Impact: None - Availability Impact: High </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2019-1010266">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2019-1010266</a></p> <p>Release Date: 2019-07-17</p> <p>Fix Resolution: 4.17.11</p> </p> </details> <p></p> <!-- <REMEDIATE>{"isOpenPROnVulnerability":true,"isPackageBased":true,"isDefaultBranch":true,"packages":[{"packageType":"javascript/Node.js","packageName":"lodash","packageVersion":"4.17.10","packageFilePaths":["/docs/package.json"],"isTransitiveDependency":true,"dependencyTree":"gulp-uglify:3.0.1;lodash:4.17.10","isMinimumFixVersionAvailable":true,"minimumFixVersion":"4.17.11"}],"baseBranches":["develop"],"vulnerabilityIdentifier":"CVE-2019-1010266","vulnerabilityDetails":"lodash prior to 4.17.11 is affected by: CWE-400: Uncontrolled Resource Consumption. The impact is: Denial of service. The component is: Date handler. The attack vector is: Attacker provides very long strings, which the library attempts to match using a regular expression. 
The fixed version is: 4.17.11.","vulnerabilityUrl":"https://vuln.whitesourcesoftware.com/vulnerability/CVE-2019-1010266","cvss3Severity":"medium","cvss3Score":"6.5","cvss3Metrics":{"A":"High","AC":"Low","PR":"Low","S":"Unchanged","C":"None","UI":"None","AV":"Network","I":"None"},"extraData":{}}</REMEDIATE> -->
True
CVE-2019-1010266 (Medium) detected in lodash-4.17.10.tgz - ## CVE-2019-1010266 - Medium Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>lodash-4.17.10.tgz</b></p></summary> <p>Lodash modular utilities.</p> <p>Library home page: <a href="https://registry.npmjs.org/lodash/-/lodash-4.17.10.tgz">https://registry.npmjs.org/lodash/-/lodash-4.17.10.tgz</a></p> <p>Path to dependency file: WebGoat/docs/package.json</p> <p>Path to vulnerable library: WebGoat/docs/node_modules/gulp-uglify/node_modules/lodash/package.json</p> <p> Dependency Hierarchy: - gulp-uglify-3.0.1.tgz (Root Library) - :x: **lodash-4.17.10.tgz** (Vulnerable Library) <p>Found in HEAD commit: <a href="https://github.com/wss-demo/WebGoat/commit/a5c426e635e25813df4ed419b0af2f0a10a9433d">a5c426e635e25813df4ed419b0af2f0a10a9433d</a></p> <p>Found in base branch: <b>develop</b></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary> <p> lodash prior to 4.17.11 is affected by: CWE-400: Uncontrolled Resource Consumption. The impact is: Denial of service. The component is: Date handler. The attack vector is: Attacker provides very long strings, which the library attempts to match using a regular expression. The fixed version is: 4.17.11. 
<p>Publish Date: 2019-07-17 <p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2019-1010266>CVE-2019-1010266</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>6.5</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: Low - Privileges Required: Low - User Interaction: None - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: None - Integrity Impact: None - Availability Impact: High </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2019-1010266">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2019-1010266</a></p> <p>Release Date: 2019-07-17</p> <p>Fix Resolution: 4.17.11</p> </p> </details> <p></p> <!-- <REMEDIATE>{"isOpenPROnVulnerability":true,"isPackageBased":true,"isDefaultBranch":true,"packages":[{"packageType":"javascript/Node.js","packageName":"lodash","packageVersion":"4.17.10","packageFilePaths":["/docs/package.json"],"isTransitiveDependency":true,"dependencyTree":"gulp-uglify:3.0.1;lodash:4.17.10","isMinimumFixVersionAvailable":true,"minimumFixVersion":"4.17.11"}],"baseBranches":["develop"],"vulnerabilityIdentifier":"CVE-2019-1010266","vulnerabilityDetails":"lodash prior to 4.17.11 is affected by: CWE-400: Uncontrolled Resource Consumption. The impact is: Denial of service. The component is: Date handler. The attack vector is: Attacker provides very long strings, which the library attempts to match using a regular expression. 
The fixed version is: 4.17.11.","vulnerabilityUrl":"https://vuln.whitesourcesoftware.com/vulnerability/CVE-2019-1010266","cvss3Severity":"medium","cvss3Score":"6.5","cvss3Metrics":{"A":"High","AC":"Low","PR":"Low","S":"Unchanged","C":"None","UI":"None","AV":"Network","I":"None"},"extraData":{}}</REMEDIATE> -->
non_process
cve medium detected in lodash tgz cve medium severity vulnerability vulnerable library lodash tgz lodash modular utilities library home page a href path to dependency file webgoat docs package json path to vulnerable library webgoat docs node modules gulp uglify node modules lodash package json dependency hierarchy gulp uglify tgz root library x lodash tgz vulnerable library found in head commit a href found in base branch develop vulnerability details lodash prior to is affected by cwe uncontrolled resource consumption the impact is denial of service the component is date handler the attack vector is attacker provides very long strings which the library attempts to match using a regular expression the fixed version is publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required low user interaction none scope unchanged impact metrics confidentiality impact none integrity impact none availability impact high for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution isopenpronvulnerability true ispackagebased true isdefaultbranch true packages istransitivedependency true dependencytree gulp uglify lodash isminimumfixversionavailable true minimumfixversion basebranches vulnerabilityidentifier cve vulnerabilitydetails lodash prior to is affected by cwe uncontrolled resource consumption the impact is denial of service the component is date handler the attack vector is attacker provides very long strings which the library attempts to match using a regular expression the fixed version is vulnerabilityurl
0
6,641
9,754,008,445
IssuesEvent
2019-06-04 10:29:13
EthVM/EthVM
https://api.github.com/repos/EthVM/EthVM
closed
Possible retry infinite loop on processing?
bug project:processing
* **I'm submitting a ...** - [x] bug report * **Bug Report** I've detected that in some circumstances it may happen a infinite loop while retrying. Can you take a look at these traces? https://pastebin.com/raw/yCT804hG In CloudWatch for Ropsten at around 21:40:37 of 03/06
1.0
Possible retry infinite loop on processing? - * **I'm submitting a ...** - [x] bug report * **Bug Report** I've detected that in some circumstances it may happen a infinite loop while retrying. Can you take a look at these traces? https://pastebin.com/raw/yCT804hG In CloudWatch for Ropsten at around 21:40:37 of 03/06
process
possible retry infinite loop on processing i m submitting a bug report bug report i ve detected that in some circumstances it may happen a infinite loop while retrying can you take a look at these traces in cloudwatch for ropsten at around of
1
5,675
8,557,437,587
IssuesEvent
2018-11-08 15:43:40
Jeffail/benthos
https://api.github.com/repos/Jeffail/benthos
closed
Add regex operator to the text processor
enhancement good first issue help wanted processors
It would be nice to have a text processor operator for reducing the document to a regular expression partial match, we currently already have a replace with regexp operator: https://github.com/Jeffail/benthos/tree/master/docs/processors#replace_regexp, but it would also be good to have an operator to simply return the matching section of the expression. I'm labeling this as a good first issue as it should be a simple matter of mostly copying the existing function from here: https://github.com/Jeffail/benthos/blob/master/lib/processor/text.go#L170
1.0
Add regex operator to the text processor - It would be nice to have a text processor operator for reducing the document to a regular expression partial match, we currently already have a replace with regexp operator: https://github.com/Jeffail/benthos/tree/master/docs/processors#replace_regexp, but it would also be good to have an operator to simply return the matching section of the expression. I'm labeling this as a good first issue as it should be a simple matter of mostly copying the existing function from here: https://github.com/Jeffail/benthos/blob/master/lib/processor/text.go#L170
process
add regex operator to the text processor it would be nice to have a text processor operator for reducing the document to a regular expression partial match we currently already have a replace with regexp operator but it would also be good to have an operator to simply return the matching section of the expression i m labeling this as a good first issue as it should be a simple matter of mostly copying the existing function from here
1
16,147
20,425,828,651
IssuesEvent
2022-02-24 03:40:36
pytorch/pytorch
https://api.github.com/repos/pytorch/pytorch
opened
DISABLED test_terminate_signal (__main__.SpawnTest)
module: multiprocessing triaged module: flaky-tests skipped
Platforms: linux This test was disabled because it is failing in CI. See [recent examples](http://torch-ci.com/failure/test_terminate_signal%2C%20SpawnTest) and the most recent [workflow logs](https://github.com/pytorch/pytorch/actions/runs/1890592409). Over the past 3 hours, it has been determined flaky in 1 workflow(s) with 1 red and 3 green.
1.0
DISABLED test_terminate_signal (__main__.SpawnTest) - Platforms: linux This test was disabled because it is failing in CI. See [recent examples](http://torch-ci.com/failure/test_terminate_signal%2C%20SpawnTest) and the most recent [workflow logs](https://github.com/pytorch/pytorch/actions/runs/1890592409). Over the past 3 hours, it has been determined flaky in 1 workflow(s) with 1 red and 3 green.
process
disabled test terminate signal main spawntest platforms linux this test was disabled because it is failing in ci see and the most recent over the past hours it has been determined flaky in workflow s with red and green
1
18,355
24,483,965,595
IssuesEvent
2022-10-09 07:13:36
didi/mpx
https://api.github.com/repos/didi/mpx
closed
跨平台编译到抖音小程序,在js文件使用wx或者mpx 的api会报错
processing
**问题描述** 请用简洁的语言描述你遇到的bug,至少包括以下部分,如提供截图请尽量完整: 1. 创建一个js文件,js文件中使用wx.getStorageSync('string') 2. 获取到缓存中的值 3. 报错:__webpack_require__.n(...)(...).getStorageSync is not a function **环境信息描述** 至少包含以下部分: 1. 系统类型(Mac或者Windows):Windows 2. Mpx依赖版本(@mpxjs/core、@mpxjs/webpack-plugin和@mpxjs/api-proxy的具体版本,可以通过package-lock.json或者实际去node_modules当中查看):“@mpxjs/api-proxy": "^2.2.26" "@mpxjs/core": "^2.2.25" "@mpxjs/webpack-plugin": "^2.2.26", 3. 小程序开发者工具信息(小程序平台、开发者工具版本、基础库版本):字节跳动开发者工具、V3.3.7-1、2.70.0.5 **最简复现demo** 一般来说通过文字和截图的描述我们很难定位到问题,为了帮助我们快速定位问题并修复,请按照以下指南编写并上传最简复现demo: 1. 根据现有项目遇到的问题,尝试精简代码,确定问题的最小复现条件 2. 使用脚手架创建新项目,基于最小复现条件编写稳定的最简复现demo 3. 删除项目中的node_modules部分,打包项目,并拖拽到issue输入框中上传(或提供远程可下载地址) [test.zip](https://github.com/didi/mpx/files/9738783/test.zip)
1.0
跨平台编译到抖音小程序,在js文件使用wx或者mpx 的api会报错 - **问题描述** 请用简洁的语言描述你遇到的bug,至少包括以下部分,如提供截图请尽量完整: 1. 创建一个js文件,js文件中使用wx.getStorageSync('string') 2. 获取到缓存中的值 3. 报错:__webpack_require__.n(...)(...).getStorageSync is not a function **环境信息描述** 至少包含以下部分: 1. 系统类型(Mac或者Windows):Windows 2. Mpx依赖版本(@mpxjs/core、@mpxjs/webpack-plugin和@mpxjs/api-proxy的具体版本,可以通过package-lock.json或者实际去node_modules当中查看):“@mpxjs/api-proxy": "^2.2.26" "@mpxjs/core": "^2.2.25" "@mpxjs/webpack-plugin": "^2.2.26", 3. 小程序开发者工具信息(小程序平台、开发者工具版本、基础库版本):字节跳动开发者工具、V3.3.7-1、2.70.0.5 **最简复现demo** 一般来说通过文字和截图的描述我们很难定位到问题,为了帮助我们快速定位问题并修复,请按照以下指南编写并上传最简复现demo: 1. 根据现有项目遇到的问题,尝试精简代码,确定问题的最小复现条件 2. 使用脚手架创建新项目,基于最小复现条件编写稳定的最简复现demo 3. 删除项目中的node_modules部分,打包项目,并拖拽到issue输入框中上传(或提供远程可下载地址) [test.zip](https://github.com/didi/mpx/files/9738783/test.zip)
process
跨平台编译到抖音小程序,在js文件使用wx或者mpx 的api会报错 问题描述 请用简洁的语言描述你遇到的bug,至少包括以下部分,如提供截图请尽量完整: 创建一个js文件,js文件中使用wx getstoragesync string 获取到缓存中的值 报错: webpack require n getstoragesync is not a function 环境信息描述 至少包含以下部分: 系统类型 mac或者windows :windows mpx依赖版本 mpxjs core、 mpxjs webpack plugin和 mpxjs api proxy的具体版本,可以通过package lock json或者实际去node modules当中查看 :“ mpxjs api proxy mpxjs core mpxjs webpack plugin 小程序开发者工具信息 小程序平台、开发者工具版本、基础库版本):字节跳动开发者工具、 、 最简复现demo 一般来说通过文字和截图的描述我们很难定位到问题,为了帮助我们快速定位问题并修复,请按照以下指南编写并上传最简复现demo: 根据现有项目遇到的问题,尝试精简代码,确定问题的最小复现条件 使用脚手架创建新项目,基于最小复现条件编写稳定的最简复现demo 删除项目中的node modules部分,打包项目,并拖拽到issue输入框中上传(或提供远程可下载地址)
1
191,952
15,307,691,084
IssuesEvent
2021-02-24 21:16:59
inspirezonetech/JobSearchWebScraping
https://api.github.com/repos/inspirezonetech/JobSearchWebScraping
opened
Add issue template Current Behaviour -> Changes Requested
documentation good first issue
## Current Behaviour No template for "Current Behaviour" "Changes Requested" ## Changes Requested Add template to repo.
1.0
Add issue template Current Behaviour -> Changes Requested - ## Current Behaviour No template for "Current Behaviour" "Changes Requested" ## Changes Requested Add template to repo.
non_process
add issue template current behaviour changes requested current behaviour no template for current behaviour changes requested changes requested add template to repo
0
20,231
26,835,133,905
IssuesEvent
2023-02-02 18:52:35
alphagov/govuk-design-system
https://api.github.com/repos/alphagov/govuk-design-system
closed
Move the team sprint board to the new GitHub projects feature
🕔 days process
## What Github released [a new version of their projects feature](https://docs.github.com/en/issues/planning-and-tracking-with-projects/learning-about-projects/about-projects) earlier this year. We want to move out sprint board to this new version. At the same time, we should consider reviewing and simplifying our labelling system and whether to utilise milestones better. ## Why The iterated projects feature better meets our needs, for example, it has a built in analytics which can help us measure our sprints better. ## Who needs to work on this Kelly ## Who needs to review this The whole team ## Done when - [x] Demo the new board features with the team - [x] Agree most useful 'views' - [x] Review labelling system - [x] Review from team - [x] Migrate board
1.0
Move the team sprint board to the new GitHub projects feature - ## What Github released [a new version of their projects feature](https://docs.github.com/en/issues/planning-and-tracking-with-projects/learning-about-projects/about-projects) earlier this year. We want to move out sprint board to this new version. At the same time, we should consider reviewing and simplifying our labelling system and whether to utilise milestones better. ## Why The iterated projects feature better meets our needs, for example, it has a built in analytics which can help us measure our sprints better. ## Who needs to work on this Kelly ## Who needs to review this The whole team ## Done when - [x] Demo the new board features with the team - [x] Agree most useful 'views' - [x] Review labelling system - [x] Review from team - [x] Migrate board
process
move the team sprint board to the new github projects feature what github released earlier this year we want to move out sprint board to this new version at the same time we should consider reviewing and simplifying our labelling system and whether to utilise milestones better why the iterated projects feature better meets our needs for example it has a built in analytics which can help us measure our sprints better who needs to work on this kelly who needs to review this the whole team done when demo the new board features with the team agree most useful views review labelling system review from team migrate board
1
22,711
32,037,531,346
IssuesEvent
2023-09-22 16:28:30
medic/cht-core
https://api.github.com/repos/medic/cht-core
closed
Release 4.4.0
Type: Internal process
# Planning - Product Manager - [x] Create a GH Milestone for the release. We use [semver](http://semver.org) so if there are breaking changes increment the major, otherwise if there are new features increment the minor, otherwise increment the service pack. Breaking changes in our case relate to updated software requirements (egs: CouchDB, node, minimum browser versions), broken backwards compatibility in an api, or a major visual update that requires user retraining. - [x] Add all the issues to be worked on to the Milestone. Ideally each minor release will have one or two features, a handful of improvements, and plenty of bug fixes. - [x] Identify any features and improvements in the release that need end-user documentation (beyond eng team documentation improvements) and create corresponding issues in the cht-docs repo - [x] Assign an engineer as Release Engineer for this release. # Development - Release Engineer When development is ready to begin one of the engineers should be nominated as a Release Engineer. They will be responsible for making sure the following tasks are completed though not necessarily completing them. - [x] Set the version number in `package.json` and `package-lock.json` and submit a PR. The easiest way to do this is to use `npm --no-git-tag-version version <major|minor>`. - [x] Raise a new issue called `Update dependencies for <version>` with a description that links to [the documentation](https://docs.communityhealthtoolkit.org/core/guides/update-dependencies/). This should be done early in the release cycle so find a volunteer to take this on and assign it to them. - [x] Ensure that issues from merged commits are closed and mapped to a milestone. - [ ] Write an update in the #product-team Slack channel summarising development and identifying any blockers (the [milestone-status](https://github.com/medic/support-scripts/tree/master/milestone-status) script can be used to get a breakdown of the issues). 
The Release Engineer is to update this every week until the version is released. # Releasing - Release Engineer Once all issues have been merged into `master` then the release process can start: - [x] Create a new release branch from `master` named `<major>.<minor>.x` in `cht-core`. Post a message to #development Slack channel using this template: ``` @core_devs I've just created the `<major>.<minor>.x` release branch. Please be aware that any further changes intended for this release will have to be merged to `master` then backported. Thanks! ``` - [x] Build a beta named `<major>.<minor>.<patch>-beta.1` by pushing a lightweight git tag (e.g. `git tag <major>.<minor>.<patch>-beta.1`). - [x] Once the CI completes successfully notify the team by writing a message in the #product-team Slack channel: ``` @product_team, I’ve just created the `<major>.<minor>.<patch>-beta.1` tag. Please let me know if there’s any final update we need to make. If all is good, then in 24h, I will start the release. Thanks! ``` - [x] Add release notes to the [Core Framework Releases](https://docs.communityhealthtoolkit.org/core/releases/) page: - [x] Create a new document for the release in the [releases folder](https://github.com/medic/cht-docs/tree/main/content/en/core/releases). - [x] Ensure all issues are in the GH Milestone, that they're correctly labelled (in particular: they have the right Type, "UI/UX" if they change the UI, and "Breaking change" if appropriate), and have human readable descriptions. - [x] Use [this script](https://github.com/medic/cht-core/blob/master/scripts/release-notes) to export the issues into our release note format. - [x] Manually document any known migration steps and known issues. - [x] Provide description, screenshots, videos, and anything else to help communicate particularly important changes. - [x] Document any required or recommended upgrades to our other products (eg: cht-conf, cht-gateway, cht-android). 
- [x] Add the release to the [Supported versions](https://docs.communityhealthtoolkit.org/core/releases/#supported-versions) and update the EOL date and status of previous releases. Also add a link in the `Release Notes` section to the new release page. - [x] Assign the PR to: - The Director of Technology or a developer - An SRE to review and confirm the documentation on upgrade instructions and breaking changes is sufficient - [x] Create a release in GitHub from the release branch so it shows up under the [Releases tab](https://github.com/medic/cht-core/releases) with the naming convention `<major>.<minor>.<patch>`. This will create the git tag automatically. Ensure the release notes PR above is merged. Link to the release notes in the description of the release. - [x] Confirm the release build completes successfully and the new release is available on the [market](https://staging.dev.medicmobile.org/builds_4/releases). Make sure that the document has new entry with `id: medic:medic:<major>.<minor>.<patch>` - [ ] Execute the scalability testing suite on the final build and download the scalability results on S3 at medic-e2e/scalability/$TAG_NAME. Add the release `.json` file to `cht-core/tests/scalability/previous_results`. More info in the [scalability documentation](https://github.com/medic/cht-core/blob/master/tests/scalability/README.md). - [x] Upgrade the `demo-cht.dev` instance to this version. - [x] Announce the release on the [CHT forum](https://forum.communityhealthtoolkit.org/c/product/releases/26), under the "Product - Releases" category using this template: ``` *We're excited to announce the release of {{version}} of {{product}}* New features include {{key_features}}. We've also implemented loads of other improvements and fixed a heap of bugs. Read the [release notes]({{url}}) for full details. Following our support policy, versions {{versions}} are no longer supported. 
Projects running these versions should start planning to upgrade in the near future. For more details read our [software support documentation](https://docs.communityhealthtoolkit.org/core/releases/#supported-versions). Check out our [roadmap](https://github.com/orgs/medic/projects/112) to see what we're working on next. ``` - [x] Add one last update to the #product-team Slack channel and use the thread to lead an internal release retrospective covering what went well and areas to improve for next time. - [x] Add any open "known issues" from the prior release that were not fixed in this release. Done by adding the correct `Affects: 4.x.x` label. - [x] Mark this issue "done" and close the Milestone.
1.0
Release 4.4.0 - # Planning - Product Manager - [x] Create a GH Milestone for the release. We use [semver](http://semver.org) so if there are breaking changes increment the major, otherwise if there are new features increment the minor, otherwise increment the service pack. Breaking changes in our case relate to updated software requirements (egs: CouchDB, node, minimum browser versions), broken backwards compatibility in an api, or a major visual update that requires user retraining. - [x] Add all the issues to be worked on to the Milestone. Ideally each minor release will have one or two features, a handful of improvements, and plenty of bug fixes. - [x] Identify any features and improvements in the release that need end-user documentation (beyond eng team documentation improvements) and create corresponding issues in the cht-docs repo - [x] Assign an engineer as Release Engineer for this release. # Development - Release Engineer When development is ready to begin one of the engineers should be nominated as a Release Engineer. They will be responsible for making sure the following tasks are completed though not necessarily completing them. - [x] Set the version number in `package.json` and `package-lock.json` and submit a PR. The easiest way to do this is to use `npm --no-git-tag-version version <major|minor>`. - [x] Raise a new issue called `Update dependencies for <version>` with a description that links to [the documentation](https://docs.communityhealthtoolkit.org/core/guides/update-dependencies/). This should be done early in the release cycle so find a volunteer to take this on and assign it to them. - [x] Ensure that issues from merged commits are closed and mapped to a milestone. - [ ] Write an update in the #product-team Slack channel summarising development and identifying any blockers (the [milestone-status](https://github.com/medic/support-scripts/tree/master/milestone-status) script can be used to get a breakdown of the issues). 
The Release Engineer is to update this every week until the version is released. # Releasing - Release Engineer Once all issues have been merged into `master` then the release process can start: - [x] Create a new release branch from `master` named `<major>.<minor>.x` in `cht-core`. Post a message to #development Slack channel using this template: ``` @core_devs I've just created the `<major>.<minor>.x` release branch. Please be aware that any further changes intended for this release will have to be merged to `master` then backported. Thanks! ``` - [x] Build a beta named `<major>.<minor>.<patch>-beta.1` by pushing a lightweight git tag (e.g. `git tag <major>.<minor>.<patch>-beta.1`). - [x] Once the CI completes successfully notify the team by writing a message in the #product-team Slack channel: ``` @product_team, I’ve just created the `<major>.<minor>.<patch>-beta.1` tag. Please let me know if there’s any final update we need to make. If all is good, then in 24h, I will start the release. Thanks! ``` - [x] Add release notes to the [Core Framework Releases](https://docs.communityhealthtoolkit.org/core/releases/) page: - [x] Create a new document for the release in the [releases folder](https://github.com/medic/cht-docs/tree/main/content/en/core/releases). - [x] Ensure all issues are in the GH Milestone, that they're correctly labelled (in particular: they have the right Type, "UI/UX" if they change the UI, and "Breaking change" if appropriate), and have human readable descriptions. - [x] Use [this script](https://github.com/medic/cht-core/blob/master/scripts/release-notes) to export the issues into our release note format. - [x] Manually document any known migration steps and known issues. - [x] Provide description, screenshots, videos, and anything else to help communicate particularly important changes. - [x] Document any required or recommended upgrades to our other products (eg: cht-conf, cht-gateway, cht-android). 
- [x] Add the release to the [Supported versions](https://docs.communityhealthtoolkit.org/core/releases/#supported-versions) and update the EOL date and status of previous releases. Also add a link in the `Release Notes` section to the new release page. - [x] Assign the PR to: - The Director of Technology or a developer - An SRE to review and confirm the documentation on upgrade instructions and breaking changes is sufficient - [x] Create a release in GitHub from the release branch so it shows up under the [Releases tab](https://github.com/medic/cht-core/releases) with the naming convention `<major>.<minor>.<patch>`. This will create the git tag automatically. Ensure the release notes PR above is merged. Link to the release notes in the description of the release. - [x] Confirm the release build completes successfully and the new release is available on the [market](https://staging.dev.medicmobile.org/builds_4/releases). Make sure that the document has new entry with `id: medic:medic:<major>.<minor>.<patch>` - [ ] Execute the scalability testing suite on the final build and download the scalability results on S3 at medic-e2e/scalability/$TAG_NAME. Add the release `.json` file to `cht-core/tests/scalability/previous_results`. More info in the [scalability documentation](https://github.com/medic/cht-core/blob/master/tests/scalability/README.md). - [x] Upgrade the `demo-cht.dev` instance to this version. - [x] Announce the release on the [CHT forum](https://forum.communityhealthtoolkit.org/c/product/releases/26), under the "Product - Releases" category using this template: ``` *We're excited to announce the release of {{version}} of {{product}}* New features include {{key_features}}. We've also implemented loads of other improvements and fixed a heap of bugs. Read the [release notes]({{url}}) for full details. Following our support policy, versions {{versions}} are no longer supported. 
Projects running these versions should start planning to upgrade in the near future. For more details read our [software support documentation](https://docs.communityhealthtoolkit.org/core/releases/#supported-versions). Check out our [roadmap](https://github.com/orgs/medic/projects/112) to see what we're working on next. ``` - [x] Add one last update to the #product-team Slack channel and use the thread to lead an internal release retrospective covering what went well and areas to improve for next time. - [x] Add any open "known issues" from the prior release that were not fixed in this release. Done by adding the correct `Affects: 4.x.x` label. - [x] Mark this issue "done" and close the Milestone.
process
release planning product manager create a gh milestone for the release we use so if there are breaking changes increment the major otherwise if there are new features increment the minor otherwise increment the service pack breaking changes in our case relate to updated software requirements egs couchdb node minimum browser versions broken backwards compatibility in an api or a major visual update that requires user retraining add all the issues to be worked on to the milestone ideally each minor release will have one or two features a handful of improvements and plenty of bug fixes identify any features and improvements in the release that need end user documentation beyond eng team documentation improvements and create corresponding issues in the cht docs repo assign an engineer as release engineer for this release development release engineer when development is ready to begin one of the engineers should be nominated as a release engineer they will be responsible for making sure the following tasks are completed though not necessarily completing them set the version number in package json and package lock json and submit a pr the easiest way to do this is to use npm no git tag version version raise a new issue called update dependencies for with a description that links to this should be done early in the release cycle so find a volunteer to take this on and assign it to them ensure that issues from merged commits are closed and mapped to a milestone write an update in the product team slack channel summarising development and identifying any blockers the script can be used to get a breakdown of the issues the release engineer is to update this every week until the version is released releasing release engineer once all issues have been merged into master then the release process can start create a new release branch from master named x in cht core post a message to development slack channel using this template core devs i ve just created the x release branch 
please be aware that any further changes intended for this release will have to be merged to master then backported thanks build a beta named beta by pushing a lightweight git tag e g git tag beta once the ci completes successfully notify the team by writing a message in the product team slack channel product team i’ve just created the beta tag please let me know if there’s any final update we need to make if all is good then in i will start the release thanks add release notes to the page create a new document for the release in the ensure all issues are in the gh milestone that they re correctly labelled in particular they have the right type ui ux if they change the ui and breaking change if appropriate and have human readable descriptions use to export the issues into our release note format manually document any known migration steps and known issues provide description screenshots videos and anything else to help communicate particularly important changes document any required or recommended upgrades to our other products eg cht conf cht gateway cht android add the release to the and update the eol date and status of previous releases also add a link in the release notes section to the new release page assign the pr to the director of technology or a developer an sre to review and confirm the documentation on upgrade instructions and breaking changes is sufficient create a release in github from the release branch so it shows up under the with the naming convention this will create the git tag automatically ensure the release notes pr above is merged link to the release notes in the description of the release confirm the release build completes successfully and the new release is available on the make sure that the document has new entry with id medic medic execute the scalability testing suite on the final build and download the scalability results on at medic scalability tag name add the release json file to cht core tests scalability previous results more 
info in the upgrade the demo cht dev instance to this version announce the release on the under the product releases category using this template we re excited to announce the release of version of product new features include key features we ve also implemented loads of other improvements and fixed a heap of bugs read the url for full details following our support policy versions versions are no longer supported projects running these versions should start planning to upgrade in the near future for more details read our check out our to see what we re working on next add one last update to the product team slack channel and use the thread to lead an internal release retrospective covering what went well and areas to improve for next time add any open known issues from the prior release that were not fixed in this release done by adding the correct affects x x label mark this issue done and close the milestone
1
286,435
21,576,087,587
IssuesEvent
2022-05-02 13:53:39
SciML/ReservoirComputing.jl
https://api.github.com/repos/SciML/ReservoirComputing.jl
closed
Suggestion for figures: express time axis in Lyapunov time
documentation
Hey, awesome to see this in Julia! I remember first learning about this methods in a 2018/2019 meeting where Ed Ott presented their reservoir computing work for chaotic timeseries prediction, and we presented the work based on [TimeseriesPrediction.jl](https://github.com/JuliaDynamics/TimeseriesPrediction.jl), (now published here https://link.springer.com/article/10.1007/s00332-019-09588-7 ) In chaotic timeseries prediction it is customary to express the time axis in units of the Lyapunov time, `1/λ` with `λ` the Maximum Lyapunov exponent. For many famous systems (e.g. Lorenz63) you can find its value online, or use the `lyapunov` function from DynamicalSystems.jl, https://juliadynamics.github.io/DynamicalSystems.jl/dev/chaos/lyapunovs/
1.0
Suggestion for figures: express time axis in Lyapunov time - Hey, awesome to see this in Julia! I remember first learning about this methods in a 2018/2019 meeting where Ed Ott presented their reservoir computing work for chaotic timeseries prediction, and we presented the work based on [TimeseriesPrediction.jl](https://github.com/JuliaDynamics/TimeseriesPrediction.jl), (now published here https://link.springer.com/article/10.1007/s00332-019-09588-7 ) In chaotic timeseries prediction it is customary to express the time axis in units of the Lyapunov time, `1/λ` with `λ` the Maximum Lyapunov exponent. For many famous systems (e.g. Lorenz63) you can find its value online, or use the `lyapunov` function from DynamicalSystems.jl, https://juliadynamics.github.io/DynamicalSystems.jl/dev/chaos/lyapunovs/
non_process
suggestion for figures express time axis in lyapunov time hey awesome to see this in julia i remember first learning about this methods in a meeting where ed ott presented their reservoir computing work for chaotic timeseries prediction and we presented the work based on now published here in chaotic timeseries prediction it is customary to express the time axis in units of the lyapunov time λ with λ the maximum lyapunov exponent for many famous systems e g you can find its value online or use the lyapunov function from dynamicalsystems jl
0
19,119
25,170,220,690
IssuesEvent
2022-11-11 02:05:58
googleapis/nodejs-assured-workloads
https://api.github.com/repos/googleapis/nodejs-assured-workloads
closed
Add actual quickstart.js sample for library
type: process samples api: assuredworkloads
We should add an actual functional quickstart.js sample, that demonstrates how a user would operate this library. Refs: https://github.com/googleapis/nodejs-assured-workloads/pull/1
1.0
Add actual quickstart.js sample for library - We should add an actual functional quickstart.js sample, that demonstrates how a user would operate this library. Refs: https://github.com/googleapis/nodejs-assured-workloads/pull/1
process
add actual quickstart js sample for library we should add an actual functional quickstart js sample that demonstrates how a user would operate this library refs
1
14,746
10,210,769,079
IssuesEvent
2019-08-14 15:26:10
unipept/unipept
https://api.github.com/repos/unipept/unipept
closed
`unipept taxonomy` removes duplicates
/services/api migrated
When querying the taxonomy with duplicate IDs, only one result per ID is returned. For example: http://api.unipept.ugent.be/api/v1/taxonomy.json?input[]=817&input[]=817. Pept2prot doesn't do this as can be seen in http://api.unipept.ugent.be/api/v1/pept2prot.json?input[]=AIPQLEVARPADAYETAEAYR&input[]=APVLSDSSCK&input[]=AIPQLEVARPADAYETAEAYR. https://github.ugent.be/unipept/unipept/blob/develop/app/controllers/api/api_controller.rb#L141 _[Original issue](https://github.ugent.be/unipept/unipept/issues/490) by @silox on Tue Apr 14 2015 at 11:17._ _Closed by @bmesuere on Tue May 19 2015 at 21:01._
1.0
`unipept taxonomy` removes duplicates - When querying the taxonomy with duplicate IDs, only one result per ID is returned. For example: http://api.unipept.ugent.be/api/v1/taxonomy.json?input[]=817&input[]=817. Pept2prot doesn't do this as can be seen in http://api.unipept.ugent.be/api/v1/pept2prot.json?input[]=AIPQLEVARPADAYETAEAYR&input[]=APVLSDSSCK&input[]=AIPQLEVARPADAYETAEAYR. https://github.ugent.be/unipept/unipept/blob/develop/app/controllers/api/api_controller.rb#L141 _[Original issue](https://github.ugent.be/unipept/unipept/issues/490) by @silox on Tue Apr 14 2015 at 11:17._ _Closed by @bmesuere on Tue May 19 2015 at 21:01._
non_process
unipept taxonomy removes duplicates when querying the taxonomy with duplicate ids only one result per id is returned for example input doesn t do this as can be seen in aipqlevarpadayetaeayr input apvlsdssck input aipqlevarpadayetaeayr by silox on tue apr at closed by bmesuere on tue may at
0
2,910
5,897,084,939
IssuesEvent
2017-05-18 11:31:14
rogerthat-platform/rogerthat-android-client
https://api.github.com/repos/rogerthat-platform/rogerthat-android-client
closed
Failed to save settings (com.mobicage.to.system.SettingsTO)
priority_critical process_duplicate
description: Bug! Error during response processing for 997e7ab2-fb1b-4bf7-aa90-38540ab61351 errorMessage: java.lang.NullPointerException: Attempt to invoke virtual method 'boolean java.lang.Boolean.booleanValue()' on a null object reference at com.mobicage.to.system.SettingsTO.<init>(SettingsTO.java:180) at com.mobicage.to.system.SaveSettingsResponse.<init>(SaveSettingsResponse.java:37) at com.mobicage.rpc.Parser.ComMobicageToSystemSaveSettingsResponse(Parser.java:2525) at com.mobicage.rpc.ResponseReceiverHandler.handle(ResponseReceiverHandler.java:716) at com.mobicage.rpc.http.HttpProtocol.process(HttpProtocol.java:253) at com.mobicage.rpc.http.HttpProtocol.processIncomingMessagesString(HttpProtocol.java:175) at com.mobicage.rpc.http.HttpCommunicator$9$1.safeRun(HttpCommunicator.java:688) at com.mobicage.rogerthat.util.system.SafeRunnable.run(SafeRunnable.java:50) at android.os.Handler.handleCallback(Handler.java:815) at android.os.Handler.dispatchMessage(Handler.java:104) at android.os.Looper.loop(Looper.java:207) at android.os.HandlerThread.run(HandlerThread.java:61) mobicageVersion: 0.2840 occurenceCount: 263 platform: 1 platformVersion: acer/b3-a30_ww_gen1/acer_jetfirehd:6.0/MRA58K/1487768832:user/release-keys (-) 23 (-) B3-A30
1.0
Failed to save settings (com.mobicage.to.system.SettingsTO) - description: Bug! Error during response processing for 997e7ab2-fb1b-4bf7-aa90-38540ab61351 errorMessage: java.lang.NullPointerException: Attempt to invoke virtual method 'boolean java.lang.Boolean.booleanValue()' on a null object reference at com.mobicage.to.system.SettingsTO.<init>(SettingsTO.java:180) at com.mobicage.to.system.SaveSettingsResponse.<init>(SaveSettingsResponse.java:37) at com.mobicage.rpc.Parser.ComMobicageToSystemSaveSettingsResponse(Parser.java:2525) at com.mobicage.rpc.ResponseReceiverHandler.handle(ResponseReceiverHandler.java:716) at com.mobicage.rpc.http.HttpProtocol.process(HttpProtocol.java:253) at com.mobicage.rpc.http.HttpProtocol.processIncomingMessagesString(HttpProtocol.java:175) at com.mobicage.rpc.http.HttpCommunicator$9$1.safeRun(HttpCommunicator.java:688) at com.mobicage.rogerthat.util.system.SafeRunnable.run(SafeRunnable.java:50) at android.os.Handler.handleCallback(Handler.java:815) at android.os.Handler.dispatchMessage(Handler.java:104) at android.os.Looper.loop(Looper.java:207) at android.os.HandlerThread.run(HandlerThread.java:61) mobicageVersion: 0.2840 occurenceCount: 263 platform: 1 platformVersion: acer/b3-a30_ww_gen1/acer_jetfirehd:6.0/MRA58K/1487768832:user/release-keys (-) 23 (-) B3-A30
process
failed to save settings com mobicage to system settingsto description bug error during response processing for errormessage java lang nullpointerexception attempt to invoke virtual method boolean java lang boolean booleanvalue on a null object reference at com mobicage to system settingsto settingsto java at com mobicage to system savesettingsresponse savesettingsresponse java at com mobicage rpc parser commobicagetosystemsavesettingsresponse parser java at com mobicage rpc responsereceiverhandler handle responsereceiverhandler java at com mobicage rpc http httpprotocol process httpprotocol java at com mobicage rpc http httpprotocol processincomingmessagesstring httpprotocol java at com mobicage rpc http httpcommunicator saferun httpcommunicator java at com mobicage rogerthat util system saferunnable run saferunnable java at android os handler handlecallback handler java at android os handler dispatchmessage handler java at android os looper loop looper java at android os handlerthread run handlerthread java mobicageversion occurencecount platform platformversion acer ww acer jetfirehd user release keys
1
58,657
24,516,766,516
IssuesEvent
2022-10-11 06:07:02
icon-project/icon-bridge
https://api.github.com/repos/icon-project/icon-bridge
opened
Missing Blacklist Public Methods in near BTS
type: bug team: hugobyte module: service handler
## Overview Methods are private for Getting Blacklisted users and Checking users blacklisted in BTS ## Steps to Reproduce Steps to reproduce the behaviour: 1. In near cli 2. Run the query call to the bts contract 3. With methods `get_blacklisted_user` and `is_user_blacklisted` 4. Method not found error will be thrown ## Expected Behavior When we query `get_blacklisted_user` and `is_user_blacklisted` these two methods we should get the blacklisted user and a boolean value to check the user blacklisted ## Screenshots ![MicrosoftTeams-image](https://user-images.githubusercontent.com/35568964/195009234-7769ce02-1176-4117-8250-e72091914e4e.png) ![MicrosoftTeams-image (2)](https://user-images.githubusercontent.com/35568964/195009255-26c458a4-658c-40d0-84a1-5970c0074b87.png) ## Device Information **Desktop (please complete the following information):** - OS: [e.g. iOS] - Browser [e.g. chrome, safari] - Version [e.g. 22] **Smartphone (please complete the following information):** - Device: [e.g. iPhone6] - OS: [e.g. iOS8.1] - Browser [e.g. stock browser, safari] - Version [e.g. 22] ## Additional Context Add any other context about the problem here.
1.0
Missing Blacklist Public Methods in near BTS - ## Overview Methods are private for Getting Blacklisted users and Checking users blacklisted in BTS ## Steps to Reproduce Steps to reproduce the behaviour: 1. In near cli 2. Run the query call to the bts contract 3. With methods `get_blacklisted_user` and `is_user_blacklisted` 4. Method not found error will be thrown ## Expected Behavior When we query `get_blacklisted_user` and `is_user_blacklisted` these two methods we should get the blacklisted user and a boolean value to check the user blacklisted ## Screenshots ![MicrosoftTeams-image](https://user-images.githubusercontent.com/35568964/195009234-7769ce02-1176-4117-8250-e72091914e4e.png) ![MicrosoftTeams-image (2)](https://user-images.githubusercontent.com/35568964/195009255-26c458a4-658c-40d0-84a1-5970c0074b87.png) ## Device Information **Desktop (please complete the following information):** - OS: [e.g. iOS] - Browser [e.g. chrome, safari] - Version [e.g. 22] **Smartphone (please complete the following information):** - Device: [e.g. iPhone6] - OS: [e.g. iOS8.1] - Browser [e.g. stock browser, safari] - Version [e.g. 22] ## Additional Context Add any other context about the problem here.
non_process
missing blacklist public methods in near bts overview methods are private for getting blacklisted users and checking users blacklisted in bts steps to reproduce steps to reproduce the behaviour in near cli run the query call to the bts contract with methods get blacklisted user and is user blacklisted method not found error will be thrown expected behavior when we query get blacklisted user and is user blacklisted these two methods we should get the blacklisted user and a boolean value to check the user blacklisted screenshots device information desktop please complete the following information os browser version smartphone please complete the following information device os browser version additional context add any other context about the problem here
0
4,033
2,713,704,320
IssuesEvent
2015-04-09 20:52:07
mozilla/teach.webmaker.org
https://api.github.com/repos/mozilla/teach.webmaker.org
closed
TEACH LIKE MOZILLA: Listen Icon - fix line weight
design platform QA
Clean up the listen icon to unify the line width to match "say hello" icon ![screen shot 2015-03-31 at 6 16 13 pm](https://cloud.githubusercontent.com/assets/535012/6930535/2229d09e-d7d2-11e4-9f75-30f071f918de.png)
1.0
TEACH LIKE MOZILLA: Listen Icon - fix line weight - Clean up the listen icon to unify the line width to match "say hello" icon ![screen shot 2015-03-31 at 6 16 13 pm](https://cloud.githubusercontent.com/assets/535012/6930535/2229d09e-d7d2-11e4-9f75-30f071f918de.png)
non_process
teach like mozilla listen icon fix line weight clean up the listen icon to unify the line width to match say hello icon
0
355,726
25,176,008,287
IssuesEvent
2022-11-11 09:19:33
albertarielw/pe
https://api.github.com/repos/albertarielw/pe
opened
Missing important term on DG Glossary
severity.Low type.DocumentationBug
Expected: Important terms such as Client should be explained (especially since Property is explained). Actual: Such terms are missing Screenshots: ![image.png](https://raw.githubusercontent.com/albertarielw/pe/main/files/3354a036-8a98-4ea2-85dd-80937ee137bc.png) <!--session: 1668153166582-af5d4d05-46a9-4d9c-8d93-f2bf21b163d6--> <!--Version: Web v3.4.4-->
1.0
Missing important term on DG Glossary - Expected: Important terms such as Client should be explained (especially since Property is explained). Actual: Such terms are missing Screenshots: ![image.png](https://raw.githubusercontent.com/albertarielw/pe/main/files/3354a036-8a98-4ea2-85dd-80937ee137bc.png) <!--session: 1668153166582-af5d4d05-46a9-4d9c-8d93-f2bf21b163d6--> <!--Version: Web v3.4.4-->
non_process
missing important term on dg glossary expected important terms such as client should be explained especially since property is explained actual such terms are missing screenshots
0
53,796
6,344,491,917
IssuesEvent
2017-07-27 20:02:47
dotnet/corefx
https://api.github.com/repos/dotnet/corefx
closed
Need to add test for ClientWebSocketOptions.SetBuffer() desktop implementation
area-System.Net test enhancement
In ClientWebSocketOptions.SetBuffer(), we have validations for parameters passed in. .Net Core has this implementation: https://github.com/dotnet/corefx/blob/master/src/System.Net.WebSockets.Client/src/System/Net/WebSockets/ClientWebSocketOptions.cs#L168 While .Net Framework's implementation is different: https://github.com/Microsoft/referencesource/blob/master/System/net/System/Net/WebSockets/WebSocketHelpers.cs#L471 Which causes failure for ClientWebSocketOptionsTests: https://github.com/dotnet/corefx/blob/master/src/System.Net.WebSockets.Client/tests/ClientWebSocketOptionsTests.cs#L37 Need to test desktop separately.
1.0
Need to add test for ClientWebSocketOptions.SetBuffer() desktop implementation - In ClientWebSocketOptions.SetBuffer(), we have validations for parameters passed in. .Net Core has this implementation: https://github.com/dotnet/corefx/blob/master/src/System.Net.WebSockets.Client/src/System/Net/WebSockets/ClientWebSocketOptions.cs#L168 While .Net Framework's implementation is different: https://github.com/Microsoft/referencesource/blob/master/System/net/System/Net/WebSockets/WebSocketHelpers.cs#L471 Which causes failure for ClientWebSocketOptionsTests: https://github.com/dotnet/corefx/blob/master/src/System.Net.WebSockets.Client/tests/ClientWebSocketOptionsTests.cs#L37 Need to test desktop separately.
non_process
need to add test for clientwebsocketoptions setbuffer desktop implementation in clientwebsocketoptions setbuffer we have validations for parameters passed in net core has this implementation while net framework s implementation is different which causes failure for clientwebsocketoptionstests need to test desktop separately
0
18,962
24,924,538,849
IssuesEvent
2022-10-31 05:44:46
GoogleCloudPlatform/fda-mystudies
https://api.github.com/repos/GoogleCloudPlatform/fda-mystudies
reopened
[iOS] Text choice > Text choice with other option > Selected 'Other' option is not getting retained in the following scenario
Bug P1 iOS Process: Reopened Deferred
**Steps:** 1. Sign up or sign in to the mobile app 2. Enroll to the study 3. Click on text choice question with other option 4. Select 'Other' option for text choice response type 5. Click on Cancel 6. Click on 'Save for later' 7. Again Click on activity and observe **AR:** Selected value is not getting retained **ER:** Selected value should get retained **Note:** The issue should also be fixed in the below scenario **Steps:** 1. Sign up or sign in to the mobile app 2. Enroll to the study 3. Click on text choice question with other option 4. Select the 'Other' option for text choice response type 5. Click on Next button 6. Click on the Back arrow and observe
1.0
[iOS] Text choice > Text choice with other option > Selected 'Other' option is not getting retained in the following scenario - **Steps:** 1. Sign up or sign in to the mobile app 2. Enroll to the study 3. Click on text choice question with other option 4. Select 'Other' option for text choice response type 5. Click on Cancel 6. Click on 'Save for later' 7. Again Click on activity and observe **AR:** Selected value is not getting retained **ER:** Selected value should get retained **Note:** The issue should also be fixed in the below scenario **Steps:** 1. Sign up or sign in to the mobile app 2. Enroll to the study 3. Click on text choice question with other option 4. Select the 'Other' option for text choice response type 5. Click on Next button 6. Click on the Back arrow and observe
process
text choice text choice with other option selected other option is not getting retained in the following scenario steps sign up or sign in to the mobile app enroll to the study click on text choice question with other option select other option for text choice response type click on cancel click on save for later again click on activity and observe ar selected value is not getting retained er selected value should get retained note the issue should also be fixed in the below scenario steps sign up or sign in to the mobile app enroll to the study click on text choice question with other option select the other option for text choice response type click on next button click on the back arrow and observe
1
4,301
7,195,309,030
IssuesEvent
2018-02-04 15:52:58
nodejs/node
https://api.github.com/repos/nodejs/node
closed
shell option for execFile and execFileSync in child_process module not documented
child_process doc good first issue
* **Version**: 8.9.4 * **Platform**: n/a * **Subsystem**: child_process The shell option for the execFile and execFileSync functions in the child_process module is not documented. However, it clearly works, as this mocha/chai test proves: ```js const { execFile, execFileSync } = require('child_process') const expect = require('chai').expect const EXEC_OPTS = { encoding: 'utf8', shell: process.env.SHELL } describe('execFile()', () => { it('should run command using specified shell', async () => { let fn try { const result = await new Promise((resolve, reject) => { execFile('ls', ['*'], EXEC_OPTS, (err, stdout) => err ? reject(err) : resolve(stdout)) }) fn = () => result } catch (err) { fn = () => { throw err } } expect(fn).to.not.throw() }) }) describe('execFileSync()', () => { it('should run command using specified shell', () => { expect(() => execFileSync('ls', ['*'], EXEC_OPTS)).to.not.throw() }) }) ``` I'm happy to add documentation for this option. I just need to know if this behavior is intentional or whether it's the result of leaky internals.
1.0
shell option for execFile and execFileSync in child_process module not documented - * **Version**: 8.9.4 * **Platform**: n/a * **Subsystem**: child_process The shell option for the execFile and execFileSync functions in the child_process module is not documented. However, it clearly works, as this mocha/chai test proves: ```js const { execFile, execFileSync } = require('child_process') const expect = require('chai').expect const EXEC_OPTS = { encoding: 'utf8', shell: process.env.SHELL } describe('execFile()', () => { it('should run command using specified shell', async () => { let fn try { const result = await new Promise((resolve, reject) => { execFile('ls', ['*'], EXEC_OPTS, (err, stdout) => err ? reject(err) : resolve(stdout)) }) fn = () => result } catch (err) { fn = () => { throw err } } expect(fn).to.not.throw() }) }) describe('execFileSync()', () => { it('should run command using specified shell', () => { expect(() => execFileSync('ls', ['*'], EXEC_OPTS)).to.not.throw() }) }) ``` I'm happy to add documentation for this option. I just need to know if this behavior is intentional or whether it's the result of leaky internals.
process
shell option for execfile and execfilesync in child process module not documented version platform n a subsystem child process the shell option for the execfile and execfilesync functions in the child process module is not documented however it clearly works as this mocha chai test proves js const execfile execfilesync require child process const expect require chai expect const exec opts encoding shell process env shell describe execfile it should run command using specified shell async let fn try const result await new promise resolve reject execfile ls exec opts err stdout err reject err resolve stdout fn result catch err fn throw err expect fn to not throw describe execfilesync it should run command using specified shell expect execfilesync ls exec opts to not throw i m happy to add documentation for this option i just need to know if this behavior is intentional or whether it s the result of leaky internals
1
11,570
14,441,672,343
IssuesEvent
2020-12-07 17:05:32
frontendbr/forum
https://api.github.com/repos/frontendbr/forum
closed
Component documentation and standards / design
Processos [Discussão]
Hi everyone, I'd like to know how and where you write documentation for everything related to component design, and explanations of when to use each element within the product you work on. I have a problem today where I work: there is a web page with some components that the system uses, but it is very outdated. A designer wrote a PDF document about the components, also adding some new ones, but I don't know whether a written document like that is the ideal approach. Another thing that causes a lot of headaches is developing components that are "off standard". Since the documents are old, when we develop a new module, as we are doing now, in some specific cases the behavior of the new screens is not the same as the screens of the system's older modules, which ends up generating rework for being "off standard". If you could point me to some solutions you use to produce these style and component docs in an easy way... Thanks in advance!
1.0
Component documentation and standards / design - Hi everyone, I'd like to know how and where you write documentation for everything related to component design, and explanations of when to use each element within the product you work on. I have a problem today where I work: there is a web page with some components that the system uses, but it is very outdated. A designer wrote a PDF document about the components, also adding some new ones, but I don't know whether a written document like that is the ideal approach. Another thing that causes a lot of headaches is developing components that are "off standard". Since the documents are old, when we develop a new module, as we are doing now, in some specific cases the behavior of the new screens is not the same as the screens of the system's older modules, which ends up generating rework for being "off standard". If you could point me to some solutions you use to produce these style and component docs in an easy way... Thanks in advance!
process
component documentation and standards design hi everyone i d like to know how and where you write documentation for everything related to component design and explanations of when to use each element within the product you work on i have a problem today where i work there is a web page with some components that the system uses but it is very outdated a designer wrote a pdf document about the components also adding some new ones but i don t know whether a written document like that is the ideal approach another thing that causes a lot of headaches is developing components that are off standard since the documents are old when we develop a new module as we are doing now in some specific cases the behavior of the new screens is not the same as the screens of the system s older modules which ends up generating rework for being off standard if you could point me to some solutions you use to produce these style and component docs in an easy way thanks in advance
1
11,703
14,545,142,763
IssuesEvent
2020-12-15 19:12:44
MicrosoftDocs/azure-devops-docs
https://api.github.com/repos/MicrosoftDocs/azure-devops-docs
closed
How To Specify Ordering of Stages / Jobs
Pri2 devops-cicd-process/tech devops/prod support-request
When multiple stages / jobs are configured to run in parallel (`dependsOn: []`), their ordering in the ADO UI is nondeterministic. Is there a way to order stages / jobs? Neither ordering them in the YAML template nor alphabetizing seems to work. When there are later stages dependent on these jobs, the lines end up getting crossed and it becomes difficult to determine what is dependent on what in the UI. --- #### Document Details ⚠ *Do not edit this section. It is required for docs.microsoft.com ➟ GitHub issue linking.* * ID: d322215c-8025-4f21-0700-7dfa7dc5c46e * Version Independent ID: 141fcdbb-8394-525b-bb29-eff9a693a9c4 * Content: [Stages in Azure Pipelines - Azure Pipelines](https://docs.microsoft.com/en-us/azure/devops/pipelines/process/stages?view=azure-devops&tabs=yaml) * Content Source: [docs/pipelines/process/stages.md](https://github.com/MicrosoftDocs/azure-devops-docs/blob/master/docs/pipelines/process/stages.md) * Product: **devops** * Technology: **devops-cicd-process** * GitHub Login: @juliakm * Microsoft Alias: **jukullam**
1.0
How To Specify Ordering of Stages / Jobs - When multiple stages / jobs are configured to run in parallel (`dependsOn: []`), their ordering in the ADO UI is nondeterministic. Is there a way to order stages / jobs? Neither ordering them in the YAML template nor alphabetizing seems to work. When there are later stages dependent on these jobs, the lines end up getting crossed and it becomes difficult to determine what is dependent on what in the UI. --- #### Document Details ⚠ *Do not edit this section. It is required for docs.microsoft.com ➟ GitHub issue linking.* * ID: d322215c-8025-4f21-0700-7dfa7dc5c46e * Version Independent ID: 141fcdbb-8394-525b-bb29-eff9a693a9c4 * Content: [Stages in Azure Pipelines - Azure Pipelines](https://docs.microsoft.com/en-us/azure/devops/pipelines/process/stages?view=azure-devops&tabs=yaml) * Content Source: [docs/pipelines/process/stages.md](https://github.com/MicrosoftDocs/azure-devops-docs/blob/master/docs/pipelines/process/stages.md) * Product: **devops** * Technology: **devops-cicd-process** * GitHub Login: @juliakm * Microsoft Alias: **jukullam**
process
how to specify ordering of stages jobs when multiple stages jobs are configured to run in parallel dependson their ordering in the ado ui is nondeterministic is there a way to order stages jobs neither ordering them in the yaml template nor alphabetizing seems to work when there are later stages dependent on these jobs the lines end up getting crossed and it becomes difficult to determine what is dependent on what in the ui document details ⚠ do not edit this section it is required for docs microsoft com ➟ github issue linking id version independent id content content source product devops technology devops cicd process github login juliakm microsoft alias jukullam
1
95,161
16,068,481,544
IssuesEvent
2021-04-24 00:46:51
snowflakedb/snowflake-hive-metastore-connector
https://api.github.com/repos/snowflakedb/snowflake-hive-metastore-connector
opened
CVE-2020-24750 (High) detected in jackson-databind-2.6.5.jar
security vulnerability
## CVE-2020-24750 - High Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>jackson-databind-2.6.5.jar</b></p></summary> <p>General data-binding functionality for Jackson: works on core streaming API</p> <p>Library home page: <a href="http://github.com/FasterXML/jackson">http://github.com/FasterXML/jackson</a></p> <p>Path to dependency file: snowflake-hive-metastore-connector/pom.xml</p> <p>Path to vulnerable library: /home/wss-scanner/.m2/repository/com/fasterxml/jackson/core/jackson-databind/2.6.5/jackson-databind-2.6.5.jar</p> <p> Dependency Hierarchy: - hive-metastore-2.3.5.jar (Root Library) - hive-serde-2.3.5.jar - hive-common-2.3.5.jar - :x: **jackson-databind-2.6.5.jar** (Vulnerable Library) <p>Found in HEAD commit: <a href="https://github.com/snowflakedb/snowflake-hive-metastore-connector/commit/37f5b0ac91898ef82cc1bf4610b729970f6eed58">37f5b0ac91898ef82cc1bf4610b729970f6eed58</a></p> <p>Found in base branch: <b>master</b></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary> <p> FasterXML jackson-databind 2.x before 2.9.10.6 mishandles the interaction between serialization gadgets and typing, related to com.pastdev.httpcomponents.configuration.JndiConfiguration. 
<p>Publish Date: 2020-09-17 <p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-24750>CVE-2020-24750</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>8.1</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: High - Privileges Required: None - User Interaction: None - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: High - Integrity Impact: High - Availability Impact: High </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2020-24616">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2020-24616</a></p> <p>Release Date: 2020-08-28</p> <p>Fix Resolution: com.fasterxml.jackson.core:jackson-databind:2.9.10.6</p> </p> </details> <p></p> <!-- <REMEDIATE>{"isOpenPROnVulnerability":true,"isPackageBased":true,"isDefaultBranch":true,"packages":[{"packageType":"Java","groupId":"com.fasterxml.jackson.core","packageName":"jackson-databind","packageVersion":"2.6.5","packageFilePaths":["/pom.xml"],"isTransitiveDependency":true,"dependencyTree":"org.apache.hive:hive-metastore:2.3.5;org.apache.hive:hive-serde:2.3.5;org.apache.hive:hive-common:2.3.5;com.fasterxml.jackson.core:jackson-databind:2.6.5","isMinimumFixVersionAvailable":true,"minimumFixVersion":"com.fasterxml.jackson.core:jackson-databind:2.9.10.6"}],"baseBranches":["master"],"vulnerabilityIdentifier":"CVE-2020-24750","vulnerabilityDetails":"FasterXML jackson-databind 2.x before 2.9.10.6 mishandles the interaction between serialization gadgets and typing, related to 
com.pastdev.httpcomponents.configuration.JndiConfiguration.","vulnerabilityUrl":"https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-24750","cvss3Severity":"high","cvss3Score":"8.1","cvss3Metrics":{"A":"High","AC":"High","PR":"None","S":"Unchanged","C":"High","UI":"None","AV":"Network","I":"High"},"extraData":{}}</REMEDIATE> -->
True
CVE-2020-24750 (High) detected in jackson-databind-2.6.5.jar - ## CVE-2020-24750 - High Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>jackson-databind-2.6.5.jar</b></p></summary> <p>General data-binding functionality for Jackson: works on core streaming API</p> <p>Library home page: <a href="http://github.com/FasterXML/jackson">http://github.com/FasterXML/jackson</a></p> <p>Path to dependency file: snowflake-hive-metastore-connector/pom.xml</p> <p>Path to vulnerable library: /home/wss-scanner/.m2/repository/com/fasterxml/jackson/core/jackson-databind/2.6.5/jackson-databind-2.6.5.jar</p> <p> Dependency Hierarchy: - hive-metastore-2.3.5.jar (Root Library) - hive-serde-2.3.5.jar - hive-common-2.3.5.jar - :x: **jackson-databind-2.6.5.jar** (Vulnerable Library) <p>Found in HEAD commit: <a href="https://github.com/snowflakedb/snowflake-hive-metastore-connector/commit/37f5b0ac91898ef82cc1bf4610b729970f6eed58">37f5b0ac91898ef82cc1bf4610b729970f6eed58</a></p> <p>Found in base branch: <b>master</b></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary> <p> FasterXML jackson-databind 2.x before 2.9.10.6 mishandles the interaction between serialization gadgets and typing, related to com.pastdev.httpcomponents.configuration.JndiConfiguration. 
<p>Publish Date: 2020-09-17 <p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-24750>CVE-2020-24750</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>8.1</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: High - Privileges Required: None - User Interaction: None - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: High - Integrity Impact: High - Availability Impact: High </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2020-24616">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2020-24616</a></p> <p>Release Date: 2020-08-28</p> <p>Fix Resolution: com.fasterxml.jackson.core:jackson-databind:2.9.10.6</p> </p> </details> <p></p> <!-- <REMEDIATE>{"isOpenPROnVulnerability":true,"isPackageBased":true,"isDefaultBranch":true,"packages":[{"packageType":"Java","groupId":"com.fasterxml.jackson.core","packageName":"jackson-databind","packageVersion":"2.6.5","packageFilePaths":["/pom.xml"],"isTransitiveDependency":true,"dependencyTree":"org.apache.hive:hive-metastore:2.3.5;org.apache.hive:hive-serde:2.3.5;org.apache.hive:hive-common:2.3.5;com.fasterxml.jackson.core:jackson-databind:2.6.5","isMinimumFixVersionAvailable":true,"minimumFixVersion":"com.fasterxml.jackson.core:jackson-databind:2.9.10.6"}],"baseBranches":["master"],"vulnerabilityIdentifier":"CVE-2020-24750","vulnerabilityDetails":"FasterXML jackson-databind 2.x before 2.9.10.6 mishandles the interaction between serialization gadgets and typing, related to 
com.pastdev.httpcomponents.configuration.JndiConfiguration.","vulnerabilityUrl":"https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-24750","cvss3Severity":"high","cvss3Score":"8.1","cvss3Metrics":{"A":"High","AC":"High","PR":"None","S":"Unchanged","C":"High","UI":"None","AV":"Network","I":"High"},"extraData":{}}</REMEDIATE> -->
non_process
cve high detected in jackson databind jar cve high severity vulnerability vulnerable library jackson databind jar general data binding functionality for jackson works on core streaming api library home page a href path to dependency file snowflake hive metastore connector pom xml path to vulnerable library home wss scanner repository com fasterxml jackson core jackson databind jackson databind jar dependency hierarchy hive metastore jar root library hive serde jar hive common jar x jackson databind jar vulnerable library found in head commit a href found in base branch master vulnerability details fasterxml jackson databind x before mishandles the interaction between serialization gadgets and typing related to com pastdev httpcomponents configuration jndiconfiguration publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity high privileges required none user interaction none scope unchanged impact metrics confidentiality impact high integrity impact high availability impact high for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution com fasterxml jackson core jackson databind isopenpronvulnerability true ispackagebased true isdefaultbranch true packages istransitivedependency true dependencytree org apache hive hive metastore org apache hive hive serde org apache hive hive common com fasterxml jackson core jackson databind isminimumfixversionavailable true minimumfixversion com fasterxml jackson core jackson databind basebranches vulnerabilityidentifier cve vulnerabilitydetails fasterxml jackson databind x before mishandles the interaction between serialization gadgets and typing related to com pastdev httpcomponents configuration jndiconfiguration vulnerabilityurl
0
40,815
5,316,973,096
IssuesEvent
2017-02-13 21:18:27
jupyterhub/jupyterhub
https://api.github.com/repos/jupyterhub/jupyterhub
closed
Possible problems still with GitHub user ids
help wanted needs: testing
Hi, I am running jupyterhub with the default jupyterhub-deploy scripts (the ansible ones). Here is the config file: https://github.com/jupyterhub/jupyterhub-deploy-teaching/blob/master/roles/jupyterhub/templates/jupyterhub_config.py.j2 I am still running into problems with GitHub ids that have some uppercase chars and other chars such as `-`. Not sure if this is a config issue, or otherwise. Looks like we are lowercasing the ids, but GitHub may not be consistent in how they handle user id casing. Would be good to do some actual tests with crazy GitHub ids. @willingc
1.0
Possible problems still with GitHub user ids - Hi, I am running jupyterhub with the default jupyterhub-deploy scripts (the ansible ones). Here is the config file: https://github.com/jupyterhub/jupyterhub-deploy-teaching/blob/master/roles/jupyterhub/templates/jupyterhub_config.py.j2 I am still running into problems with GitHub ids that have some uppercase chars and other chars such as `-`. Not sure if this is a config issue, or otherwise. Looks like we are lowercasing the ids, but GitHub may not be consistent in how they handle user id casing. Would be good to do some actual tests with crazy GitHub ids. @willingc
non_process
possible problems still with github user ids hi i am running jupyterhub with the default jupyterhub deploy scripts the ansible ones here is the config file i am still running into problems with github ids that have some uppercase chars and other chars such as not sure if this is a config issue or otherwise looks like we are lowercasing the ids but github may not be consistent in how they handle user id casing would be good to do some actual tests with crazy github ids willingc
0
10,817
13,609,291,295
IssuesEvent
2020-09-23 04:50:43
googleapis/java-monitoring-dashboards
https://api.github.com/repos/googleapis/java-monitoring-dashboards
closed
Dependency Dashboard
api: monitoring type: process
This issue contains a list of Renovate updates and their statuses. ## Open These updates have all been created already. Click a checkbox below to force a retry/rebase of any. - [ ] <!-- rebase-branch=renovate/com.google.cloud-google-cloud-monitoring-dashboard-0.x -->chore(deps): update dependency com.google.cloud:google-cloud-monitoring-dashboard to v0.2.1 - [ ] <!-- rebase-branch=renovate/com.google.cloud-google-cloud-monitoring-dashboard-1.x -->chore(deps): update dependency com.google.cloud:google-cloud-monitoring-dashboard to v1 - [ ] <!-- rebase-branch=renovate/com.google.cloud-libraries-bom-10.x -->chore(deps): update dependency com.google.cloud:libraries-bom to v10 - [ ] <!-- rebase-all-open-prs -->**Check this option to rebase all the above open PRs at once** --- - [ ] <!-- manual job -->Check this box to trigger a request for Renovate to run again on this repository
1.0
Dependency Dashboard - This issue contains a list of Renovate updates and their statuses. ## Open These updates have all been created already. Click a checkbox below to force a retry/rebase of any. - [ ] <!-- rebase-branch=renovate/com.google.cloud-google-cloud-monitoring-dashboard-0.x -->chore(deps): update dependency com.google.cloud:google-cloud-monitoring-dashboard to v0.2.1 - [ ] <!-- rebase-branch=renovate/com.google.cloud-google-cloud-monitoring-dashboard-1.x -->chore(deps): update dependency com.google.cloud:google-cloud-monitoring-dashboard to v1 - [ ] <!-- rebase-branch=renovate/com.google.cloud-libraries-bom-10.x -->chore(deps): update dependency com.google.cloud:libraries-bom to v10 - [ ] <!-- rebase-all-open-prs -->**Check this option to rebase all the above open PRs at once** --- - [ ] <!-- manual job -->Check this box to trigger a request for Renovate to run again on this repository
process
dependency dashboard this issue contains a list of renovate updates and their statuses open these updates have all been created already click a checkbox below to force a retry rebase of any chore deps update dependency com google cloud google cloud monitoring dashboard to chore deps update dependency com google cloud google cloud monitoring dashboard to chore deps update dependency com google cloud libraries bom to check this option to rebase all the above open prs at once check this box to trigger a request for renovate to run again on this repository
1
13,725
16,487,424,312
IssuesEvent
2021-05-24 20:14:58
darktable-org/darktable
https://api.github.com/repos/darktable-org/darktable
closed
const type qualifiers for image2d_t breaks OpenCL kernel compilation
priority: high scope: hardware support scope: image processing
I just installed the latest AMDGPU Linux drivers (21.10). Running darktable-cltest, I found that the OpenCL compilation was failing (the Intel IGC however managed to compile all kernels successfully). The issue was present even when I downgraded to an older version of AMDGPU (20.45). It failed with this error: ``` 0.407486 [opencl_load_program] could not load cached binary program, trying to compile source 0.407508 [opencl_load_program] successfully loaded program from '/usr/share/darktable/kernels/basic.cl' MD5: '23176e22ddb4db27410265238c646fbf' 0.452182 [opencl_build_program] could not build program: -11 0.452194 [opencl_build_program] BUILD STATUS: -2 0.452204 BUILD LOG: 0.452207 "/usr/share/darktable/kernels/color_conversion.h", line 69: error: a type qualifier is not allowed inline float lookup(read_only const image2d_t lut, const float x) ^ ``` Investigating further, I found that this is in accordance with the OpenCL specifications. According to https://www.khronos.org/registry/OpenCL/specs/3.0-unified/html/OpenCL_C.html#restrictions: > The type qualifiers const, restrict and volatile as defined by the C99 specification are supported. These qualifiers cannot be used with image2d_t, image3d_t, image2d_array_t, image2d_depth_t, image2d_array_depth_t, image1d_t, image1d_buffer_t and image1d_array_t types. Apparently, `image2d_t` cannot be of type `const`. Replacing all instances of `const image2d_t` with `image2d_t` in the OpenCL kernels fixed all the errors during kernel compilation.
1.0
const type qualifiers for image2d_t breaks OpenCL kernel compilation - I just installed the latest AMDGPU Linux drivers (21.10). Running darktable-cltest, I found that the OpenCL compilation was failing (the Intel IGC however managed to compile all kernels successfully). The issue was present even when I downgraded to an older version of AMDGPU (20.45). It failed with this error: ``` 0.407486 [opencl_load_program] could not load cached binary program, trying to compile source 0.407508 [opencl_load_program] successfully loaded program from '/usr/share/darktable/kernels/basic.cl' MD5: '23176e22ddb4db27410265238c646fbf' 0.452182 [opencl_build_program] could not build program: -11 0.452194 [opencl_build_program] BUILD STATUS: -2 0.452204 BUILD LOG: 0.452207 "/usr/share/darktable/kernels/color_conversion.h", line 69: error: a type qualifier is not allowed inline float lookup(read_only const image2d_t lut, const float x) ^ ``` Investigating further, I found that this is in accordance with the OpenCL specifications. According to https://www.khronos.org/registry/OpenCL/specs/3.0-unified/html/OpenCL_C.html#restrictions: > The type qualifiers const, restrict and volatile as defined by the C99 specification are supported. These qualifiers cannot be used with image2d_t, image3d_t, image2d_array_t, image2d_depth_t, image2d_array_depth_t, image1d_t, image1d_buffer_t and image1d_array_t types. Apparently, `image2d_t` cannot be of type `const`. Replacing all instances of `const image2d_t` with `image2d_t` in the OpenCL kernels fixed all the errors during kernel compilation.
process
const type qualifiers for t breaks opencl kernel compilation i just installed the latest amdgpu linux drivers running darktable cltest i found that the opencl compilation was failing the intel igc however managed to compile all kernels successfully the issue was present even when i downgraded to an older version of amdgpu it failed with this error could not load cached binary program trying to compile source successfully loaded program from usr share darktable kernels basic cl could not build program build status build log usr share darktable kernels color conversion h line error a type qualifier is not allowed inline float lookup read only const t lut const float x investigating further i found that this is in accordance with the opencl specifications according to the type qualifiers const restrict and volatile as defined by the specification are supported these qualifiers cannot be used with t t array t depth t array depth t t buffer t and array t types apparently t cannot be of type const replacing all instances of const t with t in the opencl kernels fixed all the errors during kernel compilation
1
140,799
18,920,840,884
IssuesEvent
2021-11-17 01:19:58
TreyM-WSS/concord
https://api.github.com/repos/TreyM-WSS/concord
opened
CVE-2021-3777 (High) detected in tmpl-1.0.4.tgz
security vulnerability
## CVE-2021-3777 - High Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>tmpl-1.0.4.tgz</b></p></summary> <p>JavaScript micro templates.</p> <p>Library home page: <a href="https://registry.npmjs.org/tmpl/-/tmpl-1.0.4.tgz">https://registry.npmjs.org/tmpl/-/tmpl-1.0.4.tgz</a></p> <p>Path to dependency file: concord/console2/package.json</p> <p>Path to vulnerable library: concord/console2/node_modules/tmpl/package.json</p> <p> Dependency Hierarchy: - react-scripts-3.4.1.tgz (Root Library) - babel-jest-24.9.0.tgz - transform-24.9.0.tgz - jest-haste-map-24.9.0.tgz - walker-1.0.7.tgz - makeerror-1.0.11.tgz - :x: **tmpl-1.0.4.tgz** (Vulnerable Library) <p>Found in HEAD commit: <a href="https://github.com/TreyM-WSS/concord/commit/813d76939d588ba0e6ad41d1ea02343eb32e21c1">813d76939d588ba0e6ad41d1ea02343eb32e21c1</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary> <p> nodejs-tmpl is vulnerable to Inefficient Regular Expression Complexity <p>Publish Date: 2021-09-15 <p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2021-3777>CVE-2021-3777</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>7.5</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: Low - Privileges Required: None - User Interaction: None - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: None - Integrity Impact: None - Availability Impact: High </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. 
</p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://github.com/daaku/nodejs-tmpl/releases/tag/v1.0.5">https://github.com/daaku/nodejs-tmpl/releases/tag/v1.0.5</a></p> <p>Release Date: 2021-09-15</p> <p>Fix Resolution: tmpl - 1.0.5</p> </p> </details> <p></p>
True
CVE-2021-3777 (High) detected in tmpl-1.0.4.tgz - ## CVE-2021-3777 - High Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>tmpl-1.0.4.tgz</b></p></summary> <p>JavaScript micro templates.</p> <p>Library home page: <a href="https://registry.npmjs.org/tmpl/-/tmpl-1.0.4.tgz">https://registry.npmjs.org/tmpl/-/tmpl-1.0.4.tgz</a></p> <p>Path to dependency file: concord/console2/package.json</p> <p>Path to vulnerable library: concord/console2/node_modules/tmpl/package.json</p> <p> Dependency Hierarchy: - react-scripts-3.4.1.tgz (Root Library) - babel-jest-24.9.0.tgz - transform-24.9.0.tgz - jest-haste-map-24.9.0.tgz - walker-1.0.7.tgz - makeerror-1.0.11.tgz - :x: **tmpl-1.0.4.tgz** (Vulnerable Library) <p>Found in HEAD commit: <a href="https://github.com/TreyM-WSS/concord/commit/813d76939d588ba0e6ad41d1ea02343eb32e21c1">813d76939d588ba0e6ad41d1ea02343eb32e21c1</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary> <p> nodejs-tmpl is vulnerable to Inefficient Regular Expression Complexity <p>Publish Date: 2021-09-15 <p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2021-3777>CVE-2021-3777</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>7.5</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: Low - Privileges Required: None - User Interaction: None - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: None - Integrity Impact: None - Availability Impact: High </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. 
</p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://github.com/daaku/nodejs-tmpl/releases/tag/v1.0.5">https://github.com/daaku/nodejs-tmpl/releases/tag/v1.0.5</a></p> <p>Release Date: 2021-09-15</p> <p>Fix Resolution: tmpl - 1.0.5</p> </p> </details> <p></p>
label: non_process
text:
cve high detected in tmpl tgz cve high severity vulnerability vulnerable library tmpl tgz javascript micro templates library home page a href path to dependency file concord package json path to vulnerable library concord node modules tmpl package json dependency hierarchy react scripts tgz root library babel jest tgz transform tgz jest haste map tgz walker tgz makeerror tgz x tmpl tgz vulnerable library found in head commit a href vulnerability details nodejs tmpl is vulnerable to inefficient regular expression complexity publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction none scope unchanged impact metrics confidentiality impact none integrity impact none availability impact high for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution tmpl
binary_label: 0
Unnamed: 0: 22,122
id: 30,666,531,058
type: IssuesEvent
created_at: 2023-07-25 18:43:45
repo: GSA/EDX
repo_url: https://api.github.com/repos/GSA/EDX
action: closed
title: Update Airtable based on Q4 edxcli scans
labels: process reporting website tracking
body:
Update the Airtable tab which tracks website performance (data comes from the edxcli scan). Instructions on the upload can be found [here](https://github.com/GSA/EDX/tree/main/Tools/edxcli#4-copy-data-into-airtable:~:text=folder%20within%20Drive-,4.%20Copy%20data%20into%20Airtable,-Import%20results%20of).
index: 1.0
text_combine:
Update Airtable based on Q4 edxcli scans - Update the Airtable tab which tracks website performance (data comes from the edxcli scan). Instructions on the upload can be found [here](https://github.com/GSA/EDX/tree/main/Tools/edxcli#4-copy-data-into-airtable:~:text=folder%20within%20Drive-,4.%20Copy%20data%20into%20Airtable,-Import%20results%20of).
label: process
text:
update airtable based on edxcli scans update the airtable tab which tracks website performance data comes from the edxcli scan instructions on the upload can be found
binary_label: 1
Unnamed: 0: 813,819
id: 30,474,048,633
type: IssuesEvent
created_at: 2023-07-17 15:18:29
repo: alkem-io/server
repo_url: https://api.github.com/repos/alkem-io/server
action: closed
title: BUG: Demo spaces / challenges are getting the same weight in results as active spaces
labels: bug server User High Priority
body:
**Describe the bug** Go to production + search on the term Impact. The score for love-alliance is 10, it should be 5: ![image.png](https://images.zenhubusercontent.com/5ecb9817aeb589610134bd43/89200570-2182-4078-af96-3e6730dad51c) **Expected behavior** Demo spaces should get half the points for a match, not the full points.
index: 1.0
text_combine:
BUG: Demo spaces / challenges are getting the same weight in results as active spaces - **Describe the bug** Go to production + search on the term Impact. The score for love-alliance is 10, it should be 5: ![image.png](https://images.zenhubusercontent.com/5ecb9817aeb589610134bd43/89200570-2182-4078-af96-3e6730dad51c) **Expected behavior** Demo spaces should get half the points for a match, not the full points.
label: non_process
text:
bug demo spaces challenges are getting the same weight in results as active spaces describe the bug go to production search on the term impact the score for love alliance is it should be expected behavior demo spaces should get half the points for a match not the full points
binary_label: 0
Unnamed: 0: 221,800
id: 17,027,459,840
type: IssuesEvent
created_at: 2021-07-03 21:02:30
repo: Riverscapes/riverscapes-tools
repo_url: https://api.github.com/repos/Riverscapes/riverscapes-tools
action: opened
title: VBET Metrics
labels: documentation question
body:
@philipbaileynar and @wally-mac, I've attached what I have so far for these metrics. Any feedback would be great, especially when it comes to the methods. It's pretty rough, but am I including enough information for this to be useful? Do I need to add more? Add less? Let me know. [Metrics.xlsx](https://github.com/Riverscapes/riverscapes-tools/files/6759002/Metrics.xlsx) [Calculation Methods.docx](https://github.com/Riverscapes/riverscapes-tools/files/6759003/Calculation.Methods.docx) I also have some general questions that various folks might have answers to. 1. I am having trouble with the [proportion VB by network category metric](https://github.com/Riverscapes/riverscapes-tools/discussions/108#:~:text=By%20network/riverscape%20category%20(e.g.%20perennial%2C%20intermittent%2C%20i.e.%20NHD%20FCODE)). Since the VBET outputs aren't very segmented, I haven't been able to find a way to select areas of the valley bottom that intersect with unique network types. @shelbysawyer and @lauren-herbine, it looks like there has been some discussion about how we can segment the valley bottom and I was wondering if you had access to a VBET output that has been segmented as laid out [here](https://github.com/Riverscapes/riverscapes-tools/discussions/108#discussioncomment-950491). 2. The historic LANDFIRE raster does not have a Veg Class attribute field. Is there another attribute I should use such as the 'GROUPVEG'? 3. How does Riverscape Length differ from Channel Length? Furthermore, what do we mean by Integrated Riverscape Width mean? 4. Should there be a Land Use Intensity raster, and if so where could I find it?
index: 1.0
text_combine:
VBET Metrics - @philipbaileynar and @wally-mac, I've attached what I have so far for these metrics. Any feedback would be great, especially when it comes to the methods. It's pretty rough, but am I including enough information for this to be useful? Do I need to add more? Add less? Let me know. [Metrics.xlsx](https://github.com/Riverscapes/riverscapes-tools/files/6759002/Metrics.xlsx) [Calculation Methods.docx](https://github.com/Riverscapes/riverscapes-tools/files/6759003/Calculation.Methods.docx) I also have some general questions that various folks might have answers to. 1. I am having trouble with the [proportion VB by network category metric](https://github.com/Riverscapes/riverscapes-tools/discussions/108#:~:text=By%20network/riverscape%20category%20(e.g.%20perennial%2C%20intermittent%2C%20i.e.%20NHD%20FCODE)). Since the VBET outputs aren't very segmented, I haven't been able to find a way to select areas of the valley bottom that intersect with unique network types. @shelbysawyer and @lauren-herbine, it looks like there has been some discussion about how we can segment the valley bottom and I was wondering if you had access to a VBET output that has been segmented as laid out [here](https://github.com/Riverscapes/riverscapes-tools/discussions/108#discussioncomment-950491). 2. The historic LANDFIRE raster does not have a Veg Class attribute field. Is there another attribute I should use such as the 'GROUPVEG'? 3. How does Riverscape Length differ from Channel Length? Furthermore, what do we mean by Integrated Riverscape Width mean? 4. Should there be a Land Use Intensity raster, and if so where could I find it?
label: non_process
text:
vbet metrics philipbaileynar and wally mac i ve attached what i have so far for these metrics any feedback would be great especially when it comes to the methods it s pretty rough but am i including enough information for this to be useful do i need to add more add less let me know i also have some general questions that various folks might have answers to i am having trouble with the since the vbet outputs aren t very segmented i haven t been able to find a way to select areas of the valley bottom that intersect with unique network types shelbysawyer and lauren herbine it looks like there has been some discussion about how we can segment the valley bottom and i was wondering if you had access to a vbet output that has been segmented as laid out the historic landfire raster does not have a veg class attribute field is there another attribute i should use such as the groupveg how does riverscape length differ from channel length furthermore what do we mean by integrated riverscape width mean should there be a land use intensity raster and if so where could i find it
binary_label: 0
Unnamed: 0: 133,397
id: 5,202,535,556
type: IssuesEvent
created_at: 2017-01-24 09:52:25
repo: Promact/promact-oauth-server
repo_url: https://api.github.com/repos/Promact/promact-oauth-server
action: closed
title: Restructure OAuth External Login Flow
labels: Done high-priority OAuth Ready Support
body:
Right now its not flowing the rule of OAuth 2.0 flow. So I need to restructure the OAuth flow and implement flow as OAuth 2.0 rules. Following this link - https://www.digitalocean.com/community/tutorials/an-introduction-to-oauth-2 This is the current flow: 1. When user click the link, Login with Promact, then it will call the point 2 2. Authorization Code Link : The above link call a API and it will redirect to https://promactoauth.azurewebsites.net/OAuth/ExternalLogin?clientId=PromactAppClientId 3. User Authorizes Application : After successfully completion of point 2, server will redirect to OAuth server. In OAuth checking of app details with clientId will be done then user need to login. 4. OAuth Request to Application : After successfully completion of point 3, An request(server side call) will be send from OAuth to application with refresh token for getting app’s secret and redirect Uri then OAuth will receive the response and check for app secret. 5. Application Receives User Details and Access token : After successfully completion of point 4, Application will receive user details and add user to application and store the data. My proposal for new structure: 1. When user click the link, Login with Promact, then it will call the point 2. 2. Authorization Code Link : The above link call a API and it will redirect to https://promactoauth.azurewebsites.net/OAuth/ExternalLogin?clientId=PromactAppClientId&redirectUri=redirectUri Example: https://promactoauth.azurewebsites.net/OAuth/ExternalLogin?clientId={PromactAppClientId}&redirectUri=”https://promactslack.azurewebsites.net/OAuth/Authorization?code={AuthorizationCode}” PromactAppClientId = Promact app’s client Id redirectUri = external login call Url 3. User Authorizes Application : After successfully completion of point 2, server will redirect to OAuth server. In OAuth checking of app details with clientId and redirectUri will be done then user need to login. 4. 
Application Receives Authorization Code : After successfully completion of point 3, server will redirect to Point 2’s redirectUri with authorization code. Example : https://promactslack.azurewebsites.net/OAuth/Authorization?code=DFSD45FGD45FG11 5. Application Requests Access Token : After successfully completion of point 4, Server will call a API and application will request the OAuth for access token and user details Http Request Url will be like this https://promactoauth.azurewebsites.net/OAuth/AccessToken?clientId=PromactAppClientId&authorizationcode=AuthorizationCode&clientSecret=PromactAppClientSecret Example : https://promactoauth.azurewebsites.net/OAuth/AccessToken?clientId={PromactAppClientId}&authorizationcode={AuthorizationCode}&clientSecret={PromactAppClientSecret} PromactAppClientId=Promact app’s clientId AuthorizationCode = Point 4 code PromactAppClientSecret – Promact app’s client secret 6. Application Receives Access Token : After successfully completion of point 5 and authorization code, client secret and details match then, Application will receive all required details of user and create and user for external login and store its access token of Promact.
index: 1.0
text_combine:
Restructure OAuth External Login Flow - Right now its not flowing the rule of OAuth 2.0 flow. So I need to restructure the OAuth flow and implement flow as OAuth 2.0 rules. Following this link - https://www.digitalocean.com/community/tutorials/an-introduction-to-oauth-2 This is the current flow: 1. When user click the link, Login with Promact, then it will call the point 2 2. Authorization Code Link : The above link call a API and it will redirect to https://promactoauth.azurewebsites.net/OAuth/ExternalLogin?clientId=PromactAppClientId 3. User Authorizes Application : After successfully completion of point 2, server will redirect to OAuth server. In OAuth checking of app details with clientId will be done then user need to login. 4. OAuth Request to Application : After successfully completion of point 3, An request(server side call) will be send from OAuth to application with refresh token for getting app’s secret and redirect Uri then OAuth will receive the response and check for app secret. 5. Application Receives User Details and Access token : After successfully completion of point 4, Application will receive user details and add user to application and store the data. My proposal for new structure: 1. When user click the link, Login with Promact, then it will call the point 2. 2. Authorization Code Link : The above link call a API and it will redirect to https://promactoauth.azurewebsites.net/OAuth/ExternalLogin?clientId=PromactAppClientId&redirectUri=redirectUri Example: https://promactoauth.azurewebsites.net/OAuth/ExternalLogin?clientId={PromactAppClientId}&redirectUri=”https://promactslack.azurewebsites.net/OAuth/Authorization?code={AuthorizationCode}” PromactAppClientId = Promact app’s client Id redirectUri = external login call Url 3. User Authorizes Application : After successfully completion of point 2, server will redirect to OAuth server. In OAuth checking of app details with clientId and redirectUri will be done then user need to login. 4. 
Application Receives Authorization Code : After successfully completion of point 3, server will redirect to Point 2’s redirectUri with authorization code. Example : https://promactslack.azurewebsites.net/OAuth/Authorization?code=DFSD45FGD45FG11 5. Application Requests Access Token : After successfully completion of point 4, Server will call a API and application will request the OAuth for access token and user details Http Request Url will be like this https://promactoauth.azurewebsites.net/OAuth/AccessToken?clientId=PromactAppClientId&authorizationcode=AuthorizationCode&clientSecret=PromactAppClientSecret Example : https://promactoauth.azurewebsites.net/OAuth/AccessToken?clientId={PromactAppClientId}&authorizationcode={AuthorizationCode}&clientSecret={PromactAppClientSecret} PromactAppClientId=Promact app’s clientId AuthorizationCode = Point 4 code PromactAppClientSecret – Promact app’s client secret 6. Application Receives Access Token : After successfully completion of point 5 and authorization code, client secret and details match then, Application will receive all required details of user and create and user for external login and store its access token of Promact.
label: non_process
text:
restructure oauth external login flow right now its not flowing the rule of oauth flow so i need to restructure the oauth flow and implement flow as oauth rules following this link this is the current flow when user click the link login with promact then it will call the point authorization code link the above link call a api and it will redirect to user authorizes application after successfully completion of point server will redirect to oauth server in oauth checking of app details with clientid will be done then user need to login oauth request to application after successfully completion of point an request server side call will be send from oauth to application with refresh token for getting app’s secret and redirect uri then oauth will receive the response and check for app secret application receives user details and access token after successfully completion of point application will receive user details and add user to application and store the data my proposal for new structure when user click the link login with promact then it will call the point authorization code link the above link call a api and it will redirect to example promactappclientid promact app’s client id redirecturi external login call url user authorizes application after successfully completion of point server will redirect to oauth server in oauth checking of app details with clientid and redirecturi will be done then user need to login application receives authorization code after successfully completion of point server will redirect to point ’s redirecturi with authorization code example application requests access token after successfully completion of point server will call a api and application will request the oauth for access token and user details http request url will be like this example promactappclientid promact app’s clientid authorizationcode point code promactappclientsecret – promact app’s client secret application receives access token after successfully completion of 
point and authorization code client secret and details match then application will receive all required details of user and create and user for external login and store its access token of promact
binary_label: 0
Unnamed: 0: 10,242
id: 13,099,331,331
type: IssuesEvent
created_at: 2020-08-03 21:22:08
repo: unicode-org/icu4x
repo_url: https://api.github.com/repos/unicode-org/icu4x
action: opened
title: Multi-layered directory structure
labels: C-process T-question discuss
body:
In #18 we decided to make a top-level `/components` directory. I like this, but I'm also thinking that it might be good to have an extra layer of abstraction. We've come across several types of components so far: 1. Core i18n components: Locale, PluralRules, NumberFormat, etc. 2. Non-i18n utilities: FixedDecimal, Writeable, etc. 3. Data-related code: DataProvider, CldrJsonDataProvider, etc. 4. Data dump (JSON resources) How should we structure these in the repository? FYI, ICU4C is split into four categories: 1. common (UnicodeString, UnicodeSet, etc.) 2. i18n (NumberFormat, PluralRules, etc.) 3. io (u_sprintf, etc.) 4. layoutex (deprecated layout engine) Here's one possible layout, with an emphasis on structure: - `/` - `components/` - `locales/` - `language-info/` (crate) - `language-matcher/` (crate) - `locale/` (crate) - `numbers/` - `fixed-decimal/` (crate) - `number-format/` (crate) - `unicode/` - `unicode-set/` (crate) - `unicode-props/` (crate) - `udata/` - `cldr-json-data-provider/` (crate) - `data-provider/` (crate) - `fs-data-provider/` (crate) - `fs-data-exporter/` (crate) - `strings/` - `writeable/` (crate) - `resources/` - `json/` (root of JSON data directory, latest version) Here's another possible layout, with an emphasis on flatness: - `/` - `components/` - `language-info/` (crate) - `language-matcher/` (crate) - `locale/` (crate) - `number-format/` (crate) - `unicode-props/` (crate) - `udata/` - `cldr-json-data-provider/` (crate) - `data-provider/` (crate) - `fs-data-provider/` (crate) - `fs-data-exporter/` (crate) - `json-data/` (root of JSON data directory, latest version) - `utils/` - `writeable/` (crate) - `fixed-decimal/` (crate) - `unicode-set/` (crate) Thoughts? @Manishearth
index: 1.0
text_combine:
Multi-layered directory structure - In #18 we decided to make a top-level `/components` directory. I like this, but I'm also thinking that it might be good to have an extra layer of abstraction. We've come across several types of components so far: 1. Core i18n components: Locale, PluralRules, NumberFormat, etc. 2. Non-i18n utilities: FixedDecimal, Writeable, etc. 3. Data-related code: DataProvider, CldrJsonDataProvider, etc. 4. Data dump (JSON resources) How should we structure these in the repository? FYI, ICU4C is split into four categories: 1. common (UnicodeString, UnicodeSet, etc.) 2. i18n (NumberFormat, PluralRules, etc.) 3. io (u_sprintf, etc.) 4. layoutex (deprecated layout engine) Here's one possible layout, with an emphasis on structure: - `/` - `components/` - `locales/` - `language-info/` (crate) - `language-matcher/` (crate) - `locale/` (crate) - `numbers/` - `fixed-decimal/` (crate) - `number-format/` (crate) - `unicode/` - `unicode-set/` (crate) - `unicode-props/` (crate) - `udata/` - `cldr-json-data-provider/` (crate) - `data-provider/` (crate) - `fs-data-provider/` (crate) - `fs-data-exporter/` (crate) - `strings/` - `writeable/` (crate) - `resources/` - `json/` (root of JSON data directory, latest version) Here's another possible layout, with an emphasis on flatness: - `/` - `components/` - `language-info/` (crate) - `language-matcher/` (crate) - `locale/` (crate) - `number-format/` (crate) - `unicode-props/` (crate) - `udata/` - `cldr-json-data-provider/` (crate) - `data-provider/` (crate) - `fs-data-provider/` (crate) - `fs-data-exporter/` (crate) - `json-data/` (root of JSON data directory, latest version) - `utils/` - `writeable/` (crate) - `fixed-decimal/` (crate) - `unicode-set/` (crate) Thoughts? @Manishearth
label: process
text:
multi layered directory structure in we decided to make a top level components directory i like this but i m also thinking that it might be good to have an extra layer of abstraction we ve come across several types of components so far core components locale pluralrules numberformat etc non utilities fixeddecimal writeable etc data related code dataprovider cldrjsondataprovider etc data dump json resources how should we structure these in the repository fyi is split into four categories common unicodestring unicodeset etc numberformat pluralrules etc io u sprintf etc layoutex deprecated layout engine here s one possible layout with an emphasis on structure components locales language info crate language matcher crate locale crate numbers fixed decimal crate number format crate unicode unicode set crate unicode props crate udata cldr json data provider crate data provider crate fs data provider crate fs data exporter crate strings writeable crate resources json root of json data directory latest version here s another possible layout with an emphasis on flatness components language info crate language matcher crate locale crate number format crate unicode props crate udata cldr json data provider crate data provider crate fs data provider crate fs data exporter crate json data root of json data directory latest version utils writeable crate fixed decimal crate unicode set crate thoughts manishearth
binary_label: 1
Unnamed: 0: 533
id: 3,000,101,647
type: IssuesEvent
created_at: 2015-07-23 22:39:59
repo: zhengj2007/BFO-test
repo_url: https://api.github.com/repos/zhengj2007/BFO-test
action: closed
title: The svn check out file is not available
labels: imported Type-BFO2-Process Usability wontfix
body:
_From [linik...@gmail.com](https://code.google.com/u/109800199443061799038/) on September 10, 2012 15:02:19_ I am trying to download BFO2 to have a look. From the source menu, it is said that 'svn checkout http://bfo.googlecode.com/svn/trunk/ bfo-read-only' However there is no bfo-read-only file available to check out. I did download all the folders under trunk/ though. There are a lot of files there. Have no clue which one is the correct one to check. Could you please fix the link? Thanks, Asiyah _Original issue: http://code.google.com/p/bfo/issues/detail?id=126_
index: 1.0
text_combine:
The svn check out file is not available - _From [linik...@gmail.com](https://code.google.com/u/109800199443061799038/) on September 10, 2012 15:02:19_ I am trying to download BFO2 to have a look. From the source menu, it is said that 'svn checkout http://bfo.googlecode.com/svn/trunk/ bfo-read-only' However there is no bfo-read-only file available to check out. I did download all the folders under trunk/ though. There are a lot of files there. Have no clue which one is the correct one to check. Could you please fix the link? Thanks, Asiyah _Original issue: http://code.google.com/p/bfo/issues/detail?id=126_
label: process
text:
the svn check out file is not available from on september i am trying to download to have a look from the source menu it is said that svn checkout bfo read only however there is no bfo read only file available to check out i did download all the folders under trunk though there are a lot of files there have no clue which one is the correct one to check could you please fix the link thanks asiyah original issue
binary_label: 1
Unnamed: 0: 155,598
id: 24,488,336,806
type: IssuesEvent
created_at: 2022-10-09 18:42:13
repo: Altinn/altinn-studio
repo_url: https://api.github.com/repos/Altinn/altinn-studio
action: closed
title: Make it easier to create reusable rule handlers for dynamics
labels: kind/feature-request status/wontfix solution/studio/designer area/logic org/dat
body:
## Is your feature request related to a problem? Please describe It is inconvenient to not have access to other input types other than references to the form fields. One example could be that you want to match a field content to a given text and render other fields based on if text matches or not. Today this causes me to create a lot of "duplicated" code like `textEqualsAS`, `textEqualsANS`, `textEqualsNUF` and etc, where it could be better solved with a generic `textEquals` method that could have self defined input string. ## Describe the solution you'd like - [ ] Add a name field for each conditional rendering rule (since it will be hard to distinguish multiple rules with the same method behind) and use this where the rules are listed. - [ ] Allow to define custom input type where the altinn studio user could enter a string (maybe number and bool also?) - [ ] Document how you can share and use a set of reusable rules/code. Create a some examples to start with, find a spot to put these where people can find it and encourage users to contribute. ## Describe alternatives you've considered - Learn our altinn studio users to add the needed code themselves (risk of introducing errors) - Just add the duplicate code (harder to maintain) ## Additional context **Mockup image:** ![image](https://user-images.githubusercontent.com/554713/102601394-67226b80-4120-11eb-891a-f77836628524.png) **Example of how the conditional rule handler could look:** ``` var conditionalRuleHandlerHelper = { textEquals: () => { return { actual: { label: "Faktisk", type: "input" }, expected: { label: "Forventet", type: "string" } }; }, } ``` by checking type for object/string this could be made backwards compatible. **Minimal example on how to add in shared code:** ![image](https://user-images.githubusercontent.com/554713/102601726-db5d0f00-4120-11eb-969d-ec67e0d390c5.png)
index: 1.0
text_combine:
Make it easier to create reusable rule handlers for dynamics - ## Is your feature request related to a problem? Please describe It is inconvenient to not have access to other input types other than references to the form fields. One example could be that you want to match a field content to a given text and render other fields based on if text matches or not. Today this causes me to create a lot of "duplicated" code like `textEqualsAS`, `textEqualsANS`, `textEqualsNUF` and etc, where it could be better solved with a generic `textEquals` method that could have self defined input string. ## Describe the solution you'd like - [ ] Add a name field for each conditional rendering rule (since it will be hard to distinguish multiple rules with the same method behind) and use this where the rules are listed. - [ ] Allow to define custom input type where the altinn studio user could enter a string (maybe number and bool also?) - [ ] Document how you can share and use a set of reusable rules/code. Create a some examples to start with, find a spot to put these where people can find it and encourage users to contribute. ## Describe alternatives you've considered - Learn our altinn studio users to add the needed code themselves (risk of introducing errors) - Just add the duplicate code (harder to maintain) ## Additional context **Mockup image:** ![image](https://user-images.githubusercontent.com/554713/102601394-67226b80-4120-11eb-891a-f77836628524.png) **Example of how the conditional rule handler could look:** ``` var conditionalRuleHandlerHelper = { textEquals: () => { return { actual: { label: "Faktisk", type: "input" }, expected: { label: "Forventet", type: "string" } }; }, } ``` by checking type for object/string this could be made backwards compatible. **Minimal example on how to add in shared code:** ![image](https://user-images.githubusercontent.com/554713/102601726-db5d0f00-4120-11eb-969d-ec67e0d390c5.png)
label: non_process
text:
make it easier to create reusable rule handlers for dynamics is your feature request related to a problem please describe it is inconvenient to not have access to other input types other than references to the form fields one example could be that you want to match a field content to a given text and render other fields based on if text matches or not today this causes me to create a lot of duplicated code like textequalsas textequalsans textequalsnuf and etc where it could be better solved with a generic textequals method that could have self defined input string describe the solution you d like add a name field for each conditional rendering rule since it will be hard to distinguish multiple rules with the same method behind and use this where the rules are listed allow to define custom input type where the altinn studio user could enter a string maybe number and bool also document how you can share and use a set of reusable rules code create a some examples to start with find a spot to put these where people can find it and encourage users to contribute describe alternatives you ve considered learn our altinn studio users to add the needed code themselves risk of introducing errors just add the duplicate code harder to maintain additional context mockup image example of how the conditional rule handler could look var conditionalrulehandlerhelper textequals return actual label faktisk type input expected label forventet type string by checking type for object string this could be made backwards compatible minimal example on how to add in shared code
binary_label: 0