Unnamed: 0
int64
3
832k
id
float64
2.49B
32.1B
type
stringclasses
1 value
created_at
stringlengths
19
19
repo
stringlengths
7
112
repo_url
stringlengths
36
141
action
stringclasses
3 values
title
stringlengths
2
742
labels
stringlengths
4
431
body
stringlengths
5
239k
index
stringclasses
10 values
text_combine
stringlengths
96
240k
label
stringclasses
2 values
text
stringlengths
96
200k
binary_label
int64
0
1
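The block above is a flattened per-column summary (column name, dtype, then min/max string length for `stringlengths` columns or distinct-value count for `stringclasses` columns). A similar summary can be produced for any pandas DataFrame; the sketch below uses a tiny stand-in frame, since the original data file is not available — the row values are illustrative, not taken from the real dataset.

```python
import pandas as pd

# Tiny stand-in frame mirroring a few of the columns above;
# the row values are illustrative, not the real dataset rows.
df = pd.DataFrame({
    "type": ["IssuesEvent", "IssuesEvent"],
    "title": ["Update old help URLs", "too many appressorium terms"],
    "label": ["usab", "non_usab"],
    "binary_label": [1, 0],
})

# Per-column summary in the same spirit as the preview:
# string columns -> length range and number of distinct values,
# numeric columns -> dtype plus min/max.
for col in df.columns:
    s = df[col]
    if s.dtype == object:
        lengths = s.str.len()
        print(f"{col}: stringlengths {lengths.min()}..{lengths.max()}, "
              f"{s.nunique()} values")
    else:
        print(f"{col}: {s.dtype} min={s.min()} max={s.max()}")
```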
449,460
12,968,776,009
IssuesEvent
2020-07-21 06:31:44
webcompat/web-bugs
https://api.github.com/repos/webcompat/web-bugs
closed
accounts.firefox.com - design is broken
browser-firefox-mobile engine-gecko priority-normal
<!-- @browser: Firefox Mobile 79.0 --> <!-- @ua_header: Mozilla/5.0 (Android 5.1; Mobile; rv:79.0) Gecko/79.0 Firefox/79.0 --> <!-- @reported_with: desktop-reporter --> <!-- @public_url: https://github.com/webcompat/web-bugs/issues/55632 --> **URL**: https://accounts.firefox.com/reset_password_verified?uid=a3b218bdba884245895fcc0de0f81f3b&token=8f8f8e95fb953a243ad7cc876fcc95ba340217985fd275c62b0044a3a24b0f12&code=1636968044d6c63a6300d84b3769efba&email=osipovich.dim%40gmail.com&service=a2270f727f45f648&resume=eyJkZXZpY2VJZCI6ImNhZjRjYWVjODg0ODQzMDlhMjk0NDFhZmVlYzE4YmVlIiwiZW1haWwiOiJvc2lwb3ZpY2guZGltQGdtYWlsLmNvbSIsImVudHJ5cG9pbnQiOm51bGwsImVudHJ5cG9pbnRFeHBlcmltZW50IjpudWxsLCJlbnRyeXBvaW50VmFyaWF0aW9uIjpudWxsLCJmbG93QmVnaW4iOjE1OTUxNjMxMDA0NDEsImZsb3dJZCI6IjI1MDg2NzVhN2U5OThmNTIwMzY5OGRiMDk5NjI1MDliMWI2MzQ5NjE0NDM1ZGI1NTBlZDU0Yzc0YjdkOWIxMWEiLCJwbGFuSWQiOm51bGwsInByb2R1Y3RJZCI6bnVsbCwicmVzZXRQYXNzd29yZENvbmZpcm0iOnRydWUsInN0eWxlIjpudWxsLCJ1bmlxdWVVc2VySWQiOiI4YTlmODFmZi01MmE0LTQ3NzctOWI2OC03NTU2YWFkNTNmOWUiLCJ1dG1DYW1wYWlnbiI6bnVsbCwidXRtQ29udGVudCI6bnVsbCwidXRtTWVkaXVtIjpudWxsLCJ1dG1Tb3VyY2UiOm51bGwsInV0bVRlcm0iOm51bGx9&emailToHashWith=osipovich.dim%40gmail.com&utm_medium=email&utm_campaign=fx-forgot-password&utm_content=fx-reset-password **Browser / Version**: Firefox Mobile 79.0 **Operating System**: Android 5.1 **Tested Another Browser**: Yes Chrome **Problem type**: Design is broken **Description**: Items are overlapped **Steps to Reproduce**: Web app not working.video no reload <details> <summary>Browser Configuration</summary> <ul> <li>gfx.webrender.all: false</li><li>gfx.webrender.blob-images: true</li><li>gfx.webrender.enabled: false</li><li>image.mem.shared: true</li><li>buildID: 20200713203149</li><li>channel: beta</li><li>hasTouchScreen: true</li> </ul> </details> _From [webcompat.com](https://webcompat.com/) with ❤️_
1.0
accounts.firefox.com - design is broken - <!-- @browser: Firefox Mobile 79.0 --> <!-- @ua_header: Mozilla/5.0 (Android 5.1; Mobile; rv:79.0) Gecko/79.0 Firefox/79.0 --> <!-- @reported_with: desktop-reporter --> <!-- @public_url: https://github.com/webcompat/web-bugs/issues/55632 --> **URL**: https://accounts.firefox.com/reset_password_verified?uid=a3b218bdba884245895fcc0de0f81f3b&token=8f8f8e95fb953a243ad7cc876fcc95ba340217985fd275c62b0044a3a24b0f12&code=1636968044d6c63a6300d84b3769efba&email=osipovich.dim%40gmail.com&service=a2270f727f45f648&resume=eyJkZXZpY2VJZCI6ImNhZjRjYWVjODg0ODQzMDlhMjk0NDFhZmVlYzE4YmVlIiwiZW1haWwiOiJvc2lwb3ZpY2guZGltQGdtYWlsLmNvbSIsImVudHJ5cG9pbnQiOm51bGwsImVudHJ5cG9pbnRFeHBlcmltZW50IjpudWxsLCJlbnRyeXBvaW50VmFyaWF0aW9uIjpudWxsLCJmbG93QmVnaW4iOjE1OTUxNjMxMDA0NDEsImZsb3dJZCI6IjI1MDg2NzVhN2U5OThmNTIwMzY5OGRiMDk5NjI1MDliMWI2MzQ5NjE0NDM1ZGI1NTBlZDU0Yzc0YjdkOWIxMWEiLCJwbGFuSWQiOm51bGwsInByb2R1Y3RJZCI6bnVsbCwicmVzZXRQYXNzd29yZENvbmZpcm0iOnRydWUsInN0eWxlIjpudWxsLCJ1bmlxdWVVc2VySWQiOiI4YTlmODFmZi01MmE0LTQ3NzctOWI2OC03NTU2YWFkNTNmOWUiLCJ1dG1DYW1wYWlnbiI6bnVsbCwidXRtQ29udGVudCI6bnVsbCwidXRtTWVkaXVtIjpudWxsLCJ1dG1Tb3VyY2UiOm51bGwsInV0bVRlcm0iOm51bGx9&emailToHashWith=osipovich.dim%40gmail.com&utm_medium=email&utm_campaign=fx-forgot-password&utm_content=fx-reset-password **Browser / Version**: Firefox Mobile 79.0 **Operating System**: Android 5.1 **Tested Another Browser**: Yes Chrome **Problem type**: Design is broken **Description**: Items are overlapped **Steps to Reproduce**: Web app not working.video no reload <details> <summary>Browser Configuration</summary> <ul> <li>gfx.webrender.all: false</li><li>gfx.webrender.blob-images: true</li><li>gfx.webrender.enabled: false</li><li>image.mem.shared: true</li><li>buildID: 20200713203149</li><li>channel: beta</li><li>hasTouchScreen: true</li> </ul> </details> _From [webcompat.com](https://webcompat.com/) with ❤️_
non_usab
accounts firefox com design is broken url browser version firefox mobile operating system android tested another browser yes chrome problem type design is broken description items are overlapped steps to reproduce web app not working video no reload browser configuration gfx webrender all false gfx webrender blob images true gfx webrender enabled false image mem shared true buildid channel beta hastouchscreen true from with ❤️
0
4,864
3,897,238,997
IssuesEvent
2016-04-16 09:02:42
lionheart/openradar-mirror
https://api.github.com/repos/lionheart/openradar-mirror
opened
15759378: Xcode 5.0.2: Using ⌘-D to duplicate a UITableViewController embedded in a UINavigationController in IB makes UINavigationItem of duplicate "unselectable"
classification:ui/usability reproducible:always status:open
#### Description Summary: Using ⌘-D to duplicate a UITableViewController embedded in a UINavigationController in IB makes UINavigationItem of duplicate "unselectable" in the canvas window until one closes and reopens the project. Steps to Reproduce: 1. Open IB 2. Create a UITableViewController 3. Embed in a UINavigationController 4. Drag a bar button to the right button slot 5. Duplicate (⌘-D) the UITableViewController 6. Control-drag from the right bar button on the first controller to PUSH segue to the new controller 7. Drag a bar button to the right button slot on the new controller Expected Results: Similar to the working result after #4 Actual Results: No visual indication of ability to drag and drop button. In fact, if one DOES drop the button it is placed in IB as a toolbar button, NOT a navbar button. Notes: Mavericks 10.9.1 Closing the project and re-opening works around the issue - Product Version: Xcode 5.0.2 (5A3005) Created: 2014-01-07T02:28:24.447718 Originated: 2014-01-07T13:28:00 Open Radar Link: http://www.openradar.me/15759378
True
15759378: Xcode 5.0.2: Using ⌘-D to duplicate a UITableViewController embedded in a UINavigationController in IB makes UINavigationItem of duplicate "unselectable" - #### Description Summary: Using ⌘-D to duplicate a UITableViewController embedded in a UINavigationController in IB makes UINavigationItem of duplicate "unselectable" in the canvas window until one closes and reopens the project. Steps to Reproduce: 1. Open IB 2. Create a UITableViewController 3. Embed in a UINavigationController 4. Drag a bar button to the right button slot 5. Duplicate (⌘-D) the UITableViewController 6. Control-drag from the right bar button on the first controller to PUSH segue to the new controller 7. Drag a bar button to the right button slot on the new controller Expected Results: Similar to the working result after #4 Actual Results: No visual indication of ability to drag and drop button. In fact, if one DOES drop the button it is placed in IB as a toolbar button, NOT a navbar button. Notes: Mavericks 10.9.1 Closing the project and re-opening works around the issue - Product Version: Xcode 5.0.2 (5A3005) Created: 2014-01-07T02:28:24.447718 Originated: 2014-01-07T13:28:00 Open Radar Link: http://www.openradar.me/15759378
usab
xcode using ⌘ d to duplicate a uitableviewcontroller embedded in a uinavigationcontroller in ib makes uinavigationitem of duplicate unselectable description summary using ⌘ d to duplicate a uitableviewcontroller embedded in a uinavigationcontroller in ib makes uinavigationitem of duplicate unselectable in the canvas window until one closes and reopens the project steps to reproduce open ib create a uitableviewcontroller embed in a uinavigationcontroller drag a bar button to the right button slot duplicate ⌘ d the uitableviewcontroller control drag from the right bar button on the first controller to push segue to the new controller drag a bar button to the right button slot on the new controller expected results similar to the working result after actual results no visual indication of ability to drag and drop button in fact if one does drop the button it is placed in ib as a toolbar button not a navbar button notes mavericks closing the project and re opening works around the issue product version xcode created originated open radar link
1
257,063
22,144,015,531
IssuesEvent
2022-06-03 09:56:48
MohistMC/Mohist
https://api.github.com/repos/MohistMC/Mohist
closed
[1.16.5]black mod list option not work properly
1.16.5 Wait Needs Testing
<!-- ISSUE_TEMPLATE_3 -> IMPORTANT: DO NOT DELETE THIS LINE.--> <!-- Thank you for reporting ! Please note that issues can take a lot of time to be fixed and there is no eta.--> <!-- If you don't know where to upload your logs and crash reports, you can use these websites : --> <!-- https://gist.github.com (recommended) --> <!-- https://mclo.gs --> <!-- https://haste.mohistmc.com --> <!-- https://pastebin.com --> <!-- TO FILL THIS TEMPLATE, YOU NEED TO REPLACE THE {} BY WHAT YOU WANT --> **Minecraft Version :** 1.16.5 **Mohist Version :** 1.16.5-997 **Operating System :** windows server 2022 **Logs :** none **Mod list :** • minecraft mohist-1.16.5-997-server.jar : minecraft (1.16.5) - 1 • maven_libs mohist-1.16.5-997-universal.jar : forge (36.2.35) - 1 **Plugin list :** none **Description of issue :** The option of `modsblacklist`(which is in the mohist.yml) may not work properly, when the `list` parameter accept a list, the element in the list can't be matched by single, which I mean, if I set the parameter like this: ``` forge: modsblacklist: enable: true list: xray,autoattack kickmessage: Use of unauthorized mods ``` or ``` forge: modsblacklist: enable: true list: - xray - autoattact kickmessage: Use of unauthorized mods ``` as well as this after reload: ``` forge: modsblacklist: enable: true list: '[xray, autoattact]' kickmessage: Use of unauthorized mods ``` When a client with only `xray` but no `autoattack` loaded, it can pass the blacklist check and join the server, but what I expect is that the client with any mod listed by config should be blocked. And if I set the parameter as below: ``` forge: modsblacklist: enable: true list: xray kickmessage: Use of unauthorized mods ``` It seems work properly that server could block any client with xray mod loaded. So the problem is, how can I set the blacklist compare method to `contains any in list` but not `equal all`?
1.0
[1.16.5]black mod list option not work properly - <!-- ISSUE_TEMPLATE_3 -> IMPORTANT: DO NOT DELETE THIS LINE.--> <!-- Thank you for reporting ! Please note that issues can take a lot of time to be fixed and there is no eta.--> <!-- If you don't know where to upload your logs and crash reports, you can use these websites : --> <!-- https://gist.github.com (recommended) --> <!-- https://mclo.gs --> <!-- https://haste.mohistmc.com --> <!-- https://pastebin.com --> <!-- TO FILL THIS TEMPLATE, YOU NEED TO REPLACE THE {} BY WHAT YOU WANT --> **Minecraft Version :** 1.16.5 **Mohist Version :** 1.16.5-997 **Operating System :** windows server 2022 **Logs :** none **Mod list :** • minecraft mohist-1.16.5-997-server.jar : minecraft (1.16.5) - 1 • maven_libs mohist-1.16.5-997-universal.jar : forge (36.2.35) - 1 **Plugin list :** none **Description of issue :** The option of `modsblacklist`(which is in the mohist.yml) may not work properly, when the `list` parameter accept a list, the element in the list can't be matched by single, which I mean, if I set the parameter like this: ``` forge: modsblacklist: enable: true list: xray,autoattack kickmessage: Use of unauthorized mods ``` or ``` forge: modsblacklist: enable: true list: - xray - autoattact kickmessage: Use of unauthorized mods ``` as well as this after reload: ``` forge: modsblacklist: enable: true list: '[xray, autoattact]' kickmessage: Use of unauthorized mods ``` When a client with only `xray` but no `autoattack` loaded, it can pass the blacklist check and join the server, but what I expect is that the client with any mod listed by config should be blocked. And if I set the parameter as below: ``` forge: modsblacklist: enable: true list: xray kickmessage: Use of unauthorized mods ``` It seems work properly that server could block any client with xray mod loaded. So the problem is, how can I set the blacklist compare method to `contains any in list` but not `equal all`?
non_usab
black mod list option not work properly important do not delete this line minecraft version mohist version operating system windows server logs none mod list • minecraft mohist server jar minecraft • maven libs mohist universal jar forge plugin list none description of issue the option of modsblacklist which is in the mohist yml may not work properly when the list parameter accept a list the element in the list can t be matched by single which i mean if i set the parameter like this forge modsblacklist enable true list xray autoattack kickmessage use of unauthorized mods or forge modsblacklist enable true list xray autoattact kickmessage use of unauthorized mods as well as this after reload forge modsblacklist enable true list kickmessage use of unauthorized mods when a client with only xray but no autoattack loaded it can pass the blacklist check and join the server but what i expect is that the client with any mod listed by config should be blocked and if i set the parameter as below forge modsblacklist enable true list xray kickmessage use of unauthorized mods it seems work properly that server could block any client with xray mod loaded so the problem is how can i set the blacklist compare method to contains any in list but not equal all
0
6,360
2,839,597,562
IssuesEvent
2015-05-27 14:30:20
ScienceCommons/api
https://api.github.com/repos/ScienceCommons/api
closed
For now, do not display authors in search results
sloboda test
This will be cool down the road, but right now, because we still don't have a working author merge tool, there will most often be several different author names showing up each referring to the same author. This is very confusing for the user. For example, https://www.curatescience.org/beta/#/query/pashler%20elderly yields 3 different authors for Hal R. Pashler (Hal Pashler, Harold Pashler, and Hal R. Pashler)
1.0
For now, do not display authors in search results - This will be cool down the road, but right now, because we still don't have a working author merge tool, there will most often be several different author names showing up each referring to the same author. This is very confusing for the user. For example, https://www.curatescience.org/beta/#/query/pashler%20elderly yields 3 different authors for Hal R. Pashler (Hal Pashler, Harold Pashler, and Hal R. Pashler)
non_usab
for now do not display authors in search results this will be cool down the road but right now because we still don t have a working author merge tool there will most often be several different author names showing up each referring to the same author this is very confusing for the user for example yields different authors for hal r pashler hal pashler harold pashler and hal r pashler
0
14,948
9,605,242,110
IssuesEvent
2019-05-10 22:58:45
geneontology/amigo
https://api.github.com/repos/geneontology/amigo
closed
Update old "help" URLs to something functional
bug (B: affects usability)
Currently, all help URLs are set to something defunct (http://geneontology.org/form/contact-go); at least aim them somewhere that is not a deadend while other helpdesk issues are sorted out (http://help.geneontology.org).
True
Update old "help" URLs to something functional - Currently, all help URLs are set to something defunct (http://geneontology.org/form/contact-go); at least aim them somewhere that is not a deadend while other helpdesk issues are sorted out (http://help.geneontology.org).
usab
update old help urls to something functional currently all help urls are set to something defunct at least aim them somewhere that is not a deadend while other helpdesk issues are sorted out
1
260,613
19,678,513,581
IssuesEvent
2022-01-11 14:42:42
Gourmet-Dev/gourmet-apis
https://api.github.com/repos/Gourmet-Dev/gourmet-apis
opened
README ํŒŒ์ผ๊ณผ Issue Template, ๊ทธ๋ฆฌ๊ณ  PR Template ๋ˆ„๋ฝ
documentation
### Problem * No README file exists * When filing an Issue, the template has to be written in manually * When opening a Pull-Request, the template has to be written in manually --- ### Expected behavior * On entering the repository, there should be basic Getting Started content so the service can be brought up in a local environment * When filing an Issue, the editor should be pre-filled with the designated template * When opening a Pull-Request, the editor should be pre-filled with the designated template --- ### Reproduction conditions * No reproduction conditions
1.0
README ํŒŒ์ผ๊ณผ Issue Template, ๊ทธ๋ฆฌ๊ณ  PR Template ๋ˆ„๋ฝ - ### ๋ฌธ์ œ ์ƒํ™ฉ * README ํŒŒ์ผ์ด ์กด์žฌํ•˜์ง€ ์•Š์Œ * Issue ๋“ฑ๋ก์‹œ, ํ…œํ”Œ๋ฆฟ์„ ์ง์ ‘ ์ž‘์„ฑํ•ด ๋„ฃ์–ด์•ผ ํ•จ * Pull-Request ์š”์ฒญ์‹œ, ํ…œํ”Œ๋ฆฟ์„ ์ง์ ‘ ์ž‘์„ฑํ•ด ๋„ฃ์–ด์•ผ ํ•จ --- ### ๊ธฐ๋Œ€ ์ƒํ™ฉ * ๋ ˆํฌ์ง€ํ† ๋ฆฌ์— ๋“ค์–ด์˜ค๋ฉด, ๋กœ์ปฌ ํ™˜๊ฒฝ์—์„œ ์„œ๋น„์Šค๋ฅผ ๋„์šธ ์ˆ˜ ์žˆ๊ฒŒ ๊ธฐ๋ณธ์ ์ธ Getting Started ๋‚ด์šฉ์ด ์žˆ์–ด์•ผ ํ•จ * Issue ๋“ฑ๋ก์‹œ, ์—๋””ํ„ฐ์— ์ž๋™์œผ๋กœ ์ •ํ•ด์ง„ ํ…œํ”Œ๋ฆฟ์ด ์ž…๋ ฅ๋˜์–ด ์žˆ์–ด์•ผ ํ•จ * Pull-Request ์š”์ฒญ์‹œ, ์—๋””ํ„ฐ์— ์ž๋™์œผ๋กœ ์ •ํ•ด์ง„ ํ…œํ”Œ๋ฆฟ์ด ์ž…๋ ฅ๋˜์–ด ์žˆ์–ด์•ผ ํ•จ --- ### ์žฌํ˜„ ์กฐ๊ฑด * ์žฌํ˜„ ์กฐ๊ฑด ์—†์Œ
non_usab
readme ํŒŒ์ผ๊ณผ issue template ๊ทธ๋ฆฌ๊ณ  pr template ๋ˆ„๋ฝ ๋ฌธ์ œ ์ƒํ™ฉ readme ํŒŒ์ผ์ด ์กด์žฌํ•˜์ง€ ์•Š์Œ issue ๋“ฑ๋ก์‹œ ํ…œํ”Œ๋ฆฟ์„ ์ง์ ‘ ์ž‘์„ฑํ•ด ๋„ฃ์–ด์•ผ ํ•จ pull request ์š”์ฒญ์‹œ ํ…œํ”Œ๋ฆฟ์„ ์ง์ ‘ ์ž‘์„ฑํ•ด ๋„ฃ์–ด์•ผ ํ•จ ๊ธฐ๋Œ€ ์ƒํ™ฉ ๋ ˆํฌ์ง€ํ† ๋ฆฌ์— ๋“ค์–ด์˜ค๋ฉด ๋กœ์ปฌ ํ™˜๊ฒฝ์—์„œ ์„œ๋น„์Šค๋ฅผ ๋„์šธ ์ˆ˜ ์žˆ๊ฒŒ ๊ธฐ๋ณธ์ ์ธ getting started ๋‚ด์šฉ์ด ์žˆ์–ด์•ผ ํ•จ issue ๋“ฑ๋ก์‹œ ์—๋””ํ„ฐ์— ์ž๋™์œผ๋กœ ์ •ํ•ด์ง„ ํ…œํ”Œ๋ฆฟ์ด ์ž…๋ ฅ๋˜์–ด ์žˆ์–ด์•ผ ํ•จ pull request ์š”์ฒญ์‹œ ์—๋””ํ„ฐ์— ์ž๋™์œผ๋กœ ์ •ํ•ด์ง„ ํ…œํ”Œ๋ฆฟ์ด ์ž…๋ ฅ๋˜์–ด ์žˆ์–ด์•ผ ํ•จ ์žฌํ˜„ ์กฐ๊ฑด ์žฌํ˜„ ์กฐ๊ฑด ์—†์Œ
0
12,077
7,686,967,162
IssuesEvent
2018-05-17 02:30:41
MarkBind/markbind
https://api.github.com/repos/MarkBind/markbind
closed
Panels: go from minimized to expanded directly
a-ReaderUsability p.Low
Steps: 1. Minimize a panel using the `x` button 2. Click on the minimized panel Actual: panel goes to collapsed mode. Suggested: go direct to the expanded mode. Reason: It is likely the reader wants to see the content of panel. The proposed improvement saves reader a click.
True
Panels: go from minimized to expanded directly - Steps: 1. Minimize a panel using the `x` button 2. Click on the minimized panel Actual: panel goes to collapsed mode. Suggested: go direct to the expanded mode. Reason: It is likely the reader wants to see the content of panel. The proposed improvement saves reader a click.
usab
panels go from minimized to expanded directly steps minimize a panel using the x button click on the minimized panel actual panel goes to collapsed mode suggested go direct to the expanded mode reason it is likely the reader wants to see the content of panel the proposed improvement saves reader a click
1
15,722
10,263,835,743
IssuesEvent
2019-08-22 15:06:36
coala/coala-bears
https://api.github.com/repos/coala/coala-bears
closed
IndentationBear - Ignore doc comments/doc strings
area/genericbears area/usability difficulty/medium importance/medium
Doc comments don't usually follow standards and we are trying to ignore doc comments within the indentationBear, till then it is best if we tell people to ignore the IndentationBear over doc comments in the inline doc for the IndentationBear
True
IndentationBear - Ignore doc comments/doc strings - Doc comments don't usually follow standards and we are trying to ignore doc comments within the indentationBear, till then it is best if we tell people to ignore the IndentationBear over doc comments in the inline doc for the IndentationBear
usab
indentationbear ignore doc comments doc strings doc comments don t usually follow standards and we are trying to ignore doc comments within the indentationbear till then it is best if we tell people to ignore the indentationbear over doc comments in the inline doc for the indentationbear
1
23,102
21,007,312,014
IssuesEvent
2022-03-30 00:41:47
rabbitmq/rabbitmq-server
https://api.github.com/repos/rabbitmq/rabbitmq-server
closed
Consider making rabbitmqadmin available in PATH for Debian and RPM packages
usability pkg-rpm pkg-deb
I get that you may want to pre-configure it for the right URL, but it would be great if you could ship some kind of version of rabbitmqadmin suitable for config management packages to install or modify (even if it's a template of some sort, or only has default values, e.g., localhost / 15672 in default options). For example, in Puppet, it has to do a curl of the admin interface using a username to connect in order to get the file, which seems like a really kludgy way to do it: https://github.com/puppetlabs/puppetlabs-rabbitmq/blob/master/manifests/install/rabbitmqadmin.pp https://tickets.puppetlabs.com/browse/MODULES-3098
True
Consider making rabbitmqadmin available in PATH for Debian and RPM packages - I get that you may want to pre-configure it for the right URL, but it would be great if you could ship some kind of version of rabbitmqadmin suitable for config management packages to install or modify (even if it's a template of some sort, or only has default values, e.g., localhost / 15672 in default options). For example, in Puppet, it has to do a curl of the admin interface using a username to connect in order to get the file, which seems like a really kludgy way to do it: https://github.com/puppetlabs/puppetlabs-rabbitmq/blob/master/manifests/install/rabbitmqadmin.pp https://tickets.puppetlabs.com/browse/MODULES-3098
usab
consider making rabbitmqadmin available in path for debian and rpm packages i get that you may want to pre configure it for the right url but it would be great if you could ship some kind of version of rabbitmqadmin suitable for config management packages to install or modify even if it s a template of some sort or only has default values e g localhost in default options for example in puppet it has to do a curl of the admin interface using a username to connect in order to get the file which seems like a really kludgy way to do it
1
15,198
9,851,898,854
IssuesEvent
2019-06-19 11:35:09
virtualsatellite/VirtualSatellite4-Core
https://api.github.com/repos/virtualsatellite/VirtualSatellite4-Core
opened
Highlight changed elements after comparison / update
comfort/usability
There was a question in a project meeting whether it is possible to somehow show which components have changed when taking an update from SVN. We see different use cases such as: - Comparing to a baseline revision that is fixed. e.g. the version of today in the morning - Comparing to see al the changes that just came in with the last update. this could eb apreference setting - Comarping to see all my uncommitted changes The idea is different to VirSat3 which means that we dont want to check out an old revision but want to bend the URI handlers to directly load the information from the SVN. It will also need some good UI to spot the differences . Idea: Comparison Editor Investigate possibility to implement a comparison editor that can comapre the model with a given revision from the repository. Functionality could be similar to team->comapre with. maybe the functionality needs to create asummary of changes in the whole model. Investigate the use of EMF compare.
True
Highlight changed elements after comparison / update - There was a question in a project meeting whether it is possible to somehow show which components have changed when taking an update from SVN. We see different use cases such as: - Comparing to a baseline revision that is fixed. e.g. the version of today in the morning - Comparing to see al the changes that just came in with the last update. this could eb apreference setting - Comarping to see all my uncommitted changes The idea is different to VirSat3 which means that we dont want to check out an old revision but want to bend the URI handlers to directly load the information from the SVN. It will also need some good UI to spot the differences . Idea: Comparison Editor Investigate possibility to implement a comparison editor that can comapre the model with a given revision from the repository. Functionality could be similar to team->comapre with. maybe the functionality needs to create asummary of changes in the whole model. Investigate the use of EMF compare.
usab
highlight changed elements after comparison update there was a question in a project meeting whether it is possible to somehow show which components have changed when taking an update from svn we see different use cases such as comparing to a baseline revision that is fixed e g the version of today in the morning comparing to see al the changes that just came in with the last update this could eb apreference setting comarping to see all my uncommitted changes the idea is different to which means that we dont want to check out an old revision but want to bend the uri handlers to directly load the information from the svn it will also need some good ui to spot the differences idea comparison editor investigate possibility to implement a comparison editor that can comapre the model with a given revision from the repository functionality could be similar to team comapre with maybe the functionality needs to create asummary of changes in the whole model investigate the use of emf compare
1
9,497
6,334,040,485
IssuesEvent
2017-07-26 15:51:14
palantir/atlasdb
https://api.github.com/repos/palantir/atlasdb
opened
Unhelpful IllegalArgumentException on timelock revert
component: timelock component: usability
``` Exception in thread "main" java.lang.IllegalArgumentException: array too small: 1 < 8 ``` This comes up when you start up a node which was previously using timelock and then went back to embedded. It was implemented in this way, so that clients that don't even know that timelock doesn't exist can't accidentally violate the timestamp guarantee. See #1596 for context. Probably unhelpful, and also pretty scary if one encounters this in the field without context. We should catch this when creating the timestamp bound store, and rethrow something with a bit more explanation. Internal reference: PDS-55189 (Also note, though, that this is kind of a win, in that our protection mechanism just prevented an instance of data corruption!)
True
Unhelpful IllegalArgumentException on timelock revert - ``` Exception in thread "main" java.lang.IllegalArgumentException: array too small: 1 < 8 ``` This comes up when you start up a node which was previously using timelock and then went back to embedded. It was implemented in this way, so that clients that don't even know that timelock doesn't exist can't accidentally violate the timestamp guarantee. See #1596 for context. Probably unhelpful, and also pretty scary if one encounters this in the field without context. We should catch this when creating the timestamp bound store, and rethrow something with a bit more explanation. Internal reference: PDS-55189 (Also note, though, that this is kind of a win, in that our protection mechanism just prevented an instance of data corruption!)
usab
unhelpful illegalargumentexception on timelock revert exception in thread main java lang illegalargumentexception array too small this comes up when you start up a node which was previously using timelock and then went back to embedded it was implemented in this way so that clients that don t even know that timelock doesn t exist can t accidentally violate the timestamp guarantee see for context probably unhelpful and also pretty scary if one encounters this in the field without context we should catch this when creating the timestamp bound store and rethrow something with a bit more explanation internal reference pds also note though that this is kind of a win in that our protection mechanism just prevented an instance of data corruption
1
20,545
15,681,861,792
IssuesEvent
2021-03-25 06:15:58
microsoft/win32metadata
https://api.github.com/repos/microsoft/win32metadata
closed
CreateIcon missing NativeArray attribute on byte* array parameters
usability
currently the metadata has: ```cs [DllImport("USER32", ExactSpelling = true, SetLastError = true)] public unsafe static extern HICON CreateIcon([Optional][In] HINSTANCE hInstance, [In] int nWidth, [In] int nHeight, [In] byte cPlanes, [In] byte cBitsPixel, [In][Const] byte* lpbANDbits, [In][Const] byte* lpbXORbits); ``` But the last two parameters should be arrays.
True
CreateIcon missing NativeArray attribute on byte* array parameters - currently the metadata has: ```cs [DllImport("USER32", ExactSpelling = true, SetLastError = true)] public unsafe static extern HICON CreateIcon([Optional][In] HINSTANCE hInstance, [In] int nWidth, [In] int nHeight, [In] byte cPlanes, [In] byte cBitsPixel, [In][Const] byte* lpbANDbits, [In][Const] byte* lpbXORbits); ``` But the last two parameters should be arrays.
usab
createicon missing nativearray attribute on byte array parameters currently the metadata has cs public unsafe static extern hicon createicon hinstance hinstance int nwidth int nheight byte cplanes byte cbitspixel byte lpbandbits byte lpbxorbits but the last two parameters should be arrays
1
8,109
11,300,957,290
IssuesEvent
2020-01-17 14:40:55
geneontology/go-ontology
https://api.github.com/repos/geneontology/go-ontology
closed
too many appressorium terms
multi-species process
GO:0075039 establishment of turgor in appressorium 1 annotations GO:0075021 cAMP-mediated activation of appressorium formation GO:0075016 appressorium formation on or near host 1 annotations GO:0075003 adhesion of symbiont appressorium to host GO:0075022 ethylene-mediated activation of appressorium formation GO:0075023 MAPK-mediated regulation of appressorium formation GO:0075040 regulation of establishment of turgor in appressorium GO:0075043 maintenance of turgor in appressorium by melanization GO:0075035 maturation of appressorium on or near host 1 annotations GO:0075024 phospholipase C-mediated activation of appressorium formation GO:0075025 initiation of appressorium on or near host GO:0075020 calcium or calmodulin-mediated activation of appressorium formation GO:0075017 regulation of appressorium formation on or near host GO:0075041 positive regulation of establishment of turgor in appressorium GO:0075042 negative regulation of establishment of turgor in appressorium I am annotating a MAP kinease which regulates appressorium formation however i don't want to use GO:0075023 MAPK-mediated regulation of appressorium formation I want to do just regulation of appressorium formation I think the other terms GO:0075024 phospholipase C-mediated activation of appressorium formation GO:0075020 calcium or calmodulin-mediated activation of appressorium formation should also go because they are just other MFs in the signalling pathway. also merge GO:0075025 initiation of appressorium on or near host into positive regulation
1.0
too many appressorium terms - GO:0075039 establishment of turgor in appressorium 1 annotations GO:0075021 cAMP-mediated activation of appressorium formation GO:0075016 appressorium formation on or near host 1 annotations GO:0075003 adhesion of symbiont appressorium to host GO:0075022 ethylene-mediated activation of appressorium formation GO:0075023 MAPK-mediated regulation of appressorium formation GO:0075040 regulation of establishment of turgor in appressorium GO:0075043 maintenance of turgor in appressorium by melanization GO:0075035 maturation of appressorium on or near host 1 annotations GO:0075024 phospholipase C-mediated activation of appressorium formation GO:0075025 initiation of appressorium on or near host GO:0075020 calcium or calmodulin-mediated activation of appressorium formation GO:0075017 regulation of appressorium formation on or near host GO:0075041 positive regulation of establishment of turgor in appressorium GO:0075042 negative regulation of establishment of turgor in appressorium I am annotating a MAP kinease which regulates appressorium formation however i don't want to use GO:0075023 MAPK-mediated regulation of appressorium formation I want to do just regulation of appressorium formation I think the other terms GO:0075024 phospholipase C-mediated activation of appressorium formation GO:0075020 calcium or calmodulin-mediated activation of appressorium formation should also go because they are just other MFs in the signalling pathway. also merge GO:0075025 initiation of appressorium on or near host into positive regulation
non_usab
too many appressorium terms go establishment of turgor in appressorium annotations go camp mediated activation of appressorium formation go appressorium formation on or near host annotations go adhesion of symbiont appressorium to host go ethylene mediated activation of appressorium formation go mapk mediated regulation of appressorium formation go regulation of establishment of turgor in appressorium go maintenance of turgor in appressorium by melanization go maturation of appressorium on or near host annotations go phospholipase c mediated activation of appressorium formation go initiation of appressorium on or near host go calcium or calmodulin mediated activation of appressorium formation go regulation of appressorium formation on or near host go positive regulation of establishment of turgor in appressorium go negative regulation of establishment of turgor in appressorium i am annotating a map kinease which regulates appressorium formation however i don t want to use go mapk mediated regulation of appressorium formation i want to do just regulation of appressorium formation i think the other terms go phospholipase c mediated activation of appressorium formation go calcium or calmodulin mediated activation of appressorium formation should also go because they are just other mfs in the signalling pathway also merge go initiation of appressorium on or near host into positive regulation
0
13,663
8,634,730,670
IssuesEvent
2018-11-22 18:06:00
st-tu-dresden/inloop
https://api.github.com/repos/st-tu-dresden/inloop
closed
Display console output and exception messages in a user-friendlier way
enhancement usability
The most annoying problem is that long exception messages do not wrap around properly, which makes it difficult to see important message details, e.g., the expected vs. actual value. ![ohne titel](https://user-images.githubusercontent.com/1561982/43329240-256e7c72-91c0-11e8-84e0-ef7cb3555d3f.png)
True
Display console output and exception messages in a user-friendlier way - The most annoying problem is that long exception messages do not wrap around properly, which makes it difficult to see important message details, e.g., the expected vs. actual value. ![ohne titel](https://user-images.githubusercontent.com/1561982/43329240-256e7c72-91c0-11e8-84e0-ef7cb3555d3f.png)
usab
display console output and exception messages in a user friendlier way the most annoying problem is that long exception messages do not wrap around properly which makes it difficult to see important message details e g the expected vs actual value
1
159,522
12,478,578,452
IssuesEvent
2020-05-29 16:42:55
spel-uchile/SUCHAI-Flight-Software
https://api.github.com/repos/spel-uchile/SUCHAI-Flight-Software
closed
Sequence of 5 commands throws exit code -11
Fuzz-Testing bug
The sequence is: tm_parse_status 168850784532074374035732101058055169116 12826105913956939644 E0@2Ni*@ -162817515606114395225612514469063087299 -48916261179497125227303848440736194118 -6386859017823808887 r@F5uu -8152784359493193565 gssb_set_burn_config -285600039855062482800991264458719794343 -3143019222374379877 10778628039574652351 -219709885077994447318495468478095151127 FZG -322088856487707042825536300309208787826 w drp_set_deployed fp_del_cmd_unix ^TyCU& -243018170100445935 "4n3Kr/hr 80211873877148678249755549121640936724 1175049179 com_get_config h -7060649124262353519 2063750755
1.0
Sequence of 5 commands throws exit code -11 - The sequence is: tm_parse_status 168850784532074374035732101058055169116 12826105913956939644 E0@2Ni*@ -162817515606114395225612514469063087299 -48916261179497125227303848440736194118 -6386859017823808887 r@F5uu -8152784359493193565 gssb_set_burn_config -285600039855062482800991264458719794343 -3143019222374379877 10778628039574652351 -219709885077994447318495468478095151127 FZG -322088856487707042825536300309208787826 w drp_set_deployed fp_del_cmd_unix ^TyCU& -243018170100445935 "4n3Kr/hr 80211873877148678249755549121640936724 1175049179 com_get_config h -7060649124262353519 2063750755
non_usab
sequence of commands throws exit code the sequence is tm parse status r gssb set burn config fzg w drp set deployed fp del cmd unix tycu hr com get config h
0
103,755
4,184,965,399
IssuesEvent
2016-06-23 09:17:05
Zenika/Zenika-Resume
https://api.github.com/repos/Zenika/Zenika-Resume
closed
Avoir une variable pour dรฉfinir la pรฉriode sur une expรฉrience
Priority 1
Dans une expรฉrience il devrait รชtre possible de dรฉfinir la pรฉriode : Janvier 2015 - Mars 2016 Juste une string pour laisser le plus de liberter
1.0
Avoir une variable pour dรฉfinir la pรฉriode sur une expรฉrience - Dans une expรฉrience il devrait รชtre possible de dรฉfinir la pรฉriode : Janvier 2015 - Mars 2016 Juste une string pour laisser le plus de liberter
non_usab
avoir une variable pour dรฉfinir la pรฉriode sur une expรฉrience dans une expรฉrience il devrait รชtre possible de dรฉfinir la pรฉriode janvier mars juste une string pour laisser le plus de liberter
0
177,125
21,464,567,367
IssuesEvent
2022-04-26 01:23:15
raindigi/site-landing
https://api.github.com/repos/raindigi/site-landing
closed
CVE-2021-33623 (High) detected in trim-newlines-1.0.0.tgz - autoclosed
security vulnerability
## CVE-2021-33623 - High Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>trim-newlines-1.0.0.tgz</b></p></summary> <p>Trim newlines from the start and/or end of a string</p> <p>Library home page: <a href="https://registry.npmjs.org/trim-newlines/-/trim-newlines-1.0.0.tgz">https://registry.npmjs.org/trim-newlines/-/trim-newlines-1.0.0.tgz</a></p> <p>Path to dependency file: /package.json</p> <p>Path to vulnerable library: /node_modules/trim-newlines/package.json</p> <p> Dependency Hierarchy: - gatsby-plugin-sharp-2.0.32.tgz (Root Library) - imagemin-mozjpeg-8.0.0.tgz - mozjpeg-6.0.1.tgz - logalot-2.1.0.tgz - squeak-1.3.0.tgz - lpad-align-1.1.2.tgz - meow-3.7.0.tgz - :x: **trim-newlines-1.0.0.tgz** (Vulnerable Library) <p>Found in HEAD commit: <a href="https://github.com/raindigi/site-landing/commit/bcba8b01c6ab60dc16fa75543eba31b8be7a461e">bcba8b01c6ab60dc16fa75543eba31b8be7a461e</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary> <p> The trim-newlines package before 3.0.1 and 4.x before 4.0.1 for Node.js has an issue related to regular expression denial-of-service (ReDoS) for the .end() method. 
<p>Publish Date: 2021-05-28 <p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2021-33623>CVE-2021-33623</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>7.5</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: Low - Privileges Required: None - User Interaction: None - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: None - Integrity Impact: None - Availability Impact: High </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2021-33623">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2021-33623</a></p> <p>Release Date: 2021-05-28</p> <p>Fix Resolution: trim-newlines - 3.0.1, 4.0.1</p> </p> </details> <p></p> *** Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
True
CVE-2021-33623 (High) detected in trim-newlines-1.0.0.tgz - autoclosed - ## CVE-2021-33623 - High Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>trim-newlines-1.0.0.tgz</b></p></summary> <p>Trim newlines from the start and/or end of a string</p> <p>Library home page: <a href="https://registry.npmjs.org/trim-newlines/-/trim-newlines-1.0.0.tgz">https://registry.npmjs.org/trim-newlines/-/trim-newlines-1.0.0.tgz</a></p> <p>Path to dependency file: /package.json</p> <p>Path to vulnerable library: /node_modules/trim-newlines/package.json</p> <p> Dependency Hierarchy: - gatsby-plugin-sharp-2.0.32.tgz (Root Library) - imagemin-mozjpeg-8.0.0.tgz - mozjpeg-6.0.1.tgz - logalot-2.1.0.tgz - squeak-1.3.0.tgz - lpad-align-1.1.2.tgz - meow-3.7.0.tgz - :x: **trim-newlines-1.0.0.tgz** (Vulnerable Library) <p>Found in HEAD commit: <a href="https://github.com/raindigi/site-landing/commit/bcba8b01c6ab60dc16fa75543eba31b8be7a461e">bcba8b01c6ab60dc16fa75543eba31b8be7a461e</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary> <p> The trim-newlines package before 3.0.1 and 4.x before 4.0.1 for Node.js has an issue related to regular expression denial-of-service (ReDoS) for the .end() method. 
<p>Publish Date: 2021-05-28 <p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2021-33623>CVE-2021-33623</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>7.5</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: Low - Privileges Required: None - User Interaction: None - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: None - Integrity Impact: None - Availability Impact: High </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2021-33623">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2021-33623</a></p> <p>Release Date: 2021-05-28</p> <p>Fix Resolution: trim-newlines - 3.0.1, 4.0.1</p> </p> </details> <p></p> *** Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
non_usab
cve high detected in trim newlines tgz autoclosed cve high severity vulnerability vulnerable library trim newlines tgz trim newlines from the start and or end of a string library home page a href path to dependency file package json path to vulnerable library node modules trim newlines package json dependency hierarchy gatsby plugin sharp tgz root library imagemin mozjpeg tgz mozjpeg tgz logalot tgz squeak tgz lpad align tgz meow tgz x trim newlines tgz vulnerable library found in head commit a href vulnerability details the trim newlines package before and x before for node js has an issue related to regular expression denial of service redos for the end method publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction none scope unchanged impact metrics confidentiality impact none integrity impact none availability impact high for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution trim newlines step up your open source security game with whitesource
0
11,495
7,266,845,497
IssuesEvent
2018-02-20 00:40:38
phan/phan
https://api.github.com/repos/phan/phan
closed
Finish support for Windows in daemon mode
daemon enhancement usability
2 things look promising: 1. docker (may need to add config flag to change TCP listen interface to a custom value such as 0.0.0.0) 2. (Windows 10): "Ubuntu on Windows" https://www.hanselman.com/blog/DevelopersCanRunBashShellAndUsermodeUbuntuLinuxBinariesOnWindows10.aspx (Best performance) I'm having problems getting the VM set up for https://developer.microsoft.com/en-us/windows/downloads/virtual-machines on virtualbox (May be possibly by installing php7.1 and php7.1-dev with pthread support (pcntl module), building php-ast, and running phan.)
True
Finish support for Windows in daemon mode - 2 things look promising: 1. docker (may need to add config flag to change TCP listen interface to a custom value such as 0.0.0.0) 2. (Windows 10): "Ubuntu on Windows" https://www.hanselman.com/blog/DevelopersCanRunBashShellAndUsermodeUbuntuLinuxBinariesOnWindows10.aspx (Best performance) I'm having problems getting the VM set up for https://developer.microsoft.com/en-us/windows/downloads/virtual-machines on virtualbox (May be possibly by installing php7.1 and php7.1-dev with pthread support (pcntl module), building php-ast, and running phan.)
usab
finish support for windows in daemon mode things look promising docker may need to add config flag to change tcp listen interface to a custom value such as windows ubuntu on windows best performance i m having problems getting the vm set up for on virtualbox may be possibly by installing and dev with pthread support pcntl module building php ast and running phan
1
27,964
30,794,854,986
IssuesEvent
2023-07-31 19:01:05
aws-amplify/amplify-hosting
https://api.github.com/repos/aws-amplify/amplify-hosting
closed
UNCOMPRESSED_CODE_SIZE_EXCEEDED (frequent issue)
usability response-requested closed-for-staleness compute
### Before opening, please confirm: - [X] I have checked to see if my question is addressed in the [FAQ](https://github.com/aws-amplify/amplify-hosting/blob/master/FAQ.md). - [X] I have [searched for duplicate or closed issues](https://github.com/aws-amplify/amplify-hosting/issues?q=is%3Aissue+). - [X] I have read the guide for [submitting bug reports](https://github.com/aws-amplify/amplify-hosting/blob/master/CONTRIBUTING.md). - [X] I have done my best to include a minimal, self-contained set of instructions for consistently reproducing the issue. - [X] I have removed any sensitive information from my code snippets and submission. ### App Id d1nra4pyobb96k ### AWS Region us-east-1 ### Amplify Hosting feature Frontend builds ### Frontend framework Next.js ### Next.js version 13.4.7 ### Next.js router Pages Router ### Describe the bug Frequently builds fail to deploy because they exceed the 120 MB limit. Getting under this limit with few dependencies is virtually impossible. Is it possible to somehow increase this arbitrary storage limit? We've also tried a few of the suggestions to compress the file size, to no avail. ### Expected behavior Builds should deploy ### Reproduction steps 1. Deploy an application that goes slightly over the 120mb limit 2. The build fails to deploy ### Build Settings ```yaml version: 1 frontend: phases: preBuild: commands: - yarn install --frozen-lockfile # - yarn install build: commands: - npm run build # - rm -rf .next/standalone/node_modules/@buidlerlabs/hashgraph-venin-js/node_modules/solc - du -h .next/standalone #- allfiles=$(ls -al ./.next/standalone/**/*.js) #- npx esbuild $allfiles --minify --outdir=.next/standalone --platform=node --target=node16 --format=cjs --allow-overwrite artifacts: baseDirectory: .next files: - '**/*' cache: paths: - node_modules/**/* ``` ### Log output <details> ``` # Put your logs below this line ``` </details> ### Additional information _No response_
True
UNCOMPRESSED_CODE_SIZE_EXCEEDED (frequent issue) - ### Before opening, please confirm: - [X] I have checked to see if my question is addressed in the [FAQ](https://github.com/aws-amplify/amplify-hosting/blob/master/FAQ.md). - [X] I have [searched for duplicate or closed issues](https://github.com/aws-amplify/amplify-hosting/issues?q=is%3Aissue+). - [X] I have read the guide for [submitting bug reports](https://github.com/aws-amplify/amplify-hosting/blob/master/CONTRIBUTING.md). - [X] I have done my best to include a minimal, self-contained set of instructions for consistently reproducing the issue. - [X] I have removed any sensitive information from my code snippets and submission. ### App Id d1nra4pyobb96k ### AWS Region us-east-1 ### Amplify Hosting feature Frontend builds ### Frontend framework Next.js ### Next.js version 13.4.7 ### Next.js router Pages Router ### Describe the bug Frequently builds fail to deploy because they exceed the 120 MB limit. Getting under this limit with few dependencies is virtually impossible. Is it possible to somehow increase this arbitrary storage limit? We've also tried a few of the suggestions to compress the file size, to no avail. ### Expected behavior Builds should deploy ### Reproduction steps 1. Deploy an application that goes slightly over the 120mb limit 2. The build fails to deploy ### Build Settings ```yaml version: 1 frontend: phases: preBuild: commands: - yarn install --frozen-lockfile # - yarn install build: commands: - npm run build # - rm -rf .next/standalone/node_modules/@buidlerlabs/hashgraph-venin-js/node_modules/solc - du -h .next/standalone #- allfiles=$(ls -al ./.next/standalone/**/*.js) #- npx esbuild $allfiles --minify --outdir=.next/standalone --platform=node --target=node16 --format=cjs --allow-overwrite artifacts: baseDirectory: .next files: - '**/*' cache: paths: - node_modules/**/* ``` ### Log output <details> ``` # Put your logs below this line ``` </details> ### Additional information _No response_
usab
uncompressed code size exceeded frequent issue before opening please confirm i have checked to see if my question is addressed in the i have i have read the guide for i have done my best to include a minimal self contained set of instructions for consistently reproducing the issue i have removed any sensitive information from my code snippets and submission app id aws region us east amplify hosting feature frontend builds frontend framework next js next js version next js router pages router describe the bug frequently builds fail to deploy because they exceed the mb limit getting under this limit with few dependencies is virtually impossible is it possible to somehow increase this arbitrary storage limit we ve also tried a few of the suggestions to compress the file size to no avail expected behavior builds should deploy reproduction steps deploy an application that goes slightly over the limit the build fails to deploy build settings yaml version frontend phases prebuild commands yarn install frozen lockfile yarn install build commands npm run build rm rf next standalone node modules buidlerlabs hashgraph venin js node modules solc du h next standalone allfiles ls al next standalone js npx esbuild allfiles minify outdir next standalone platform node target format cjs allow overwrite artifacts basedirectory next files cache paths node modules log output put your logs below this line additional information no response
1
14,509
10,904,917,019
IssuesEvent
2019-11-20 09:43:22
celo-org/celo-blockchain
https://api.github.com/repos/celo-org/celo-blockchain
opened
Alfajores node never syncs
bug celo-blockchain infrastructure
### Expected Behavior Running an alfajores full node and should be able to sync the blockchain ### Current Behavior Alfajores nodes do not sync Integration nodes sync fine ### Steps to reproduce I am running celo nodes with the scripts from this PR: https://github.com/celo-org/celo-blockchain/pull/613/files like `$ ./run-node-docker.sh alajores 44782 us.gcr.io/celo-testnet/celo-node:alfajores 8547 8548 30304` It is always ending up with a log of lots of handshakes but never every retrieving any chain data. It happens with the docker image from google cloud and custom built docker image from `celo-blockchain` master branch. It looks like that it is not able to reach the boot nodes. Full log here: ``` sebastian:celo-blockchain/ (feature/node_runnerโœ—) $ ./run-node-docker.sh alajores 44782 us.gcr.io/celo-testnet/celo-node:alfajores 8547 8548 30304 Using Celo address: 9bbc4fe69f24c70527bd083e78081ba0e2bf3984 INFO [11-20|09:33:47.480] Maximum peer count ETH=25 LES=99 total=124 INFO [11-20|09:33:47.487] Allocated cache and file handles database=/root/.celo/geth/chaindata cache=16 handles=16 INFO [11-20|09:33:47.564] Writing custom genesis block INFO [11-20|09:33:47.571] Persisted trie from memory database nodes=103 size=17.78kB time=2.1954ms gcnodes=0 gcsize=0.00B gctime=0s livenodes=1 livesize=0.00B INFO [11-20|09:33:47.581] HASH2 hash=92b2a5โ€ฆ86b4d6 INFO [11-20|09:33:47.581] Successfully wrote genesis state database=chaindata hash=92b2a5โ€ฆ86b4d6 INFO [11-20|09:33:47.581] Allocated cache and file handles database=/root/.celo/geth/lightchaindata cache=16 handles=16 INFO [11-20|09:33:47.604] Writing custom genesis block INFO [11-20|09:33:47.610] Persisted trie from memory database nodes=103 size=17.78kB time=1.2841ms gcnodes=0 gcsize=0.00B gctime=0s livenodes=1 livesize=0.00B INFO [11-20|09:33:47.615] HASH2 hash=92b2a5โ€ฆ86b4d6 INFO [11-20|09:33:47.615] Successfully wrote genesis state database=lightchaindata hash=92b2a5โ€ฆ86b4d6 INFO [11-20|09:33:47.615] Allocated 
cache and file handles database=/root/.celo/geth/ultralightchaindata cache=16 handles=16 INFO [11-20|09:33:47.642] Writing custom genesis block INFO [11-20|09:33:47.646] Persisted trie from memory database nodes=103 size=17.78kB time=1.3452ms gcnodes=0 gcsize=0.00B gctime=0s livenodes=1 livesize=0.00B INFO [11-20|09:33:47.653] HASH2 hash=92b2a5โ€ฆ86b4d6 INFO [11-20|09:33:47.653] Successfully wrote genesis state database=ultralightchaindata hash=92b2a5โ€ฆ86b4d6 INFO [11-20|09:33:50.471] Maximum peer count ETH=100 LES=1000 total=1100 WARN [11-20|09:33:50.491] Found deprecated node list file /root/.celo/static-nodes.json, please use the TOML config file instead. INFO [11-20|09:33:50.501] Starting peer-to-peer node instance=Geth/v1.8.23-stable/linux-amd64/go1.11.13 INFO [11-20|09:33:50.502] Allocated cache and file handles database=/root/.celo/geth/chaindata cache=768 handles=524288 INFO [11-20|09:33:50.714] Initialised chain configuration config="{ChainID: 44785 Homestead: 0 DAO: <nil> DAOSupport: false EIP150: 0 EIP155: 0 EIP158: 0 Byzantium: 0 Constantinople: 0 ConstantinopleFix: 0 Engine: istanbul}" INFO [11-20|09:33:50.715] Creating dir dir=/root/.celo/geth/istanbul INFO [11-20|09:33:50.716] Initialising Ethereum protocol versions=[64] network=44782 INFO [11-20|09:33:50.720] Loaded most recent local header number=0 hash=92b2a5โ€ฆ86b4d6 td=1 age=1y2mo4w INFO [11-20|09:33:50.720] Loaded most recent local full block number=0 hash=92b2a5โ€ฆ86b4d6 td=1 age=1y2mo4w INFO [11-20|09:33:50.720] Loaded most recent local fast block number=0 hash=92b2a5โ€ฆ86b4d6 td=1 age=1y2mo4w INFO [11-20|09:33:50.724] Regenerated local transaction journal transactions=0 accounts=0 INFO [11-20|09:33:50.766] UDP listener up net=enode://a9b0789b986d70e51c9aec376d28847fed05ee909e3d5ccf5cef57d57549bba938ab85c243ba83f6b9db331adee27f849aff27eaa40ae8b4ef98584da299a7b1@[::]:30303 INFO [11-20|09:33:50.772] New local node record seq=1 id=f82efd32270286a9 ip=127.0.0.1 udp=30303 tcp=30303 INFO 
[11-20|09:33:50.772] Started P2P networking self=enode://a9b0789b986d70e51c9aec376d28847fed05ee909e3d5ccf5cef57d57549bba938ab85c243ba83f6b9db331adee27f849aff27eaa40ae8b4ef98584da299a7b1@127.0.0.1:30303 INFO [11-20|09:33:50.777] Starting topic registration topic=LES2@92b2a5bbc9b1b519 INFO [11-20|09:33:50.780] IPC endpoint opened url=/root/.celo/geth.ipc INFO [11-20|09:33:50.781] HTTP endpoint opened url=http://0.0.0.0:8545 cors= vhosts=localhost INFO [11-20|09:33:51.098] Ethereum handshake HASH id=9da22b8e5c6122d2 conn=staticdial hash=92b2a5โ€ฆ86b4d6 genesis="&{header:0xc0000fefc0 uncles:[] transactions:[] randomness:0xc0000ce700 hash:{v:[146 178 165 187 201 177 181 25 121 173 213 235 110 121 112 126 10 177 62 45 65 179 190 108 230 148 125 87 157 134 180 214]} size:{v:<nil>} td:<nil> ReceivedAt:0001-01-01 00:00:00 +0000 UTC ReceivedFrom:<nil>}" INFO [11-20|09:33:51.106] Ethereum handshake HASH id=947523bf1d9b8976 conn=staticdial hash=92b2a5โ€ฆ86b4d6 genesis="&{header:0xc0000fefc0 uncles:[] transactions:[] randomness:0xc0000ce700 hash:{v:[146 178 165 187 201 177 181 25 121 173 213 235 110 121 112 126 10 177 62 45 65 179 190 108 230 148 125 87 157 134 180 214]} size:{v:<nil>} td:<nil> ReceivedAt:0001-01-01 00:00:00 +0000 UTC ReceivedFrom:<nil>}" INFO [11-20|09:33:51.111] Ethereum handshake HASH id=df431a8d3e701951 conn=staticdial hash=92b2a5โ€ฆ86b4d6 genesis="&{header:0xc0000fefc0 uncles:[] transactions:[] randomness:0xc0000ce700 hash:{v:[146 178 165 187 201 177 181 25 121 173 213 235 110 121 112 126 10 177 62 45 65 179 190 108 230 148 125 87 157 134 180 214]} size:{v:<nil>} td:<nil> ReceivedAt:0001-01-01 00:00:00 +0000 UTC ReceivedFrom:<nil>}" INFO [11-20|09:33:51.115] Ethereum handshake HASH id=069c4c46b5929ece conn=staticdial hash=92b2a5โ€ฆ86b4d6 genesis="&{header:0xc0000fefc0 uncles:[] transactions:[] randomness:0xc0000ce700 hash:{v:[146 178 165 187 201 177 181 25 121 173 213 235 110 121 112 126 10 177 62 45 65 179 190 108 230 148 125 87 157 134 180 214]} 
size:{v:<nil>} td:<nil> ReceivedAt:0001-01-01 00:00:00 +0000 UTC ReceivedFrom:<nil>}" INFO [11-20|09:33:51.116] Ethereum handshake HASH id=e5264af1f57e3026 conn=staticdial hash=92b2a5โ€ฆ86b4d6 genesis="&{header:0xc0000fefc0 uncles:[] transactions:[] randomness:0xc0000ce700 hash:{v:[146 178 165 187 201 177 181 25 121 173 213 235 110 121 112 126 10 177 62 45 65 179 190 108 230 148 125 87 157 134 180 214]} size:{v:<nil>} td:<nil> ReceivedAt:0001-01-01 00:00:00 +0000 UTC ReceivedFrom:<nil>}" INFO [11-20|09:33:51.119] Ethereum handshake HASH id=2ef4d3aec48f17d3 conn=staticdial hash=92b2a5โ€ฆ86b4d6 genesis="&{header:0xc0000fefc0 uncles:[] transactions:[] randomness:0xc0000ce700 hash:{v:[146 178 165 187 201 177 181 25 121 173 213 235 110 121 112 126 10 177 62 45 65 179 190 108 230 148 125 87 157 134 180 214]} size:{v:<nil>} td:<nil> ReceivedAt:0001-01-01 00:00:00 +0000 UTC ReceivedFrom:<nil>}" INFO [11-20|09:33:51.123] Ethereum handshake HASH id=bf12068ddde6912f conn=staticdial hash=92b2a5โ€ฆ86b4d6 genesis="&{header:0xc0000fefc0 uncles:[] transactions:[] randomness:0xc0000ce700 hash:{v:[146 178 165 187 201 177 181 25 121 173 213 235 110 121 112 126 10 177 62 45 65 179 190 108 230 148 125 87 157 134 180 214]} size:{v:<nil>} td:<nil> ReceivedAt:0001-01-01 00:00:00 +0000 UTC ReceivedFrom:<nil>}" INFO [11-20|09:33:51.125] Ethereum handshake HASH id=d940b682c3bdf911 conn=staticdial hash=92b2a5โ€ฆ86b4d6 genesis="&{header:0xc0000fefc0 uncles:[] transactions:[] randomness:0xc0000ce700 hash:{v:[146 178 165 187 201 177 181 25 121 173 213 235 110 121 112 126 10 177 62 45 65 179 190 108 230 148 125 87 157 134 180 214]} size:{v:<nil>} td:<nil> ReceivedAt:0001-01-01 00:00:00 +0000 UTC ReceivedFrom:<nil>}" INFO [11-20|09:33:51.126] Ethereum handshake HASH id=2392f7a01f53abd8 conn=staticdial hash=92b2a5โ€ฆ86b4d6 genesis="&{header:0xc0000fefc0 uncles:[] transactions:[] randomness:0xc0000ce700 hash:{v:[146 178 165 187 201 177 181 25 121 173 213 235 110 121 112 126 10 177 62 45 65 179 190 
108 230 148 125 87 157 134 180 214]} size:{v:<nil>} td:<nil> ReceivedAt:0001-01-01 00:00:00 +0000 UTC ReceivedFrom:<nil>}" INFO [11-20|09:33:51.128] Ethereum handshake HASH id=92f89d75a3379528 conn=staticdial hash=92b2a5โ€ฆ86b4d6 genesis="&{header:0xc0000fefc0 uncles:[] transactions:[] randomness:0xc0000ce700 hash:{v:[146 178 165 187 201 177 181 25 121 173 213 235 110 121 112 126 10 177 62 45 65 179 190 108 230 148 125 87 157 134 180 214]} size:{v:<nil>} td:<nil> ReceivedAt:0001-01-01 00:00:00 +0000 UTC ReceivedFrom:<nil>}" INFO [11-20|09:34:23.070] Ethereum handshake HASH id=92f89d75a3379528 conn=staticdial hash=92b2a5โ€ฆ86b4d6 genesis="&{header:0xc0000fefc0 uncles:[] transactions:[] randomness:0xc0000ce700 hash:{v:[146 178 165 187 201 177 181 25 121 173 213 235 110 121 112 126 10 177 62 45 65 179 190 108 230 148 125 87 157 134 180 214]} size:{v:<nil>} td:<nil> ReceivedAt:0001-01-01 00:00:00 +0000 UTC ReceivedFrom:<nil>}" INFO [11-20|09:34:23.082] Ethereum handshake HASH id=9da22b8e5c6122d2 conn=staticdial hash=92b2a5โ€ฆ86b4d6 genesis="&{header:0xc0000fefc0 uncles:[] transactions:[] randomness:0xc0000ce700 hash:{v:[146 178 165 187 201 177 181 25 121 173 213 235 110 121 112 126 10 177 62 45 65 179 190 108 230 148 125 87 157 134 180 214]} size:{v:<nil>} td:<nil> ReceivedAt:0001-01-01 00:00:00 +0000 UTC ReceivedFrom:<nil>}" INFO [11-20|09:34:23.093] Ethereum handshake HASH id=df431a8d3e701951 conn=staticdial hash=92b2a5โ€ฆ86b4d6 genesis="&{header:0xc0000fefc0 uncles:[] transactions:[] randomness:0xc0000ce700 hash:{v:[146 178 165 187 201 177 181 25 121 173 213 235 110 121 112 126 10 177 62 45 65 179 190 108 230 148 125 87 157 134 180 214]} size:{v:<nil>} td:<nil> ReceivedAt:0001-01-01 00:00:00 +0000 UTC ReceivedFrom:<nil>}" INFO [11-20|09:34:23.096] Ethereum handshake HASH id=d940b682c3bdf911 conn=staticdial hash=92b2a5โ€ฆ86b4d6 genesis="&{header:0xc0000fefc0 uncles:[] transactions:[] randomness:0xc0000ce700 hash:{v:[146 178 165 187 201 177 181 25 121 173 213 235 110 
121 112 126 10 177 62 45 65 179 190 108 230 148 125 87 157 134 180 214]} size:{v:<nil>} td:<nil> ReceivedAt:0001-01-01 00:00:00 +0000 UTC ReceivedFrom:<nil>}" INFO [11-20|09:34:23.099] Ethereum handshake HASH id=069c4c46b5929ece conn=staticdial hash=92b2a5โ€ฆ86b4d6 genesis="&{header:0xc0000fefc0 uncles:[] transactions:[] randomness:0xc0000ce700 hash:{v:[146 178 165 187 201 177 181 25 121 173 213 235 110 121 112 126 10 177 62 45 65 179 190 108 230 148 125 87 157 134 180 214]} size:{v:<nil>} td:<nil> ReceivedAt:0001-01-01 00:00:00 +0000 UTC ReceivedFrom:<nil>}" INFO [11-20|09:34:23.103] Ethereum handshake HASH id=bf12068ddde6912f conn=staticdial hash=92b2a5โ€ฆ86b4d6 genesis="&{header:0xc0000fefc0 uncles:[] transactions:[] randomness:0xc0000ce700 hash:{v:[146 178 165 187 201 177 181 25 121 173 213 235 110 121 112 126 10 177 62 45 65 179 190 108 230 148 125 87 157 134 180 214]} size:{v:<nil>} td:<nil> ReceivedAt:0001-01-01 00:00:00 +0000 UTC ReceivedFrom:<nil>}" INFO [11-20|09:34:23.107] Ethereum handshake HASH id=2392f7a01f53abd8 conn=staticdial hash=92b2a5โ€ฆ86b4d6 genesis="&{header:0xc0000fefc0 uncles:[] transactions:[] randomness:0xc0000ce700 hash:{v:[146 178 165 187 201 177 181 25 121 173 213 235 110 121 112 126 10 177 62 45 65 179 190 108 230 148 125 87 157 134 180 214]} size:{v:<nil>} td:<nil> ReceivedAt:0001-01-01 00:00:00 +0000 UTC ReceivedFrom:<nil>}" INFO [11-20|09:34:23.109] Ethereum handshake HASH id=e5264af1f57e3026 conn=staticdial hash=92b2a5โ€ฆ86b4d6 genesis="&{header:0xc0000fefc0 uncles:[] transactions:[] randomness:0xc0000ce700 hash:{v:[146 178 165 187 201 177 181 25 121 173 213 235 110 121 112 126 10 177 62 45 65 179 190 108 230 148 125 87 157 134 180 214]} size:{v:<nil>} td:<nil> ReceivedAt:0001-01-01 00:00:00 +0000 UTC ReceivedFrom:<nil>}" INFO [11-20|09:34:23.110] Ethereum handshake HASH id=947523bf1d9b8976 conn=staticdial hash=92b2a5โ€ฆ86b4d6 genesis="&{header:0xc0000fefc0 uncles:[] transactions:[] randomness:0xc0000ce700 hash:{v:[146 178 165 187 
201 177 181 25 121 173 213 235 110 121 112 126 10 177 62 45 65 179 190 108 230 148 125 87 157 134 180 214]} size:{v:<nil>} td:<nil> ReceivedAt:0001-01-01 00:00:00 +0000 UTC ReceivedFrom:<nil>}" INFO [11-20|09:34:23.112] Ethereum handshake HASH id=2ef4d3aec48f17d3 conn=staticdial hash=92b2a5โ€ฆ86b4d6 genesis="&{header:0xc0000fefc0 uncles:[] transactions:[] randomness:0xc0000ce700 hash:{v:[146 178 165 187 201 177 181 25 121 173 213 235 110 121 112 126 10 177 62 45 65 179 190 108 230 148 125 87 157 134 180 214]} size:{v:<nil>} td:<nil> ReceivedAt:0001-01-01 00:00:00 +0000 UTC ReceivedFrom:<nil>}" ```
1.0
Alfajores node never syncs - ### Expected Behavior Running an alfajores full node and should be able to sync the blockchain ### Current Behavior Alfajores nodes do not sync Integration nodes sync fine ### Steps to reproduce I am running celo nodes with the scripts from this PR: https://github.com/celo-org/celo-blockchain/pull/613/files like `$ ./run-node-docker.sh alajores 44782 us.gcr.io/celo-testnet/celo-node:alfajores 8547 8548 30304` It is always ending up with a log of lots of handshakes but never ever retrieving any chain data. It happens with the docker image from google cloud and custom built docker image from `celo-blockchain` master branch. It looks like it is not able to reach the boot nodes. Full log here: ``` sebastian:celo-blockchain/ (feature/node_runner✗) $ ./run-node-docker.sh alajores 44782 us.gcr.io/celo-testnet/celo-node:alfajores 8547 8548 30304 Using Celo address: 9bbc4fe69f24c70527bd083e78081ba0e2bf3984 INFO [11-20|09:33:47.480] Maximum peer count ETH=25 LES=99 total=124 INFO [11-20|09:33:47.487] Allocated cache and file handles database=/root/.celo/geth/chaindata cache=16 handles=16 INFO [11-20|09:33:47.564] Writing custom genesis block INFO [11-20|09:33:47.571] Persisted trie from memory database nodes=103 size=17.78kB time=2.1954ms gcnodes=0 gcsize=0.00B gctime=0s livenodes=1 livesize=0.00B INFO [11-20|09:33:47.581] HASH2 hash=92b2a5…86b4d6 INFO [11-20|09:33:47.581] Successfully wrote genesis state database=chaindata hash=92b2a5…86b4d6 INFO [11-20|09:33:47.581] Allocated cache and file handles database=/root/.celo/geth/lightchaindata cache=16 handles=16 INFO [11-20|09:33:47.604] Writing custom genesis block INFO [11-20|09:33:47.610] Persisted trie from memory database nodes=103 size=17.78kB time=1.2841ms gcnodes=0 gcsize=0.00B gctime=0s livenodes=1 livesize=0.00B INFO [11-20|09:33:47.615] HASH2 hash=92b2a5…86b4d6 INFO [11-20|09:33:47.615] Successfully wrote genesis state database=lightchaindata hash=92b2a5…86b4d6 INFO 
[11-20|09:33:47.615] Allocated cache and file handles database=/root/.celo/geth/ultralightchaindata cache=16 handles=16 INFO [11-20|09:33:47.642] Writing custom genesis block INFO [11-20|09:33:47.646] Persisted trie from memory database nodes=103 size=17.78kB time=1.3452ms gcnodes=0 gcsize=0.00B gctime=0s livenodes=1 livesize=0.00B INFO [11-20|09:33:47.653] HASH2 hash=92b2a5…86b4d6 INFO [11-20|09:33:47.653] Successfully wrote genesis state database=ultralightchaindata hash=92b2a5…86b4d6 INFO [11-20|09:33:50.471] Maximum peer count ETH=100 LES=1000 total=1100 WARN [11-20|09:33:50.491] Found deprecated node list file /root/.celo/static-nodes.json, please use the TOML config file instead. INFO [11-20|09:33:50.501] Starting peer-to-peer node instance=Geth/v1.8.23-stable/linux-amd64/go1.11.13 INFO [11-20|09:33:50.502] Allocated cache and file handles database=/root/.celo/geth/chaindata cache=768 handles=524288 INFO [11-20|09:33:50.714] Initialised chain configuration config="{ChainID: 44785 Homestead: 0 DAO: <nil> DAOSupport: false EIP150: 0 EIP155: 0 EIP158: 0 Byzantium: 0 Constantinople: 0 ConstantinopleFix: 0 Engine: istanbul}" INFO [11-20|09:33:50.715] Creating dir dir=/root/.celo/geth/istanbul INFO [11-20|09:33:50.716] Initialising Ethereum protocol versions=[64] network=44782 INFO [11-20|09:33:50.720] Loaded most recent local header number=0 hash=92b2a5…86b4d6 td=1 age=1y2mo4w INFO [11-20|09:33:50.720] Loaded most recent local full block number=0 hash=92b2a5…86b4d6 td=1 age=1y2mo4w INFO [11-20|09:33:50.720] Loaded most recent local fast block number=0 hash=92b2a5…86b4d6 td=1 age=1y2mo4w INFO [11-20|09:33:50.724] Regenerated local transaction journal transactions=0 accounts=0 INFO [11-20|09:33:50.766] UDP listener up net=enode://a9b0789b986d70e51c9aec376d28847fed05ee909e3d5ccf5cef57d57549bba938ab85c243ba83f6b9db331adee27f849aff27eaa40ae8b4ef98584da299a7b1@[::]:30303 INFO [11-20|09:33:50.772] New local node record seq=1 id=f82efd32270286a9 ip=127.0.0.1 
udp=30303 tcp=30303 INFO [11-20|09:33:50.772] Started P2P networking self=enode://a9b0789b986d70e51c9aec376d28847fed05ee909e3d5ccf5cef57d57549bba938ab85c243ba83f6b9db331adee27f849aff27eaa40ae8b4ef98584da299a7b1@127.0.0.1:30303 INFO [11-20|09:33:50.777] Starting topic registration topic=LES2@92b2a5bbc9b1b519 INFO [11-20|09:33:50.780] IPC endpoint opened url=/root/.celo/geth.ipc INFO [11-20|09:33:50.781] HTTP endpoint opened url=http://0.0.0.0:8545 cors= vhosts=localhost INFO [11-20|09:33:51.098] Ethereum handshake HASH id=9da22b8e5c6122d2 conn=staticdial hash=92b2a5…86b4d6 genesis="&{header:0xc0000fefc0 uncles:[] transactions:[] randomness:0xc0000ce700 hash:{v:[146 178 165 187 201 177 181 25 121 173 213 235 110 121 112 126 10 177 62 45 65 179 190 108 230 148 125 87 157 134 180 214]} size:{v:<nil>} td:<nil> ReceivedAt:0001-01-01 00:00:00 +0000 UTC ReceivedFrom:<nil>}" INFO [11-20|09:33:51.106] Ethereum handshake HASH id=947523bf1d9b8976 conn=staticdial hash=92b2a5…86b4d6 genesis="&{header:0xc0000fefc0 uncles:[] transactions:[] randomness:0xc0000ce700 hash:{v:[146 178 165 187 201 177 181 25 121 173 213 235 110 121 112 126 10 177 62 45 65 179 190 108 230 148 125 87 157 134 180 214]} size:{v:<nil>} td:<nil> ReceivedAt:0001-01-01 00:00:00 +0000 UTC ReceivedFrom:<nil>}" INFO [11-20|09:33:51.111] Ethereum handshake HASH id=df431a8d3e701951 conn=staticdial hash=92b2a5…86b4d6 genesis="&{header:0xc0000fefc0 uncles:[] transactions:[] randomness:0xc0000ce700 hash:{v:[146 178 165 187 201 177 181 25 121 173 213 235 110 121 112 126 10 177 62 45 65 179 190 108 230 148 125 87 157 134 180 214]} size:{v:<nil>} td:<nil> ReceivedAt:0001-01-01 00:00:00 +0000 UTC ReceivedFrom:<nil>}" INFO [11-20|09:33:51.115] Ethereum handshake HASH id=069c4c46b5929ece conn=staticdial hash=92b2a5…86b4d6 genesis="&{header:0xc0000fefc0 uncles:[] transactions:[] randomness:0xc0000ce700 hash:{v:[146 178 165 187 201 177 181 25 121 173 213 235 110 121 112 126 10 177 62 45 65 179 190 108 230 148 125 87 
157 134 180 214]} size:{v:<nil>} td:<nil> ReceivedAt:0001-01-01 00:00:00 +0000 UTC ReceivedFrom:<nil>}" INFO [11-20|09:33:51.116] Ethereum handshake HASH id=e5264af1f57e3026 conn=staticdial hash=92b2a5…86b4d6 genesis="&{header:0xc0000fefc0 uncles:[] transactions:[] randomness:0xc0000ce700 hash:{v:[146 178 165 187 201 177 181 25 121 173 213 235 110 121 112 126 10 177 62 45 65 179 190 108 230 148 125 87 157 134 180 214]} size:{v:<nil>} td:<nil> ReceivedAt:0001-01-01 00:00:00 +0000 UTC ReceivedFrom:<nil>}" INFO [11-20|09:33:51.119] Ethereum handshake HASH id=2ef4d3aec48f17d3 conn=staticdial hash=92b2a5…86b4d6 genesis="&{header:0xc0000fefc0 uncles:[] transactions:[] randomness:0xc0000ce700 hash:{v:[146 178 165 187 201 177 181 25 121 173 213 235 110 121 112 126 10 177 62 45 65 179 190 108 230 148 125 87 157 134 180 214]} size:{v:<nil>} td:<nil> ReceivedAt:0001-01-01 00:00:00 +0000 UTC ReceivedFrom:<nil>}" INFO [11-20|09:33:51.123] Ethereum handshake HASH id=bf12068ddde6912f conn=staticdial hash=92b2a5…86b4d6 genesis="&{header:0xc0000fefc0 uncles:[] transactions:[] randomness:0xc0000ce700 hash:{v:[146 178 165 187 201 177 181 25 121 173 213 235 110 121 112 126 10 177 62 45 65 179 190 108 230 148 125 87 157 134 180 214]} size:{v:<nil>} td:<nil> ReceivedAt:0001-01-01 00:00:00 +0000 UTC ReceivedFrom:<nil>}" INFO [11-20|09:33:51.125] Ethereum handshake HASH id=d940b682c3bdf911 conn=staticdial hash=92b2a5…86b4d6 genesis="&{header:0xc0000fefc0 uncles:[] transactions:[] randomness:0xc0000ce700 hash:{v:[146 178 165 187 201 177 181 25 121 173 213 235 110 121 112 126 10 177 62 45 65 179 190 108 230 148 125 87 157 134 180 214]} size:{v:<nil>} td:<nil> ReceivedAt:0001-01-01 00:00:00 +0000 UTC ReceivedFrom:<nil>}" INFO [11-20|09:33:51.126] Ethereum handshake HASH id=2392f7a01f53abd8 conn=staticdial hash=92b2a5…86b4d6 genesis="&{header:0xc0000fefc0 uncles:[] transactions:[] randomness:0xc0000ce700 hash:{v:[146 178 165 187 201 177 
62 45 65 179 190 108 230 148 125 87 157 134 180 214]} size:{v:<nil>} td:<nil> ReceivedAt:0001-01-01 00:00:00 +0000 UTC ReceivedFrom:<nil>}" INFO [11-20|09:33:51.128] Ethereum handshake HASH id=92f89d75a3379528 conn=staticdial hash=92b2a5…86b4d6 genesis="&{header:0xc0000fefc0 uncles:[] transactions:[] randomness:0xc0000ce700 hash:{v:[146 178 165 187 201 177 181 25 121 173 213 235 110 121 112 126 10 177 62 45 65 179 190 108 230 148 125 87 157 134 180 214]} size:{v:<nil>} td:<nil> ReceivedAt:0001-01-01 00:00:00 +0000 UTC ReceivedFrom:<nil>}" INFO [11-20|09:34:23.070] Ethereum handshake HASH id=92f89d75a3379528 conn=staticdial hash=92b2a5…86b4d6 genesis="&{header:0xc0000fefc0 uncles:[] transactions:[] randomness:0xc0000ce700 hash:{v:[146 178 165 187 201 177 181 25 121 173 213 235 110 121 112 126 10 177 62 45 65 179 190 108 230 148 125 87 157 134 180 214]} size:{v:<nil>} td:<nil> ReceivedAt:0001-01-01 00:00:00 +0000 UTC ReceivedFrom:<nil>}" INFO [11-20|09:34:23.082] Ethereum handshake HASH id=9da22b8e5c6122d2 conn=staticdial hash=92b2a5…86b4d6 genesis="&{header:0xc0000fefc0 uncles:[] transactions:[] randomness:0xc0000ce700 hash:{v:[146 178 165 187 201 177 181 25 121 173 213 235 110 121 112 126 10 177 62 45 65 179 190 108 230 148 125 87 157 134 180 214]} size:{v:<nil>} td:<nil> ReceivedAt:0001-01-01 00:00:00 +0000 UTC ReceivedFrom:<nil>}" INFO [11-20|09:34:23.093] Ethereum handshake HASH id=df431a8d3e701951 conn=staticdial hash=92b2a5…86b4d6 genesis="&{header:0xc0000fefc0 uncles:[] transactions:[] randomness:0xc0000ce700 hash:{v:[146 178 165 187 201 177 181 25 121 173 213 235 110 121 112 126 10 177 62 45 65 179 190 108 230 148 125 87 157 134 180 214]} size:{v:<nil>} td:<nil> ReceivedAt:0001-01-01 00:00:00 +0000 UTC ReceivedFrom:<nil>}" INFO [11-20|09:34:23.096] Ethereum handshake HASH id=d940b682c3bdf911 conn=staticdial hash=92b2a5…86b4d6 genesis="&{header:0xc0000fefc0 uncles:[] transactions:[] randomness:0xc0000ce700 hash:{v:[146 178 165 187 201 177 181 25 121 
173 213 235 110 121 112 126 10 177 62 45 65 179 190 108 230 148 125 87 157 134 180 214]} size:{v:<nil>} td:<nil> ReceivedAt:0001-01-01 00:00:00 +0000 UTC ReceivedFrom:<nil>}" INFO [11-20|09:34:23.099] Ethereum handshake HASH id=069c4c46b5929ece conn=staticdial hash=92b2a5…86b4d6 genesis="&{header:0xc0000fefc0 uncles:[] transactions:[] randomness:0xc0000ce700 hash:{v:[146 178 165 187 201 177 181 25 121 173 213 235 110 121 112 126 10 177 62 45 65 179 190 108 230 148 125 87 157 134 180 214]} size:{v:<nil>} td:<nil> ReceivedAt:0001-01-01 00:00:00 +0000 UTC ReceivedFrom:<nil>}" INFO [11-20|09:34:23.103] Ethereum handshake HASH id=bf12068ddde6912f conn=staticdial hash=92b2a5…86b4d6 genesis="&{header:0xc0000fefc0 uncles:[] transactions:[] randomness:0xc0000ce700 hash:{v:[146 178 165 187 201 177 181 25 121 173 213 235 110 121 112 126 10 177 62 45 65 179 190 108 230 148 125 87 157 134 180 214]} size:{v:<nil>} td:<nil> ReceivedAt:0001-01-01 00:00:00 +0000 UTC ReceivedFrom:<nil>}" INFO [11-20|09:34:23.107] Ethereum handshake HASH id=2392f7a01f53abd8 conn=staticdial hash=92b2a5…86b4d6 genesis="&{header:0xc0000fefc0 uncles:[] transactions:[] randomness:0xc0000ce700 hash:{v:[146 178 165 187 201 177 181 25 121 173 213 235 110 121 112 126 10 177 62 45 65 179 190 108 230 148 125 87 157 134 180 214]} size:{v:<nil>} td:<nil> ReceivedAt:0001-01-01 00:00:00 +0000 UTC ReceivedFrom:<nil>}" INFO [11-20|09:34:23.109] Ethereum handshake HASH id=e5264af1f57e3026 conn=staticdial hash=92b2a5…86b4d6 genesis="&{header:0xc0000fefc0 uncles:[] transactions:[] randomness:0xc0000ce700 hash:{v:[146 178 165 187 201 177 181 25 121 173 213 235 110 121 112 126 10 177 62 45 65 179 190 108 230 148 125 87 157 134 180 214]} size:{v:<nil>} td:<nil> ReceivedAt:0001-01-01 00:00:00 +0000 UTC ReceivedFrom:<nil>}" INFO [11-20|09:34:23.110] Ethereum handshake HASH id=947523bf1d9b8976 conn=staticdial hash=92b2a5…86b4d6 genesis="&{header:0xc0000fefc0 uncles:[] transactions:[] randomness:0xc0000ce700 
hash:{v:[146 178 165 187 201 177 181 25 121 173 213 235 110 121 112 126 10 177 62 45 65 179 190 108 230 148 125 87 157 134 180 214]} size:{v:<nil>} td:<nil> ReceivedAt:0001-01-01 00:00:00 +0000 UTC ReceivedFrom:<nil>}" INFO [11-20|09:34:23.112] Ethereum handshake HASH id=2ef4d3aec48f17d3 conn=staticdial hash=92b2a5…86b4d6 genesis="&{header:0xc0000fefc0 uncles:[] transactions:[] randomness:0xc0000ce700 hash:{v:[146 178 165 187 201 177 181 25 121 173 213 235 110 121 112 126 10 177 62 45 65 179 190 108 230 148 125 87 157 134 180 214]} size:{v:<nil>} td:<nil> ReceivedAt:0001-01-01 00:00:00 +0000 UTC ReceivedFrom:<nil>}" ```
non_usab
alfajores node never syncs expected behavior running an alfajores full node and should be able to sync the blockchain current behavior alfajores nodes do not sync integration nodes sync fine steps to reproduce i am running celo nodes with the scripts from this pr like run node docker sh alajores us gcr io celo testnet celo node alfajores it is always ending up with a log of lots of handshakes but never ever retrieving any chain data it happens with the docker image from google cloud and custom built docker image from celo blockchain master branch it looks like it is not able to reach the boot nodes full log here sebastian celo blockchain feature node runner✗ run node docker sh alajores us gcr io celo testnet celo node alfajores using celo address info maximum peer count eth les total info allocated cache and file handles database root celo geth chaindata cache handles info writing custom genesis block info persisted trie from memory database nodes size time gcnodes gcsize gctime livenodes livesize info hash … info successfully wrote genesis state database chaindata hash … info allocated cache and file handles database root celo geth lightchaindata cache handles info writing custom genesis block info persisted trie from memory database nodes size time gcnodes gcsize gctime livenodes livesize info hash … info successfully wrote genesis state database lightchaindata hash … info allocated cache and file handles database root celo geth ultralightchaindata cache handles info writing custom genesis block info persisted trie from memory database nodes size time gcnodes gcsize gctime livenodes livesize info hash … info successfully wrote genesis state database ultralightchaindata hash … info maximum peer count eth les total warn found deprecated node list file root celo static nodes json please use the toml config file instead info starting peer to peer node instance geth stable linux info allocated cache and file handles database root celo geth 
chaindata cache handles info initialised chain configuration config chainid homestead dao daosupport false byzantium constantinople constantinoplefix engine istanbul info creating dir dir root celo geth istanbul info initialising ethereum protocol versions network info loaded most recent local header number hash … td age info loaded most recent local full block number hash … td age info loaded most recent local fast block number hash … td age info regenerated local transaction journal transactions accounts info udp listener up net enode info new local node record seq id ip udp tcp info started networking self enode info starting topic registration topic info ipc endpoint opened url root celo geth ipc info http endpoint opened url cors vhosts localhost info ethereum handshake hash id conn staticdial hash … genesis header uncles transactions randomness hash v size v td receivedat utc receivedfrom info ethereum handshake hash id conn staticdial hash … genesis header uncles transactions randomness hash v size v td receivedat utc receivedfrom info ethereum handshake hash id conn staticdial hash … genesis header uncles transactions randomness hash v size v td receivedat utc receivedfrom info ethereum handshake hash id conn staticdial hash … genesis header uncles transactions randomness hash v size v td receivedat utc receivedfrom info ethereum handshake hash id conn staticdial hash … genesis header uncles transactions randomness hash v size v td receivedat utc receivedfrom info ethereum handshake hash id conn staticdial hash … genesis header uncles transactions randomness hash v size v td receivedat utc receivedfrom info ethereum handshake hash id conn staticdial hash … genesis header uncles transactions randomness hash v size v td receivedat utc receivedfrom info ethereum handshake hash id conn staticdial hash … genesis header uncles transactions randomness hash v size v td receivedat utc receivedfrom info ethereum handshake hash id conn staticdial 
hash … genesis header uncles transactions randomness hash v size v td receivedat utc receivedfrom info ethereum handshake hash id conn staticdial hash … genesis header uncles transactions randomness hash v size v td receivedat utc receivedfrom info ethereum handshake hash id conn staticdial hash … genesis header uncles transactions randomness hash v size v td receivedat utc receivedfrom info ethereum handshake hash id conn staticdial hash … genesis header uncles transactions randomness hash v size v td receivedat utc receivedfrom info ethereum handshake hash id conn staticdial hash … genesis header uncles transactions randomness hash v size v td receivedat utc receivedfrom info ethereum handshake hash id conn staticdial hash … genesis header uncles transactions randomness hash v size v td receivedat utc receivedfrom info ethereum handshake hash id conn staticdial hash … genesis header uncles transactions randomness hash v size v td receivedat utc receivedfrom info ethereum handshake hash id conn staticdial hash … genesis header uncles transactions randomness hash v size v td receivedat utc receivedfrom info ethereum handshake hash id conn staticdial hash … genesis header uncles transactions randomness hash v size v td receivedat utc receivedfrom info ethereum handshake hash id conn staticdial hash … genesis header uncles transactions randomness hash v size v td receivedat utc receivedfrom info ethereum handshake hash id conn staticdial hash … genesis header uncles transactions randomness hash v size v td receivedat utc receivedfrom info ethereum handshake hash id conn staticdial hash … genesis header uncles transactions randomness hash v size v td receivedat utc receivedfrom
0
8,667
5,906,605,709
IssuesEvent
2017-05-19 15:32:57
bardsoftware/ganttproject
https://api.github.com/repos/bardsoftware/ganttproject
closed
Mac: Command key toggles task edit mode
bug Fix committed to the repository OpSys-OSX UI Usability
When pressing command key on mac, ganttproject goes in edit mode as if a normal character had been pressed. Expected behaviour: do not react, same as if ctrl key was pressed.
True
Mac: Command key toggles task edit mode - When pressing command key on mac, ganttproject goes in edit mode as if a normal character had been pressed. Expected behaviour: do not react, same as if ctrl key was pressed.
usab
mac command key toggles task edit mode when pressing command key on mac ganttproject goes in edit mode as if a normal character had been pressed expected behaviour do not react same as if ctrl key was pressed
1
650,146
21,336,292,630
IssuesEvent
2022-04-18 14:57:11
ita-social-projects/horondi_client_fe
https://api.github.com/repos/ita-social-projects/horondi_client_fe
closed
[Burger Menu] Wishlist and Cart icons overlap Facebook and Instagram icons
bug UI priority: low severity: trivial
Environment: OS Windows 10 Pro (Version 10.0.19043), Google Chrome Version 98.0.4758.82 (64-bit) Reproducible: Always **Preconditions:** 1) Horondi homepage is open https://horondi-front-staging.azurewebsites.net/ **Steps to reproduce:** 1. Open burger menu 2. Click Ctrl+ to zoom in to 100% **Actual result:** Wishlist and Cart icons overlap Facebook and Instagram icons after zoom in to more than 100%. Please check attached video. **Expected result:** Wishlist and Cart icons do not overlap Facebook and Instagram icons after zoom in to more than 100%. https://user-images.githubusercontent.com/101739835/161530034-d2db9c96-c30f-48ac-aa8a-7403ab93802a.mp4
1.0
[Burger Menu] Wishlist and Cart icons overlap Facebook and Instagram icons - Environment: OS Windows 10 Pro (Version 10.0.19043), Google Chrome Version 98.0.4758.82 (64-bit) Reproducible: Always **Preconditions:** 1) Horondi homepage is open https://horondi-front-staging.azurewebsites.net/ **Steps to reproduce:** 1. Open burger menu 2. Click Ctrl+ to zoom in to 100% **Actual result:** Wishlist and Cart icons overlap Facebook and Instagram icons after zoom in to more than 100%. Please check attached video. **Expected result:** Wishlist and Cart icons do not overlap Facebook and Instagram icons after zoom in to more than 100%. https://user-images.githubusercontent.com/101739835/161530034-d2db9c96-c30f-48ac-aa8a-7403ab93802a.mp4
non_usab
wishlist and cart icons overlap facebook and instagram icons environment os windows pro version google chrome version bit reproducible always preconditions horondi homepage is open steps to reproduce open burger menu click ctrl to zoom in to actual result wishlist and cart icons overlap facebook and instagram icons after zoom in to more than please check attached video expected result wishlist and cart icons do not overlap facebook and instagram icons after zoom in to more than
0
11,012
7,016,885,960
IssuesEvent
2017-12-21 07:01:00
integration-team-iiith/colloid-and-surface-chemistry-iiith-javascript-lab
https://api.github.com/repos/integration-team-iiith/colloid-and-surface-chemistry-iiith-javascript-lab
closed
QA_Demonstration of Tyndall Effect_experiment_quiz-section-is-not-given(Fe(OH)3-solution)
bug Category : Functionality Category : Usability Developed : By VLEAD Resolved Severity : S1
Defect Description : In the original experiment section, once the Fe(OH)3 solution is mixed the user will be able to check the effect of the solution by clicking on the prepared Fe(OH)3 solution. On click, the user will see a pop up with the detailed observation of Tyndall effect of the Fe(OH)3 solution. Which is not given in the converted experiment. Actual Result : In the original experiment section, once the Fe(OH)3 solution is mixed the user will be able to check the effect of the solution by clicking on the prepared Fe(OH)3 solution. On click, the user will see a pop up with the detailed observation of Tyndall effect of the Fe(OH)3 solution. Which is not given in the converted experiment. Environment : OS: Windows 7, Ubuntu-16.04,Centos-6 Browsers:Firefox-42.0,Chrome-47.0,chromium-45.0 Bandwidth : 100Mbps Hardware Configuration:8GBRAM , Processor:i5 Attachments: ![qa_csc_i040](https://user-images.githubusercontent.com/13479177/33979256-905c7d02-e0c9-11e7-8848-d42dcffc42ea.png)
True
QA_Demonstration of Tyndall Effect_experiment_quiz-section-is-not-given(Fe(OH)3-solution) - Defect Description : In the original experiment section, once the Fe(OH)3 solution is mixed the user will be able to check the effect of the solution by clicking on the prepared Fe(OH)3 solution. On click, the user will see a pop up with the detailed observation of Tyndall effect of the Fe(OH)3 solution. Which is not given in the converted experiment. Actual Result : In the original experiment section, once the Fe(OH)3 solution is mixed the user will be able to check the effect of the solution by clicking on the prepared Fe(OH)3 solution. On click, the user will see a pop up with the detailed observation of Tyndall effect of the Fe(OH)3 solution. Which is not given in the converted experiment. Environment : OS: Windows 7, Ubuntu-16.04,Centos-6 Browsers:Firefox-42.0,Chrome-47.0,chromium-45.0 Bandwidth : 100Mbps Hardware Configuration:8GBRAM , Processor:i5 Attachments: ![qa_csc_i040](https://user-images.githubusercontent.com/13479177/33979256-905c7d02-e0c9-11e7-8848-d42dcffc42ea.png)
usab
qa demonstration of tyndall effect experiment quiz section is not given fe oh solution defect description in the original experiment section once the fe oh solution is mixed the user will be able to check the effect of the solution by clicking on the prepared fe oh solution on click the user will see a pop up with the detailed observation of tyndall effect of the fe oh solution which is not given in the converted experiment actual result in the original experiment section once the fe oh solution is mixed the user will be able to check the effect of the solution by clicking on the prepared fe oh solution on click the user will see a pop up with the detailed observation of tyndall effect of the fe oh solution which is not given in the converted experiment environment os windows ubuntu centos browsers firefox chrome chromium bandwidth hardware configuration processor attachments
1
4,547
3,871,725,015
IssuesEvent
2016-04-11 11:02:53
lionheart/openradar-mirror
https://api.github.com/repos/lionheart/openradar-mirror
opened
22670215: Xcode does not allow submitting iOS apps to TestFlight with beta versions
classification:ui/usability reproducible:always status:open
#### Description Summary: Only builds archived with GM versions of Xcode are allowed to be submitted to TestFlight, but this is extremely limiting. If TestFlight wants to establish itself as the de facto build system, it needs to become more flexible. I refuse to use other systems (like Fabric, HockeyApp, etc), because A: I have to deal with device UUIDs, and B: running the app through that is not production environment, so it makes it harder to test things like IAP, Push Notifications, etc. Not to mention that you can’t install Watch apps, and probably the same for Apple TV apps. If we, as developers, want to release quality and bug-free apps, we need to be able to test them from the early stages, not just during the very last week. Steps to Reproduce: - Create an iOS project with the latest Xcode Beta (7.1 Beta at the time of writing this bug). - Try to submit to TestFlight for Internal and External testing. Expected Results: - Build is accepted and can be used for Internal and External testing, though of course it can’t be submitted for review yet. Actual Results: - Build is rejected (see screenshot). Notes: In particular, I’m unable to archive apps using Xcode 7 GM right now (radars 22392501 and 22183332). Those are fixed in 7.1, and I’m already using this version to build my app, but not being able to deploy to TestFlight until who knows when is very frustrating. - Product Version: All Created: 2015-09-11T23:21:47.169590 Originated: 2015-09-11T16:21:00 Open Radar Link: http://www.openradar.me/22670215
True
22670215: Xcode does not allow submitting iOS apps to TestFlight with beta versions - #### Description Summary: Only builds archived with GM versions of Xcode are allowed to be submitted to TestFlight, but this is extremely limiting. If TestFlight wants to establish itself as the de facto build system, it needs to become more flexible. I refuse to use other systems (like Fabric, HockeyApp, etc), because A: I have to deal with device UUIDs, and B: running the app through that is not production environment, so it makes it harder to test things like IAP, Push Notifications, etc. Not to mention that you can’t install Watch apps, and probably the same for Apple TV apps. If we, as developers, want to release quality and bug-free apps, we need to be able to test them from the early stages, not just during the very last week. Steps to Reproduce: - Create an iOS project with the latest Xcode Beta (7.1 Beta at the time of writing this bug). - Try to submit to TestFlight for Internal and External testing. Expected Results: - Build is accepted and can be used for Internal and External testing, though of course it can’t be submitted for review yet. Actual Results: - Build is rejected (see screenshot). Notes: In particular, I’m unable to archive apps using Xcode 7 GM right now (radars 22392501 and 22183332). Those are fixed in 7.1, and I’m already using this version to build my app, but not being able to deploy to TestFlight until who knows when is very frustrating. - Product Version: All Created: 2015-09-11T23:21:47.169590 Originated: 2015-09-11T16:21:00 Open Radar Link: http://www.openradar.me/22670215
usab
xcode does not allow submitting ios apps to testflight with beta versions description summary only builds archived with gm versions of xcode are allowed to be submitted to testflight but this is extremely limiting if testflight wants to establish itself as the de facto build system it needs to become more flexible i refuse to use other systems like fabric hockeyapp etc because a i have to deal with device uuids and b running the app through that is not production environment so it makes it harder to test things like iap push notifications etc not to mention that you can’t install watch apps and probably the same for apple tv apps if we as developers want to release quality and bug free apps we need to be able to test them from the early stages not just during the very last week steps to reproduce create an ios project with the latest xcode beta beta at the time of writing this bug try to submit to testflight for internal and external testing expected results build is accepted and can be used for internal and external testing though of course it can’t be submitted for review yet actual results build is rejected see screenshot notes in particular i’m unable to archive apps using xcode gm right now radars and those are fixed in and i’m already using this version to build my app but not being able to deploy to testflight until who knows when is very frustrating product version all created originated open radar link
1
33,546
6,218,692,502
IssuesEvent
2017-07-09 05:08:45
Moya/Moya
https://api.github.com/repos/Moya/Moya
closed
Update ReactiveCocoa documentation to ReactiveSwift
documentation starter task
The documentation still refers to `ReactiveSwift` as `ReactiveCocoa` in both the examples' [README](https://github.com/Moya/Moya/blob/master/docs/Examples/README.md) as well as the [documentation itself](https://github.com/Moya/Moya/blob/master/docs/Examples/ReactiveCocoa.md) (including the document's name). This should be updated to use the `ReactiveSwift` name. Make sure to also fix all references to these documents - it seems like the only reference to it is in the README: https://github.com/Moya/Moya/search?l=Markdown&q=reactivecocoa.md&type=&utf8=✓.
1.0
Update ReactiveCocoa documentation to ReactiveSwift - The documentation still refers to `ReactiveSwift` as `ReactiveCocoa` in both the examples' [README](https://github.com/Moya/Moya/blob/master/docs/Examples/README.md) as well as the [documentation itself](https://github.com/Moya/Moya/blob/master/docs/Examples/ReactiveCocoa.md) (including the document's name). This should be updated to use the `ReactiveSwift` name. Make sure to also fix all references to these documents - it seems like the only reference to it is in the README: https://github.com/Moya/Moya/search?l=Markdown&q=reactivecocoa.md&type=&utf8=✓.
non_usab
update reactivecocoa documentation to reactiveswift the documentation still refers to reactiveswift as reactivecocoa in both the examples as well as the including the document s name this should be updated to use the reactiveswift name make sure to also fix all references to these documents it seems like the only reference to it is in the readme
0
37,911
15,394,622,679
IssuesEvent
2021-03-03 18:08:40
cityofaustin/atd-data-tech
https://api.github.com/repos/cityofaustin/atd-data-tech
opened
User Management | Backfill existing user roles in staging and production
Impact: 2-Major Product: Moped Project: Moped v1.0 Service: Dev
In https://github.com/cityofaustin/atd-data-tech/issues/4994, we added user roles to the DB so we can edit them through the Moped UI. Since existing users were created before this change, we need to backfill them in the DB with the new changes to the DB, API, and UI. - [ ] Check staging users and set roles - [ ] Check production users and set roles
1.0
User Management | Backfill existing user roles in staging and production - In https://github.com/cityofaustin/atd-data-tech/issues/4994, we added user roles to the DB so we can edit them through the Moped UI. Since existing users were created before this change, we need to backfill them in the DB with the new changes to the DB, API, and UI. - [ ] Check staging users and set roles - [ ] Check production users and set roles
non_usab
user management backfill existing user roles in staging and production in we added user roles to the db so we can edit them through the moped ui since existing users were created before this change we need to backfill them in the db with the new changes to the db api and ui check staging users and set roles check production users and set roles
0
20,276
15,214,546,935
IssuesEvent
2021-02-17 13:21:28
tremor-rs/tremor-runtime
https://api.github.com/repos/tremor-rs/tremor-runtime
opened
No hygienic errors in `tremor server run`
bug cli usability
**Problem** when providing a pipeline to `tremor server run` that failed to be parsed or compiled we do not get a hygienic error. Instead, we get a generic error that describes what is wrong but not where or even in which file. ``` [2021-02-17T13:18:38Z ERROR tremor::server] error: Found the token `` but expected one of `!`, `\``, `$`, `(`, `+`, `-`, `<<`, `<ident>`, `[`, `absent`, `args`, `bool`, `event`, `float`, `for`, `group`, `heredoc`, `int`, `match`, `merge`, `nil`, `not`, `patch`, `present`, `recur`, `state`, `window`, `{` error: Found the token `` but expected one of `!`, `\``, `$`, `(`, `+`, `-`, `<<`, `<ident>`, `[`, `absent`, `args`, `bool`, `event`, `float`, `for`, `group`, `heredoc`, `int`, `match`, `merge`, `nil`, `not`, `patch`, `present`, `recur`, `state`, `window`, `{` ``` **Steps** 1. create a bad file (i.e. `bad.trickle`: `select`) 2. execute the file `tremor server run -f bad.trickle` 3. observe the output **Possible Solution(s)** Add hygienic errors to server run file loading **Notes** Output of `rustup --version`: Output of `rustup show`: Output of `tremor --version`:
True
No hygenic errors in `tremor server run` - **Problem** when providing a pipeline to `tremor server run` that failed to be parsed or compiled we do not get a hygenic error. Instead, we get a generic error that describes what is wrong but not where or even in which file. ``` [2021-02-17T13:18:38Z ERROR tremor::server] error: Found the token `` but expected one of `!`, `\``, `$`, `(`, `+`, `-`, `<<`, `<ident>`, `[`, `absent`, `args`, `bool`, `event`, `float`, `for`, `group`, `heredoc`, `int`, `match`, `merge`, `nil`, `not`, `patch`, `present`, `recur`, `state`, `window`, `{` error: Found the token `` but expected one of `!`, `\``, `$`, `(`, `+`, `-`, `<<`, `<ident>`, `[`, `absent`, `args`, `bool`, `event`, `float`, `for`, `group`, `heredoc`, `int`, `match`, `merge`, `nil`, `not`, `patch`, `present`, `recur`, `state`, `window`, `{` ``` **Steps** 1. create a bad file (i.e. `bad.trickle`: `select`) 2. execute the file `tremor server run -f bad.trickle` 3. observe the output **Possible Solution(s)** Add hygenic errors to server run file loading **Notes** Output of `rustup --version`: Output of `rustup show`: Output of `tremor --version`:
usab
no hygenic errors in tremor server run problem when providing a pipeline to tremor server run that failed to be parsed or compiled we do not get a hygenic error instead we get a generic error that describes what is wrong but not where or even in which file error found the token but expected one of absent args bool event float for group heredoc int match merge nil not patch present recur state window error found the token but expected one of absent args bool event float for group heredoc int match merge nil not patch present recur state window steps create a bad file i e bad trickle select execute the file tremor server run f bad trickle observe the output possible solution s add hygenic errors to server run file loading notes output of rustup version output of rustup show output of tremor version
1
14,622
9,368,196,333
IssuesEvent
2019-04-03 08:06:59
godotengine/godot
https://api.github.com/repos/godotengine/godot
closed
autotile editor- ability to mark squares as filled shapes/ autofill squares with a single click
feature proposal topic:editor usability
Right now setting collusion,occlusion and navigation is extremely cumbersome, because you have to manually draw the shapes. It would save ALOT of time and effort if there is a quick command that simply fills the selected square with a square shape - instead of asking to manually create each single shape 99% of the time that is what we want anyway See how rpg maker has it: ![image_tileset](https://user-images.githubusercontent.com/6495061/33233271-136a273a-d20b-11e7-9e76-0c570d53dcb1.jpg) It takes one single click to mark a square as passable or non passable. We can also click and drag to set multiple squares as non passable and right click and drag- to set multiple squares as non passable. Can we get a tool that emulates that somehow? Autoshape fill tool?
True
autotile editor- ability to mark squares as filled shapes/ autofill squares with a single click - Right now setting collusion,occlusion and navigation is extremely cumbersome, because you have to manually draw the shapes. It would save ALOT of time and effort if there is a quick command that simply fills the selected square with a square shape - instead of asking to manually create each single shape 99% of the time that is what we want anyway See how rpg maker has it: ![image_tileset](https://user-images.githubusercontent.com/6495061/33233271-136a273a-d20b-11e7-9e76-0c570d53dcb1.jpg) It takes one single click to mark a square as passable or non passable. We can also click and drag to set multiple squares as non passable and right click and drag- to set multiple squares as non passable. Can we get a tool that emulates that somehow? Autoshape fill tool?
usab
autotile editor ability to mark squares as filled shapes autofill squares with a single click right now setting collusion occlusion and navigation is extremely cumbersome because you have to manually draw the shapes it would save alot of time and effort if there is a quick command that simply fills the selected square with a square shape instead of asking to manually create each single shape of the time that is what we want anyway see how rpg maker has it it takes one single click to mark a square as passable or non passable we can also click and drag to set multiple squares as non passable and right click and drag to set multiple squares as non passable can we get a tool that emulates that somehow autoshape fill tool
1
143,520
13,065,961,268
IssuesEvent
2020-07-30 20:44:19
gatsbyjs/gatsby
https://api.github.com/repos/gatsbyjs/gatsby
closed
[docs][guides] Other CSS Frameworks and Libraries page
not stale status: community assigned type: documentation
## Summary Add a new Reference Guide to the [CSS Libraries and Frameworks](https://www.gatsbyjs.org/docs/css-libraries-and-frameworks/) section for "Other Libraries" and mention Bootstrap, Material UI, Reach UI, Grommet, etc. We do NOT want to list every UI library, as that would be impossible to maintain. Here are the criteria for what should be listed on this page: - Works with React and Gatsby - Impacts styling of a Gatsby site (to fit in the parent section) - Commonly requested or searched in the Gatsby docs (the Gatsby team has the most insight into this part) - Considers accessibility and does not promote UI anti-patterns This initial list includes: - Grommet - Chakra UI - Material UI - Bootstrap (including how to use the optional JS code which includes jQuery) This could possibly include Reach UI, but that project intentionally leaves styling out of the framework so it might not fit in this section. ### Motivation As part of https://github.com/gatsbyjs/gatsby/issues/21278, we want to add a page to the docs that provides a short and curated list of CSS/UI libraries. We don't want to maintain individual pages for all libraries, so this is a bit of a catch-all page where we can list libraries with a link and a short paragraph about each one. ### Draft the doc - [ ] Write the doc, following the format listed in these resources: - [Overview on contributing to documentation](https://www.gatsbyjs.org/contributing/docs-contributions/) - [Docs Templates](https://www.gatsbyjs.org/contributing/docs-templates/) - [ ] Add the article to the [docs sidebar](https://github.com/gatsbyjs/gatsby/blob/master/www/src/data/sidebars/doc-links.yaml) under the Styling Your Site > CSS Libraries and Frameworks section. ### Open a pull request - [ ] Open a pull request with your work including the words "closes #21541" in the pull request description
1.0
[docs][guides] Other CSS Frameworks and Libraries page - ## Summary Add a new Reference Guide to the [CSS Libraries and Frameworks](https://www.gatsbyjs.org/docs/css-libraries-and-frameworks/) section for "Other Libraries" and mention Bootstrap, Material UI, Reach UI, Grommet, etc. We do NOT want to list every UI library, as that would be impossible to maintain. Here are the criteria for what should be listed on this page: - Works with React and Gatsby - Impacts styling of a Gatsby site (to fit in the parent section) - Commonly requested or searched in the Gatsby docs (the Gatsby team has the most insight into this part) - Considers accessibility and does not promote UI anti-patterns This initial list includes: - Grommet - Chakra UI - Material UI - Bootstrap (including how to use the optional JS code which includes jQuery) This could possibly include Reach UI, but that project intentionally leaves styling out of the framework so it might not fit in this section. ### Motivation As part of https://github.com/gatsbyjs/gatsby/issues/21278, we want to add a page to the docs that provides a short and curated list of CSS/UI libraries. We don't want to maintain individual pages for all libraries, so this is a bit of a catch-all page where we can list libraries with a link and a short paragraph about each one. ### Draft the doc - [ ] Write the doc, following the format listed in these resources: - [Overview on contributing to documentation](https://www.gatsbyjs.org/contributing/docs-contributions/) - [Docs Templates](https://www.gatsbyjs.org/contributing/docs-templates/) - [ ] Add the article to the [docs sidebar](https://github.com/gatsbyjs/gatsby/blob/master/www/src/data/sidebars/doc-links.yaml) under the Styling Your Site > CSS Libraries and Frameworks section. ### Open a pull request - [ ] Open a pull request with your work including the words "closes #21541" in the pull request description
non_usab
other css frameworks and libraries page summary add a new reference guide to the section for other libraries and mention bootstrap material ui reach ui grommet etc we do not want to list every ui library as that would be impossible to maintain here are the criteria for what should be listed on this page works with react and gatsby impacts styling of a gatsby site to fit in the parent section commonly requested or searched in the gatsby docs the gatsby team has the most insight into this part considers accessibility and does not promote ui anti patterns this initial list includes grommet chakra ui material ui bootstrap including how to use the optional js code which includes jquery this could possibly include reach ui but that project intentionally leaves styling out of the framework so it might not fit in this section motivation as part of we want to add a page to the docs that provides a short and curated list of css ui libraries we don t want to maintain individual pages for all libraries so this is a bit of a catch all page where we can list libraries with a link and a short paragraph about each one draft the doc write the doc following the format listed in these resources add the article to the under the styling your site css libraries and frameworks section open a pull request open a pull request with your work including the words closes in the pull request description
0
223,131
17,569,191,385
IssuesEvent
2021-08-14 10:12:39
cockroachdb/cockroach
https://api.github.com/repos/cockroachdb/cockroach
closed
roachtest: jepsen/g2/start-kill-2 failed
C-test-failure O-robot O-roachtest release-blocker branch-release-21.1
roachtest.jepsen/g2/start-kill-2 [failed](https://teamcity.cockroachdb.com/viewLog.html?buildId=3296758&tab=buildLog) with [artifacts](https://teamcity.cockroachdb.com/viewLog.html?buildId=3296758&tab=artifacts#/jepsen/g2/start-kill-2) on release-21.1 @ [0d7ae9fe48bee65304c051f73c2b09b8887d0922](https://github.com/cockroachdb/cockroach/commits/0d7ae9fe48bee65304c051f73c2b09b8887d0922): ``` The test failed on branch=release-21.1, cloud=gce: test artifacts and logs in: /home/agent/work/.go/src/github.com/cockroachdb/cockroach/artifacts/jepsen/g2/start-kill-2/run_1 cluster.go:2254,jepsen.go:88,jepsen.go:139,jepsen.go:347,test_runner.go:733: output in run_192227.483_n1-6_sh: /home/agent/work/.go/src/github.com/cockroachdb/cockroach/bin/roachprod run teamcity-3296758-1628835385-59-n6cpu4:1-6 -- sh -c "sudo apt-get -y upgrade -o Dpkg::Options::='--force-confold' > logs/apt-upgrade.log 2>&1" returned: context canceled (1) attached stack trace -- stack trace: | main.(*cluster).RunE | /home/agent/work/.go/src/github.com/cockroachdb/cockroach/pkg/cmd/roachtest/cluster.go:2332 | main.(*cluster).Run | /home/agent/work/.go/src/github.com/cockroachdb/cockroach/pkg/cmd/roachtest/cluster.go:2252 | main.initJepsen | /home/agent/work/.go/src/github.com/cockroachdb/cockroach/pkg/cmd/roachtest/jepsen.go:88 | main.runJepsen | /home/agent/work/.go/src/github.com/cockroachdb/cockroach/pkg/cmd/roachtest/jepsen.go:139 | main.registerJepsen.func1 | /home/agent/work/.go/src/github.com/cockroachdb/cockroach/pkg/cmd/roachtest/jepsen.go:347 | main.(*testRunner).runTest.func2 | /home/agent/work/.go/src/github.com/cockroachdb/cockroach/pkg/cmd/roachtest/test_runner.go:733 | runtime.goexit | /usr/local/go/src/runtime/asm_amd64.s:1371 Wraps: (2) output in run_192227.483_n1-6_sh Wraps: (3) /home/agent/work/.go/src/github.com/cockroachdb/cockroach/bin/roachprod run teamcity-3296758-1628835385-59-n6cpu4:1-6 -- sh -c "sudo apt-get -y upgrade -o Dpkg::Options::='--force-confold' > logs/apt-upgrade.log 2>&1" returned | stderr: | | stdout: | <... some data truncated by circular buffer; go to artifacts for details ...> | ........................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................ Wraps: (4) secondary error attachment | signal: interrupt | (1) signal: interrupt | Error types: (1) *exec.ExitError Wraps: (5) context canceled Error types: (1) *withstack.withStack (2) *errutil.withPrefix (3) *main.withCommandDetails (4) *secondary.withSecondaryError (5) *errors.errorString ``` <details><summary>Reproduce</summary> <p> <p>To reproduce, try: ```bash # From https://go.crdb.dev/p/roachstress, perhaps edited lightly. caffeinate ./roachstress.sh jepsen/g2/start-kill-2 ``` </p> </p> </details> /cc @cockroachdb/kv <sub> [This test on roachdash](https://roachdash.crdb.dev/?filter=status:open%20t:.*jepsen/g2/start-kill-2.*&sort=title+created&display=lastcommented+project) | [Improve this report!](https://github.com/cockroachdb/cockroach/tree/master/pkg/cmd/internal/issues) </sub>
2.0
roachtest: jepsen/g2/start-kill-2 failed - roachtest.jepsen/g2/start-kill-2 [failed](https://teamcity.cockroachdb.com/viewLog.html?buildId=3296758&tab=buildLog) with [artifacts](https://teamcity.cockroachdb.com/viewLog.html?buildId=3296758&tab=artifacts#/jepsen/g2/start-kill-2) on release-21.1 @ [0d7ae9fe48bee65304c051f73c2b09b8887d0922](https://github.com/cockroachdb/cockroach/commits/0d7ae9fe48bee65304c051f73c2b09b8887d0922): ``` The test failed on branch=release-21.1, cloud=gce: test artifacts and logs in: /home/agent/work/.go/src/github.com/cockroachdb/cockroach/artifacts/jepsen/g2/start-kill-2/run_1 cluster.go:2254,jepsen.go:88,jepsen.go:139,jepsen.go:347,test_runner.go:733: output in run_192227.483_n1-6_sh: /home/agent/work/.go/src/github.com/cockroachdb/cockroach/bin/roachprod run teamcity-3296758-1628835385-59-n6cpu4:1-6 -- sh -c "sudo apt-get -y upgrade -o Dpkg::Options::='--force-confold' > logs/apt-upgrade.log 2>&1" returned: context canceled (1) attached stack trace -- stack trace: | main.(*cluster).RunE | /home/agent/work/.go/src/github.com/cockroachdb/cockroach/pkg/cmd/roachtest/cluster.go:2332 | main.(*cluster).Run | /home/agent/work/.go/src/github.com/cockroachdb/cockroach/pkg/cmd/roachtest/cluster.go:2252 | main.initJepsen | /home/agent/work/.go/src/github.com/cockroachdb/cockroach/pkg/cmd/roachtest/jepsen.go:88 | main.runJepsen | /home/agent/work/.go/src/github.com/cockroachdb/cockroach/pkg/cmd/roachtest/jepsen.go:139 | main.registerJepsen.func1 | /home/agent/work/.go/src/github.com/cockroachdb/cockroach/pkg/cmd/roachtest/jepsen.go:347 | main.(*testRunner).runTest.func2 | /home/agent/work/.go/src/github.com/cockroachdb/cockroach/pkg/cmd/roachtest/test_runner.go:733 | runtime.goexit | /usr/local/go/src/runtime/asm_amd64.s:1371 Wraps: (2) output in run_192227.483_n1-6_sh Wraps: (3) /home/agent/work/.go/src/github.com/cockroachdb/cockroach/bin/roachprod run teamcity-3296758-1628835385-59-n6cpu4:1-6 -- sh -c "sudo apt-get -y upgrade -o Dpkg::Options::='--force-confold' > logs/apt-upgrade.log 2>&1" returned | stderr: | | stdout: | <... some data truncated by circular buffer; go to artifacts for details ...> | ........................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................ Wraps: (4) secondary error attachment | signal: interrupt | (1) signal: interrupt | Error types: (1) *exec.ExitError Wraps: (5) context canceled Error types: (1) *withstack.withStack (2) *errutil.withPrefix (3) *main.withCommandDetails (4) *secondary.withSecondaryError (5) *errors.errorString ``` <details><summary>Reproduce</summary> <p> <p>To reproduce, try: ```bash # From https://go.crdb.dev/p/roachstress, perhaps edited lightly. caffeinate ./roachstress.sh jepsen/g2/start-kill-2 ``` </p> </p> </details> /cc @cockroachdb/kv <sub> [This test on roachdash](https://roachdash.crdb.dev/?filter=status:open%20t:.*jepsen/g2/start-kill-2.*&sort=title+created&display=lastcommented+project) | [Improve this report!](https://github.com/cockroachdb/cockroach/tree/master/pkg/cmd/internal/issues) </sub>
non_usab
roachtest jepsen start kill failed roachtest jepsen start kill with on release the test failed on branch release cloud gce test artifacts and logs in home agent work go src github com cockroachdb cockroach artifacts jepsen start kill run cluster go jepsen go jepsen go jepsen go test runner go output in run sh home agent work go src github com cockroachdb cockroach bin roachprod run teamcity sh c sudo apt get y upgrade o dpkg options force confold logs apt upgrade log returned context canceled attached stack trace stack trace main cluster rune home agent work go src github com cockroachdb cockroach pkg cmd roachtest cluster go main cluster run home agent work go src github com cockroachdb cockroach pkg cmd roachtest cluster go main initjepsen home agent work go src github com cockroachdb cockroach pkg cmd roachtest jepsen go main runjepsen home agent work go src github com cockroachdb cockroach pkg cmd roachtest jepsen go main registerjepsen home agent work go src github com cockroachdb cockroach pkg cmd roachtest jepsen go main testrunner runtest home agent work go src github com cockroachdb cockroach pkg cmd roachtest test runner go runtime goexit usr local go src runtime asm s wraps output in run sh wraps home agent work go src github com cockroachdb cockroach bin roachprod run teamcity sh c sudo apt get y upgrade o dpkg options force confold logs apt upgrade log returned stderr stdout wraps secondary error attachment signal interrupt signal interrupt error types exec exiterror wraps context canceled error types withstack withstack errutil withprefix main withcommanddetails secondary withsecondaryerror errors errorstring reproduce to reproduce try bash from perhaps edited lightly caffeinate roachstress sh jepsen start kill cc cockroachdb kv
0
20,708
15,897,606,891
IssuesEvent
2021-04-11 21:46:54
andrewfstratton/quando
https://api.github.com/repos/andrewfstratton/quando
closed
Add link to nodeRED to hub
enhancement usability
Include link to :1880 - from Java (?) - check to see if it running? - maybe add start/stop/kill - or (probably) set this in the runtime configuration
True
Add link to nodeRED to hub - Include link to :1880 - from Java (?) - check to see if it running? - maybe add start/stop/kill - or (probably) set this in the runtime configuration
usab
add link to nodered to hub include link to from java check to see if it running maybe add start stop kill or probably set this in the runtime configuration
1
191,873
6,844,513,675
IssuesEvent
2017-11-13 02:01:08
Squalr/Squalr
https://api.github.com/repos/Squalr/Squalr
opened
Implement HexDecBox in WPF
feature low priority
Right now these are a control created via Windows Forms Hosting. In WPF this could be much more fancy -- ie some sort of visual indication when the value is in hex (ie a transparent box with the letter H on the very right side of the HexDecBox). We could also make the context menus match the style of the rest of the application. The XAML would be much more readable as well.
1.0
Implement HexDecBox in WPF - Right now these are a control created via Windows Forms Hosting. In WPF this could be much more fancy -- ie some sort of visual indication when the value is in hex (ie a transparent box with the letter H on the very right side of the HexDecBox). We could also make the context menus match the style of the rest of the application. The XAML would be much more readable as well.
non_usab
implement hexdecbox in wpf right now these are a control created via windows forms hosting in wpf this could be much more fancy ie some sort of visual indication when the value is in hex ie a transparent box with the letter h on the very right side of the hexdecbox we could also make the context menus match the style of the rest of the application the xaml would be much more readable as well
0
41,389
6,902,359,926
IssuesEvent
2017-11-25 19:37:23
time2backup/time2backup
https://api.github.com/repos/time2backup/time2backup
closed
Subdirectory created when restoring a deleted directory
documentation
If user restores a deleted directory, e.g. `test`, it will restore it in `test/test/` Note: path of the deleted directory is called in the command line.
1.0
Subdirectory created when restoring a deleted directory - If user restores a deleted directory, e.g. `test`, it will restore it in `test/test/` Note: path of the deleted directory is called in the command line.
non_usab
subdirectory created when restoring a deleted directory if user restores a deleted directory e g test it will restore it in test test note path of the deleted directory is called in the command line
0
446,912
31,563,611,320
IssuesEvent
2023-09-03 14:48:50
cabralpinto/modular-diffusion
https://api.github.com/repos/cabralpinto/modular-diffusion
opened
Fix issue on mobile where some documentation pages become wider than the screen
bug documentation help wanted
The pages in question are: - https://cabralpinto.github.io/modular-diffusion/guides/custom-modules/ - https://cabralpinto.github.io/modular-diffusion/modules/noise-type/ - https://cabralpinto.github.io/modular-diffusion/modules/loss-function/ The issue is caused by LaTeX expressions that don't fit smaller screens. I'm not quite sure how this could be fixed.
1.0
Fix issue on mobile where some documentation pages become wider than the screen - The pages in question are: - https://cabralpinto.github.io/modular-diffusion/guides/custom-modules/ - https://cabralpinto.github.io/modular-diffusion/modules/noise-type/ - https://cabralpinto.github.io/modular-diffusion/modules/loss-function/ The issue is caused by LaTeX expressions that don't fit smaller screens. I'm not quite sure how this could be fixed.
non_usab
fix issue on mobile where some documentation pages become wider than the screen the pages in question are the issue is caused by latex expressions that don t fit smaller screens i m not quite sure how this could be fixed
0
20,062
14,972,278,145
IssuesEvent
2021-01-27 22:33:32
DynEntTech/CBKnownIssues
https://api.github.com/repos/DynEntTech/CBKnownIssues
opened
Inventory, Build Assembly, no way to reverse it
CB 18.2 CB 21.0 Inventory Usability
If you have an assembly of 50 items, and you build the wrong assembly by accident, or, you get a return and want to put the items back into stock, there is no way to reverse an assembly build, except to enter 51 stock adjustments, One stock adjustment out (of whatever the build quantity is) and then 50 stock adjustments in (of whatever the build quantity was multiplied by the quantity needed of that item). Here is the official CB KB article documenting this: https://my.connectedbusiness.com/index.php?rp=/knowledgebase/104/How-to-Delete-or-Remove-Build-Assembly.html
True
Inventory, Build Assembly, no way to reverse it - If you have an assembly of 50 items, and you build the wrong assembly by accident, or, you get a return and want to put the items back into stock, there is no way to reverse an assembly build, except to enter 51 stock adjustments, One stock adjustment out (of whatever the build quantity is) and then 50 stock adjustments in (of whatever the build quantity was multiplied by the quantity needed of that item). Here is the official CB KB article documenting this: https://my.connectedbusiness.com/index.php?rp=/knowledgebase/104/How-to-Delete-or-Remove-Build-Assembly.html
usab
inventory build assembly no way to reverse it if you have an assembly of items and you build the wrong assembly by accident or you get a return and want to put the items back into stock there is no way to reverse an assembly build except to enter stock adjustments one stock adjustment out of whatever the build quantity is and then stock adjustments in of whatever the build quantity was multiplied by the quantity needed of that item here is the official cb kb article documenting this
1
11,058
7,045,408,515
IssuesEvent
2018-01-01 19:18:17
FAForever/client
https://api.github.com/repos/FAForever/client
closed
Client needs a "beta version track"
enhancement usability
It is currently hard to get out test / beta version of the client to users without potentially breaking their shit because the only real way to get lots of users to use a new client release is to release it as a release version, forcing everyone to update and preventing them from running old client versions. There should be a "beta track" or "beta release" mechanism that prompts users to upgrade to beta versions (with opt-in / opt-out setting?) without stopping them from downgrading in case of bugs. This change would have to be supported by the server.
True
Client needs a "beta version track" - It is currently hard to get out test / beta version of the client to users without potentially breaking their shit because the only real way to get lots of users to use a new client release is to release it as a release version, forcing everyone to update and preventing them from running old client versions. There should be a "beta track" or "beta release" mechanism that prompts users to upgrade to beta versions (with opt-in / opt-out setting?) without stopping them from downgrading in case of bugs. This change would have to be supported by the server.
usab
client needs a beta version track it is currently hard to get out test beta version of the client to users without potentially breaking their shit because the only real way to get lots of users to use a new client release is to release it as a release version forcing everyone to update and preventing them from running old client versions there should be a beta track or beta release mechanism that prompts users to upgrade to beta versions with opt in opt out setting without stopping them from downgrading in case of bugs this change would have to be supported by the server
1
233,304
18,959,484,969
IssuesEvent
2021-11-19 01:39:38
tracer-protocol/perpetual-pools-contracts
https://api.github.com/repos/tracer-protocol/perpetual-pools-contracts
opened
Write up enforced structure of test suite
triage tests
- Currently the test suit is disorganised. - We need to settle on a rigorous way to organise and structure the test. - We then need to refactor tests - We then need to collectively enforce it
1.0
Write up enforced structure of test suite - - Currently the test suit is disorganised. - We need to settle on a rigorous way to organise and structure the test. - We then need to refactor tests - We then need to collectively enforce it
non_usab
write up enforced structure of test suite currently the test suit is disorganised we need to settle on a rigorous way to organise and structure the test we then need to refactor tests we then need to collectively enforce it
0
11,281
7,135,404,677
IssuesEvent
2018-01-23 00:54:49
postmarketOS/pmbootstrap
https://api.github.com/repos/postmarketOS/pmbootstrap
closed
Check if it is possible to create device nodes in the work folder
enhancement hacktoberfest help wanted pmbootstrap usability
We've had a few cases where the device node creation did not work. It would be nice if we check that during `pmbootstrap init` and tell the user what could cause this and what to do about it (use another location, link to a wiki page with details). Possible causes: * Shared VM folder * NTFS folder? * Old versions of ecryptfs * maybe encfs
True
Check if it is possible to create device nodes in the work folder - We've had a few cases where the device node creation did not work. It would be nice if we check that during `pmbootstrap init` and tell the user what could cause this and what to do about it (use another location, link to a wiki page with details). Possible causes: * Shared VM folder * NTFS folder? * Old versions of ecryptfs * maybe encfs
usab
check if it is possible to create device nodes in the work folder we ve had a few cases where the device node creation did not work it would be nice if we check that during pmbootstrap init and tell the user what could cause this and what to do about it use another location link to a wiki page with details possible causes shared vm folder ntfs folder old versions of ecryptfs maybe encfs
1
26,749
27,147,900,790
IssuesEvent
2023-02-16 21:38:38
informalsystems/quint
https://api.github.com/repos/informalsystems/quint
opened
Vintage syntax highlighting
usability
Since we have low chances for having Github syntax highlighting for Quint in the near future, we could automatically produce a markdown page that uses the restricted markup. I have managed to produce something that looks like syntax highlighting in the old textbooks. Very vintage :) **module**&nbsp;HighlightJS&nbsp;**{** &nbsp;&nbsp;**//**&nbsp;<code>single-line comments</code> &nbsp;&nbsp;**type**&nbsp;Addr&nbsp;**=**&nbsp;<ins>str</ins> &nbsp;&nbsp;**pure**&nbsp;**val**&nbsp;MAX_UINT **=** <code>2</code>**^**<code>256</code> **-** <code>1</code> &nbsp;&nbsp;**const**&nbsp;ADDR:&nbsp;<ins>Set</ins>[Addr] &nbsp;&nbsp;**action**&nbsp;init:&nbsp;<ins>bool</ins> = { &nbsp;&nbsp;&nbsp;&nbsp;**nondet**&nbsp;sender&nbsp; = &nbsp;oneOf(ADDR) &nbsp;&nbsp;&nbsp;&nbsp;**all**&nbsp; { &nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;balances'&nbsp;**=**&nbsp;ADDR.mapBy(a &gt; <code>0</code>) &nbsp;&nbsp;&nbsp;&nbsp;} &nbsp;&nbsp;} &nbsp;&nbsp;**run**&nbsp;mintSendTest&nbsp;=&nbsp;{ &nbsp;&nbsp;&nbsp;&nbsp;init.then(mint(<code>"alice"</code>, <code>"bob"</code>, <code>10</code>)) &nbsp;&nbsp;} **}**
True
Vintage syntax highlighting - Since we have low chances for having Github syntax highlighting for Quint in the near future, we could automatically produce a markdown page that uses the restricted markup. I have managed to produce something that looks like syntax highlighting in the old textbooks. Very vintage :) **module**&nbsp;HighlightJS&nbsp;**{** &nbsp;&nbsp;**//**&nbsp;<code>single-line comments</code> &nbsp;&nbsp;**type**&nbsp;Addr&nbsp;**=**&nbsp;<ins>str</ins> &nbsp;&nbsp;**pure**&nbsp;**val**&nbsp;MAX_UINT **=** <code>2</code>**^**<code>256</code> **-** <code>1</code> &nbsp;&nbsp;**const**&nbsp;ADDR:&nbsp;<ins>Set</ins>[Addr] &nbsp;&nbsp;**action**&nbsp;init:&nbsp;<ins>bool</ins> = { &nbsp;&nbsp;&nbsp;&nbsp;**nondet**&nbsp;sender&nbsp; = &nbsp;oneOf(ADDR) &nbsp;&nbsp;&nbsp;&nbsp;**all**&nbsp; { &nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;balances'&nbsp;**=**&nbsp;ADDR.mapBy(a &gt; <code>0</code>) &nbsp;&nbsp;&nbsp;&nbsp;} &nbsp;&nbsp;} &nbsp;&nbsp;**run**&nbsp;mintSendTest&nbsp;=&nbsp;{ &nbsp;&nbsp;&nbsp;&nbsp;init.then(mint(<code>"alice"</code>, <code>"bob"</code>, <code>10</code>)) &nbsp;&nbsp;} **}**
usab
vintage syntax highlighting since we have low chances for having github syntax highlighting for quint in the near future we could automatically produce a markdown page that uses the restricted markup i have managed to produce something that looks like syntax highlighting in the old textbooks very vintage module nbsp highlightjs nbsp nbsp nbsp nbsp single line comments nbsp nbsp type nbsp addr nbsp nbsp str nbsp nbsp pure nbsp val nbsp max uint nbsp nbsp const nbsp addr nbsp set nbsp nbsp action nbsp init nbsp bool nbsp nbsp nbsp nbsp nondet nbsp sender nbsp nbsp oneof addr nbsp nbsp nbsp nbsp all nbsp nbsp nbsp nbsp nbsp nbsp nbsp balances nbsp nbsp addr mapby a gt nbsp nbsp nbsp nbsp nbsp nbsp nbsp nbsp run nbsp mintsendtest nbsp nbsp nbsp nbsp nbsp nbsp init then mint alice bob nbsp nbsp
1
181,112
14,852,307,163
IssuesEvent
2021-01-18 08:24:21
gbv/login-server
https://api.github.com/repos/gbv/login-server
opened
Add document/page with detailed explanation about Login Server
documentation
From #27: > Add an end-user friendly help page that explains the purpose and functionality of login-server and single-sign-on with basic concepts such as > > Identity providers (e.g. ORCID) > Identities > Client applications (Cocoda) > Sessions > Service providers (e.g. jskos-server, Wikidata-write-access...) > API Either a document inside the repo (that could then be linked from the /help page), or a separate sub-page.
1.0
Add document/page with detailed explanation about Login Server - From #27: > Add an end-user friendly help page that explains the purpose and functionality of login-server and single-sign-on with basic concepts such as > > Identity providers (e.g. ORCID) > Identities > Client applications (Cocoda) > Sessions > Service providers (e.g. jskos-server, Wikidata-write-access...) > API Either a document inside the repo (that could then be linked from the /help page), or a separate sub-page.
non_usab
add document page with detailed explanation about login server from add an end user friendly help page that explains the purpose and functionality of login server and single sign on with basic concepts such as identity providers e g orcid identities client applications cocoda sessions service providers e g jskos server wikidata write access api either a document inside the repo that could then be linked from the help page or a separate sub page
0
13,612
8,615,023,577
IssuesEvent
2018-11-19 19:16:14
coreos/ignition
https://api.github.com/repos/coreos/ignition
closed
Using noCreateHome causes boot to fail
area/usability kind/friction
# Bug # noCreateHome causes boot to fail. If noCreateHome is removed from ignition file, the system boots as expected. Could this be because the ignition file is trying to write ssh keys but there is no home directory created? ## Operating System Version ## RHCOS maipo 47.94 ## Ignition Version ## ignition-0.28.0-10.gitf707912.el7.x86_64 ## Environment ## libvirt ## Expected Behavior ## no home directory is created when the user is created through ignition and system boots ## Actual Behavior ## system does not get booted when noCreateHome is used ## Reproduction Steps ## 1. Use the following ignition file ``` { "ignition": { "config": {}, "timeouts": {}, "version": "2.1.0" }, "networkd": {}, "passwd": { "users": [ { "groups": [ "wheel" ], "name": "user1", "noCreateHome": true, "sshAuthorizedKeys": [ "ssh-rsa ..." ], "shell": "/bin/bash" } ] }, "storage": {}, "systemd": {} } ``` 2. `sudo virt-install -n rhcos1 --vcpus 2 -r 4096 --os-type=linux --import --network network=default --disk=redhat-coreos-maipo-47.94-qemu.qcow2 --noautoconsole --qemu-commandline="-fw_cfg name=opt/com.coreos/config,file=/path/to/ign.ign` ## Other Information ## ``` [ 70.249840] systemd[1]: Starting dracut cmdline hook... [ 70.250150] systemd[1]: Starting Ignition (files)... 
[ 70.254011] ignition[20829]: INFO : Ignition 0.28.0 [ 70.254366] ignition[20829]: INFO : reading system config file "/usr/lib/ignition/base.ign" [ 70.254586] ignition[20829]: INFO : no config at "/usr/lib/ignition/base.ign" [ 70.257693] ignition[20829]: INFO : files: createUsers: op(1): [started] creating or modifying user "user1" [ 70.257975] ignition[20829]: DEBUG : files: createUsers: op(1): executing: "/usr/sbin/usermod" "--root" "/sysroot" "--groups" "wheel" "--shell" "/bin/bash" "user1" [ 70.262619] ignition[20829]: INFO : files: createUsers: op(1): [finished] creating or modifying user "user1" [ 70.262921] ignition[20829]: INFO : files: createUsers: op(2): [started] adding ssh keys to user "user1" [ 70.263682] ignition[20829]: CRITICAL : files: createUsers: op(2): [failed] adding ssh keys to user "user1": no such file or directory [ 70.263903] ignition[20829]: files failedCRITICAL : Ignition failed: failed to create users/groups: failed to create users: failed to add keys to user "user1": no such file or directory Starting Ignition (files)... [ OK ] Reached target Paths. [ 70.330037] systemd[1]: Reached target Paths. Starting Apply K[ 70.335278] systemd[1]: Starting Paths. ernel Variables... [ OK [ 70.338369] systemd[1]: Starting Apply Kernel Variables... ] Started Create list of required sta...ce nodes for the current kernel. [FAILED] Failed to start Ignition (files). See 'systemctl status ignition-files.service' for details. ```
True
Using noCreateHome causes boot to fail - # Bug # noCreateHome causes boot to fail. If noCreateHome is removed from ignition file, the system boots as expected. Could this be because the ignition file is trying to write ssh keys but there is no home directory created? ## Operating System Version ## RHCOS maipo 47.94 ## Ignition Version ## ignition-0.28.0-10.gitf707912.el7.x86_64 ## Environment ## libvirt ## Expected Behavior ## no home directory is created when the user is created through ignition and system boots ## Actual Behavior ## system does not get booted when noCreateHome is used ## Reproduction Steps ## 1. Use the following ignition file ``` { "ignition": { "config": {}, "timeouts": {}, "version": "2.1.0" }, "networkd": {}, "passwd": { "users": [ { "groups": [ "wheel" ], "name": "user1", "noCreateHome": true, "sshAuthorizedKeys": [ "ssh-rsa ..." ], "shell": "/bin/bash" } ] }, "storage": {}, "systemd": {} } ``` 2. `sudo virt-install -n rhcos1 --vcpus 2 -r 4096 --os-type=linux --import --network network=default --disk=redhat-coreos-maipo-47.94-qemu.qcow2 --noautoconsole --qemu-commandline="-fw_cfg name=opt/com.coreos/config,file=/path/to/ign.ign` ## Other Information ## ``` [ 70.249840] systemd[1]: Starting dracut cmdline hook... [ 70.250150] systemd[1]: Starting Ignition (files)... 
[ 70.254011] ignition[20829]: INFO : Ignition 0.28.0 [ 70.254366] ignition[20829]: INFO : reading system config file "/usr/lib/ignition/base.ign" [ 70.254586] ignition[20829]: INFO : no config at "/usr/lib/ignition/base.ign" [ 70.257693] ignition[20829]: INFO : files: createUsers: op(1): [started] creating or modifying user "user1" [ 70.257975] ignition[20829]: DEBUG : files: createUsers: op(1): executing: "/usr/sbin/usermod" "--root" "/sysroot" "--groups" "wheel" "--shell" "/bin/bash" "user1" [ 70.262619] ignition[20829]: INFO : files: createUsers: op(1): [finished] creating or modifying user "user1" [ 70.262921] ignition[20829]: INFO : files: createUsers: op(2): [started] adding ssh keys to user "user1" [ 70.263682] ignition[20829]: CRITICAL : files: createUsers: op(2): [failed] adding ssh keys to user "user1": no such file or directory [ 70.263903] ignition[20829]: files failedCRITICAL : Ignition failed: failed to create users/groups: failed to create users: failed to add keys to user "user1": no such file or directory Starting Ignition (files)... [ OK ] Reached target Paths. [ 70.330037] systemd[1]: Reached target Paths. Starting Apply K[ 70.335278] systemd[1]: Starting Paths. ernel Variables... [ OK [ 70.338369] systemd[1]: Starting Apply Kernel Variables... ] Started Create list of required sta...ce nodes for the current kernel. [FAILED] Failed to start Ignition (files). See 'systemctl status ignition-files.service' for details. ```
usab
using nocreatehome causes boot to fail bug nocreatehome causes boot to fail if nocreatehome is removed from ignition file the system boots as expected could this be because the ignition file is trying to write ssh keys but there is no home directory created operating system version rhcos maipo ignition version ignition environment libvirt expected behavior no home directory is created when the user is created through ignition and system boots actual behavior system does not get booted when nocreatehome is used reproduction steps use the following ignition file ignition config timeouts version networkd passwd users groups wheel name nocreatehome true sshauthorizedkeys ssh rsa shell bin bash storage systemd sudo virt install n vcpus r os type linux import network network default disk redhat coreos maipo qemu noautoconsole qemu commandline fw cfg name opt com coreos config file path to ign ign other information systemd starting dracut cmdline hook systemd starting ignition files ignition info ignition ignition info reading system config file usr lib ignition base ign ignition info no config at usr lib ignition base ign ignition info files createusers op creating or modifying user ignition debug files createusers op executing usr sbin usermod root sysroot groups wheel shell bin bash ignition info files createusers op creating or modifying user ignition info files createusers op adding ssh keys to user ignition critical files createusers op adding ssh keys to user no such file or directory ignition files failedcritical ignition failed failed to create users groups failed to create users failed to add keys to user no such file or directory starting ignition files reached target paths systemd reached target paths starting apply k systemd starting paths ernel variables systemd starting apply kernel variables started create list of required sta ce nodes for the current kernel failed to start ignition files see systemctl status ignition files service for details
1
7,610
5,094,889,767
IssuesEvent
2017-01-03 13:22:34
ocaml-doc/odoc
https://api.github.com/repos/ocaml-doc/odoc
closed
Lost doc string
bug doc-usability important
Quite surprisingly, as far as I know this is not an attachement problem: http://docs.mirage.io/react/React/E/index.html#val-select http://erratique.ch/software/react/doc/React.E.html#VALselect The source is here https://github.com/dbuenzli/react/blob/1a4a86e4fa8d67bd59c77cc3180a1718fdb84b18/src/react.mli#L201-L208
True
Lost doc string - Quite surprisingly, as far as I know this is not an attachement problem: http://docs.mirage.io/react/React/E/index.html#val-select http://erratique.ch/software/react/doc/React.E.html#VALselect The source is here https://github.com/dbuenzli/react/blob/1a4a86e4fa8d67bd59c77cc3180a1718fdb84b18/src/react.mli#L201-L208
usab
lost doc string quite surprisingly as far as i know this is not an attachement problem the source is here
1
11,027
7,033,062,282
IssuesEvent
2017-12-27 08:39:11
bwsw/cloudstack-ui
https://api.github.com/repos/bwsw/cloudstack-ui
closed
Extra space appears when grouping VM by colors
bug statistics bug: usability
Steps to reproduce: 1. Go to VM list 2. Click group by colors Expected result: VMs are grouped Actual result: VMs are grouped, extra space appears above the list - list moves down a little bit **Connected feature ID:** _vm_list_groupby_
True
Extra space appears when grouping VM by colors - Steps to reproduce: 1. Go to VM list 2. Click group by colors Expected result: VMs are grouped Actual result: VMs are grouped, extra space appears above the list - list moves down a little bit **Connected feature ID:** _vm_list_groupby_
usab
extra space appears when grouping vm by colors steps to reproduce go to vm list click group by colors expected result vms are grouped actual result vms are grouped extra space appears above the list list moves down a little bit connected feature id vm list groupby
1
330,885
10,056,951,157
IssuesEvent
2019-07-22 10:21:08
svof/svof
https://api.github.com/repos/svof/svof
closed
Having devil tarot on keepup might spam out in Aeon/Retardation
enhancement low priority medium difficulty stale up for grabs
Devil tarot defup uses sendAll. This may be dangerous in slowcuring situations. A possible alternative would be to add tarot cards to precache. See https://github.com/svof/svof/pull/174#issuecomment-209736556
1.0
Having devil tarot on keepup might spam out in Aeon/Retardation - Devil tarot defup uses sendAll. This may be dangerous in slowcuring situations. A possible alternative would be to add tarot cards to precache. See https://github.com/svof/svof/pull/174#issuecomment-209736556
non_usab
having devil tarot on keepup might spam out in aeon retardation devil tarot defup uses sendall this may be dangerous in slowcuring situations a possible alternative would be to add tarot cards to precache see
0
11,082
27,975,046,959
IssuesEvent
2023-03-25 13:28:23
PandaHugMonster/php-simputils
https://api.github.com/repos/PandaHugMonster/php-simputils
closed
New models "Email" and "Phone"
idea architecture model
Could be useful to create a few new models like "Email" and "Phone" and corresponding Normalization/Validation - [ ] `Email` - [ ] `Phone`
1.0
New models "Email" and "Phone" - Could be useful to create a few new models like "Email" and "Phone" and corresponding Normalization/Validation - [ ] `Email` - [ ] `Phone`
non_usab
new models email and phone could be useful to create a few new models like email and phone and corresponding normalization validation email phone
0
15,418
10,013,993,785
IssuesEvent
2019-07-15 16:23:53
rabbitmq/rabbitmq-server
https://api.github.com/repos/rabbitmq/rabbitmq-server
closed
Add more information on node health check timeout
enhancement usability
In [this thread](https://groups.google.com/forum/#!topic/rabbitmq-users/-vrONeNLuE8) user was confused by timeout in node health check. We should provide more information about what operation timed out and possible reasons.
True
Add more information on node health check timeout - In [this thread](https://groups.google.com/forum/#!topic/rabbitmq-users/-vrONeNLuE8) user was confused by timeout in node health check. We should provide more information about what operation timed out and possible reasons.
usab
add more information on node health check timeout in user was confused by timeout in node health check we should provide more information about what operation timed out and possible reasons
1
27,547
29,510,202,622
IssuesEvent
2023-06-03 20:47:00
tailscale/tailscale
https://api.github.com/repos/tailscale/tailscale
closed
Android cannot connect to exit node
OS-android L1 Very few P2 Aggravating T5 Usability exit-node bug
### What is the issue? After connecting to an exit node (both win & Linux exit node as server tried), all traffic died. I can't visit any website or ping my devices. But after disabling exit node, all works fine. Subnet (as client) function can work well on this devices (Linux as subnet server) without turning on exit node. ### Steps to reproduce Turn on vpn, turn on exit node and choose a server. ### Are there any recent changes that introduced the issue? No, I've tried a fresh reinstall but it didn't help. Meanwhile, other devices (win as client) can use exit node servers without problem. ### OS Android ### OS version Android12, MIUI13 ### Tailscale version 1.30.2 ### Bug report BUG-668690cb86a2d54ffe8adeffb0075777d71f1b22edcb5300fd7e92052d1aa2f8-20220926102628Z-371b958f02d8acec
True
Android cannot connect to exit node - ### What is the issue? After connecting to an exit node (both win & Linux exit node as server tried), all traffic died. I can't visit any website or ping my devices. But after disabling exit node, all works fine. Subnet (as client) function can work well on this devices (Linux as subnet server) without turning on exit node. ### Steps to reproduce Turn on vpn, turn on exit node and choose a server. ### Are there any recent changes that introduced the issue? No, I've tried a fresh reinstall but it didn't help. Meanwhile, other devices (win as client) can use exit node servers without problem. ### OS Android ### OS version Android12, MIUI13 ### Tailscale version 1.30.2 ### Bug report BUG-668690cb86a2d54ffe8adeffb0075777d71f1b22edcb5300fd7e92052d1aa2f8-20220926102628Z-371b958f02d8acec
usab
android cannot connect to exit node what is the issue after connecting to an exit node both win linux exit node as server tried all traffic died i can t visit any website or ping my devices but after disabling exit node all works fine subnet as client function can work well on this devices linux as subnet server without turning on exit node steps to reproduce turn on vpn turn on exit node and choose a server are there any recent changes that introduced the issue no i ve tried a fresh reinstall but it didn t help meanwhile other devices win as client can use exit node servers without problem os android os version tailscale version bug report bug
1
3,257
3,384,714,272
IssuesEvent
2015-11-27 06:13:50
Virtual-Labs/problem-solving-iiith
https://api.github.com/repos/Virtual-Labs/problem-solving-iiith
opened
QA_Advanced Arithmatic_I_85
26-11-2015 IIIT Hyd Linux Open Release Number S2 Usability Version Number Windows 7
Defect Description: In the introduction page of " Advanced Arithmatic " experiment,the alignment of the feedback is not clear instead feedback link should be a header option like introduction and theory. Actual Result: In the introduction page of " Advanced Arithmatic " experiment,the alignment of the feedback link is not clear. Test Step Link: ![2](https://cloud.githubusercontent.com/assets/14869397/11435058/0b83d4ce-94fc-11e5-932c-6882e20fd841.png)
True
QA_Advanced Arithmatic_I_85 - Defect Description: In the introduction page of " Advanced Arithmatic " experiment,the alignment of the feedback is not clear instead feedback link should be a header option like introduction and theory. Actual Result: In the introduction page of " Advanced Arithmatic " experiment,the alignment of the feedback link is not clear. Test Step Link: ![2](https://cloud.githubusercontent.com/assets/14869397/11435058/0b83d4ce-94fc-11e5-932c-6882e20fd841.png)
usab
qa advanced arithmatic i defect description in the introduction page of advanced arithmatic experiment the alignment of the feedback is not clear instead feedback link should be a header option like introduction and theory actual result in the introduction page of advanced arithmatic experiment the alignment of the feedback link is not clear test step link
1
17,608
12,193,682,708
IssuesEvent
2020-04-29 14:45:01
solo-io/gloo
https://api.github.com/repos/solo-io/gloo
closed
Add loadBalancerIP field to knative-external-proxy
Area: Helm Area: Usability Type: Enhancement
The `gateway-proxy` can be configured with a `loadBalancerIP` (see: https://github.com/solo-io/gloo/blob/master/install/helm/gloo/templates/8-gateway-proxy-service.yaml#L52). It would be nice to add the same type of configuration to the `knative-external-proxy`. It currently does not allow you to define a `loadBalancerIP` (see: https://github.com/solo-io/gloo/blob/master/install/helm/gloo/templates/28-knative-external-proxy-service.yaml), you have to manually patch/apply this after deployment. Why would this be useful? If you have a static IP address (lets say, from a cloud provider), it is handy to configure a proxy service with this IP address, instead of getting a dynamic allocated IP address. I can try to contribute and submit a merge request if you think this is a good idea.
True
Add loadBalancerIP field to knative-external-proxy - The `gateway-proxy` can be configured with a `loadBalancerIP` (see: https://github.com/solo-io/gloo/blob/master/install/helm/gloo/templates/8-gateway-proxy-service.yaml#L52). It would be nice to add the same type of configuration to the `knative-external-proxy`. It currently does not allow you to define a `loadBalancerIP` (see: https://github.com/solo-io/gloo/blob/master/install/helm/gloo/templates/28-knative-external-proxy-service.yaml), you have to manually patch/apply this after deployment. Why would this be useful? If you have a static IP address (lets say, from a cloud provider), it is handy to configure a proxy service with this IP address, instead of getting a dynamic allocated IP address. I can try to contribute and submit a merge request if you think this is a good idea.
usab
add loadbalancerip field to knative external proxy the gateway proxy can be configured with a loadbalancerip see it would be nice to add the same type of configuration to the knative external proxy it currently does not allow you to define a loadbalancerip see you have to manually patch apply this after deployment why would this be useful if you have a static ip address lets say from a cloud provider it is handy to configure a proxy service with this ip address instead of getting a dynamic allocated ip address i can try to contribute and submit a merge request if you think this is a good idea
1
21,921
18,074,009,442
IssuesEvent
2021-09-21 07:48:30
godotengine/godot
https://api.github.com/repos/godotengine/godot
closed
Tab key can't be used in editor shortcuts.
bug topic:editor usability
**Operating system or device - Godot version:** Linux - Master **Issue description:** When pressing an Editor shortcut that involves the `Tab` key, Godot will focus the next `control` instead of activating the shortcut. Easily testable with the default "next/prev scene" shortcut. **Steps to reproduce:** - Have 2 or more scenes open. - Press `ctrl + tab` or `ctrl + shift + tab` - ... - Don't Profit :(
True
Tab key can't be used in editor shortcuts. - **Operating system or device - Godot version:** Linux - Master **Issue description:** When pressing an Editor shortcut that involves the `Tab` key, Godot will focus the next `control` instead of activating the shortcut. Easily testable with the default "next/prev scene" shortcut. **Steps to reproduce:** - Have 2 or more scenes open. - Press `ctrl + tab` or `ctrl + shift + tab` - ... - Don't Profit :(
usab
tab key can t be used in editor shortcuts operating system or device godot version linux master issue description when pressing an editor shortcut that involves the tab key godot will focus the next control instead of activating the shortcut easily testable with the default next prev scene shortcut steps to reproduce have or more scenes open press ctrl tab or ctrl shift tab don t profit
1
225,065
7,477,522,090
IssuesEvent
2018-04-04 08:35:47
HGustavs/LenaSYS
https://api.github.com/repos/HGustavs/LenaSYS
closed
Submissions of duggas causes problems when non-ascii characters are used
Grupp 3 (2018) Grupp 3 (2018) Dugga-Editor lowPriority
When submitting files through the dugga system, i.e., using the generic-file-receive-dugga we get a problem with malformed file paths if we have Swedish characters in the filename or file path. - **min_fil.zip** works - **min_låda.zip** does not work
1.0
Submissions of duggas causes problems when non-ascii characters are used - When submitting files through the dugga system, i.e., using the generic-file-receive-dugga we get a problem with malformed file paths if we have Swedish characters in the filename or file path. - **min_fil.zip** works - **min_låda.zip** does not work
non_usab
submissions of duggas causes problems when non ascii characters are used when submitting files through the dugga system i e using the generic file receive dugga we get a problem with malformed file paths if we have swedish characters in the filename or file path min fil zip works min låda zip does not work
0
129,393
12,406,363,698
IssuesEvent
2020-05-21 19:00:05
GreenleafLab/ArchR
https://api.github.com/repos/GreenleafLab/ArchR
closed
Typo in intro to Chapter 4 (Dimensionality Reduction)
documentation
The last paragraph of the Dimensionality Reduction intro has "features" misspelled as "featres".
1.0
Typo in intro to Chapter 4 (Dimensionality Reduction) - The last paragraph of the Dimensionality Reduction intro has "features" misspelled as "featres".
non_usab
typo in intro to chapter dimensionality reduction the last paragraph of the dimensionality reduction intro has features misspelled as featres
0
19,948
14,788,648,377
IssuesEvent
2021-01-12 09:30:35
openscd/open-scd
https://api.github.com/repos/openscd/open-scd
closed
Add complex actions
enhancement usability
The log easily gets flooded with many individually un- and redoable entries when a single user action causes many simple editor actions (`Create`, `Update`, `Move`, or `Delete`) to happen at once. A complex editor action type which consists in an array of simple editor actions that are handled and committed to the log as a single un- and redoable entry is intended to help alleviate this issue.
True
Add complex actions - The log easily gets flooded with many individually un- and redoable entries when a single user action causes many simple editor actions (`Create`, `Update`, `Move`, or `Delete`) to happen at once. A complex editor action type which consists in an array of simple editor actions that are handled and committed to the log as a single un- and redoable entry is intended to help alleviate this issue.
usab
add complex actions the log easily gets flooded with many individually un and redoable entries when a single user action causes many simple editor actions create update move or delete to happen at once a complex editor action type which consists in an array of simple editor actions that are handled and committed to the log as a single un and redoable entry is intended to help alleviate this issue
1
36,374
7,919,900,083
IssuesEvent
2018-07-04 19:44:24
Automattic/wp-calypso
https://api.github.com/repos/Automattic/wp-calypso
closed
Unable to renew domains from Purchases page
Domains Purchases [Type] Defect
Unable to renew domains from https://wordpress.com/me/purchases Clicking "Renew Now" does nothing. #### Steps to reproduce 1. Starting at URL: wordpress.com/me/purchases 2. Click on the expired domain 3. Click on "Renew Now" #### What I expected Get to checkout. #### What happened instead Clicking Renew does nothing #### Browser / OS version Tested on Chrome and Safari. #### Screenshot / Video Clicking Renew Now does nothing: <img width="680" alt="screen shot 2018-07-04 at 11 09 45 pm" src="https://user-images.githubusercontent.com/6505293/42290549-5f4bf818-7fdf-11e8-9865-b37ceadc57fc.png"> AND.... Shows this error in JavaScript console: **Safari:** <img width="363" alt="screen shot 2018-07-04 at 11 11 04 pm" src="https://user-images.githubusercontent.com/6505293/42290575-8d942cfe-7fdf-11e8-811f-6e93f92ead9b.png"> and **Chrome:** <img width="535" alt="screen shot 2018-07-04 at 11 10 48 pm" src="https://user-images.githubusercontent.com/6505293/42290603-c392b3f2-7fdf-11e8-8291-7d83985a0334.png"> #### Context / Source Did #manual-testing after a user was having trouble renewing #user-report .
1.0
Unable to renew domains from Purchases page - Unable to renew domains from https://wordpress.com/me/purchases Clicking "Renew Now" does nothing. #### Steps to reproduce 1. Starting at URL: wordpress.com/me/purchases 2. Click on the expired domain 3. Click on "Renew Now" #### What I expected Get to checkout. #### What happened instead Clicking Renew does nothing #### Browser / OS version Tested on Chrome and Safari. #### Screenshot / Video Clicking Renew Now does nothing: <img width="680" alt="screen shot 2018-07-04 at 11 09 45 pm" src="https://user-images.githubusercontent.com/6505293/42290549-5f4bf818-7fdf-11e8-9865-b37ceadc57fc.png"> AND.... Shows this error in JavaScript console: **Safari:** <img width="363" alt="screen shot 2018-07-04 at 11 11 04 pm" src="https://user-images.githubusercontent.com/6505293/42290575-8d942cfe-7fdf-11e8-811f-6e93f92ead9b.png"> and **Chrome:** <img width="535" alt="screen shot 2018-07-04 at 11 10 48 pm" src="https://user-images.githubusercontent.com/6505293/42290603-c392b3f2-7fdf-11e8-8291-7d83985a0334.png"> #### Context / Source Did #manual-testing after a user was having trouble renewing #user-report .
non_usab
unable to renew domains from purchases page unable to renew domains from clicking renew now does nothing steps to reproduce starting at url wordpress com me purchases click on the expired domain click on renew now what i expected get to checkout what happened instead clicking renew does nothing browser os version tested on chrome and safari screenshot video clicking renew now does nothing img width alt screen shot at pm src and shows this error in javascript console safari img width alt screen shot at pm src and chrome img width alt screen shot at pm src context source did manual testing after a user was having trouble renewing user report
0
75,480
25,866,911,546
IssuesEvent
2022-12-13 21:46:29
vector-im/element-web
https://api.github.com/repos/vector-im/element-web
closed
"People" metaspace shows favourite rooms
T-Defect S-Major A-Room-List A-Spaces O-Uncommon
### Steps to reproduce 1. Click on "People" metaspace ### Outcome #### What did you expect? See only people listed in the left panel #### What happened instead? See rooms under "Favourites" listed in the left panel ![Screenshot from 2022-11-28 09-56-04](https://user-images.githubusercontent.com/51663/204248792-e37e68cf-e377-416e-95f0-ecf72f460505.png) ### Operating system _No response_ ### Browser information Chromium 107.0.5304.121 (Official Build) Arch Linux (64-bit) ### URL for webapp develop.element.io ### Application version Element version: cbf5c43ae79d-react-569a36493378-js-b318a77ecef1 Olm version: 3.2.12 ### Homeserver matrix.org ### Will you send logs? No
1.0
"People" metaspace shows favourite rooms - ### Steps to reproduce 1. Click on "People" metaspace ### Outcome #### What did you expect? See only people listed in the left panel #### What happened instead? See rooms under "Favourites" listed in the left panel ![Screenshot from 2022-11-28 09-56-04](https://user-images.githubusercontent.com/51663/204248792-e37e68cf-e377-416e-95f0-ecf72f460505.png) ### Operating system _No response_ ### Browser information Chromium 107.0.5304.121 (Official Build) Arch Linux (64-bit) ### URL for webapp develop.element.io ### Application version Element version: cbf5c43ae79d-react-569a36493378-js-b318a77ecef1 Olm version: 3.2.12 ### Homeserver matrix.org ### Will you send logs? No
non_usab
people metaspace shows favourite rooms steps to reproduce click on people metaspace outcome what did you expect see only people listed in the left panel what happened instead see rooms under favourites listed in the left panel operating system no response browser information chromium official build arch linux bit url for webapp develop element io application version element version react js olm version homeserver matrix org will you send logs no
0
11,449
7,243,052,418
IssuesEvent
2018-02-14 10:17:20
ComputationalRadiationPhysics/clara2
https://api.github.com/repos/ComputationalRadiationPhysics/clara2
opened
Submit script for tianhe2 SLURM
tools usability
As needed by @QJohn2017 a setup script for tianhe2 is needed (see #89). Since tianhe2 uses SLURM for scheduling, either the `./prepare_job` script needs to be adjusted to an `./prepare_job_tianhe2.sh`script which creates a SLURM submit file or we go directly with an submit file that focuses on MPI jobs only (since tianhe2 is large and probably has to handle quite a lot of jobs, MPI is probably the better choice for this system). Additionally, submit scripts for other clusters (taurus, PizDaint, etc.) should be provided. @QJohn2017 Are you planing to submit with MPI only or are you also considering running SLURM Array jobs? If only an MPI job is planned, a simple submit script should be sufficient.
True
Submit script for tianhe2 SLURM - As needed by @QJohn2017 a setup script for tianhe2 is needed (see #89). Since tianhe2 uses SLURM for scheduling, either the `./prepare_job` script needs to be adjusted to an `./prepare_job_tianhe2.sh`script which creates a SLURM submit file or we go directly with an submit file that focuses on MPI jobs only (since tianhe2 is large and probably has to handle quite a lot of jobs, MPI is probably the better choice for this system). Additionally, submit scripts for other clusters (taurus, PizDaint, etc.) should be provided. @QJohn2017 Are you planing to submit with MPI only or are you also considering running SLURM Array jobs? If only an MPI job is planned, a simple submit script should be sufficient.
usab
submit script for slurm as needed by a setup script for is needed see since uses slurm for scheduling either the prepare job script needs to be adjusted to an prepare job sh script which creates a slurm submit file or we go directly with an submit file that focuses on mpi jobs only since is large and probably has to handle quite a lot of jobs mpi is probably the better choice for this system additionally submit scripts for other clusters taurus pizdaint etc should be provided are you planing to submit with mpi only or are you also considering running slurm array jobs if only an mpi job is planned a simple submit script should be sufficient
1
181,239
14,857,636,004
IssuesEvent
2021-01-18 15:41:21
Quansight/qhub
https://api.github.com/repos/Quansight/qhub
closed
[documentation] How do I update/edit a conda environment
documentation
I have an existing qhub deployment and I'd like to update the environment and redeploy. I gathered from the docs that I should update `/qhub-config.yaml` which is in the root of my repo. I'm confused why the same envs are in `/environments/environment.yml`. Do I need to update those too? When I updated `qhub-config.yaml` it triggered some gh actions (that I assume came from this qhub setup). It completed several things, but didn't actually update my instance. I'm not sure what I'm missing. Last thing - I have dev ops people on my team which can help me resolve these things, but I think we should improve the docs around maintenance of existing qhub deployments.
1.0
[documentation] How do I update/edit a conda environment - I have an existing qhub deployment and I'd like to update the environment and redeploy. I gathered from the docs that I should update `/qhub-config.yaml` which is in the root of my repo. I'm confused why the same envs are in `/environments/environment.yml`. Do I need to update those too? When I updated `qhub-config.yaml` it triggered some gh actions (that I assume came from this qhub setup). It completed several things, but didn't actually update my instance. I'm not sure what I'm missing. Last thing - I have dev ops people on my team which can help me resolve these things, but I think we should improve the docs around maintenance of existing qhub deployments.
non_usab
how do i update edit a conda environment i have an existing qhub deployment and i d like to update the environment and redeploy i gathered from the docs that i should update qhub config yaml which is in the root of my repo i m confused why the same envs are in environments environment yml do i need to update those too when i updated qhub config yaml it triggered some gh actions that i assume came from this qhub setup it completed several things but didn t actually update my instance i m not sure what i m missing last thing i have dev ops people on my team which can help me resolve these things but i think we should improve the docs around maintenance of existing qhub deployments
0
41,805
6,949,005,886
IssuesEvent
2017-12-06 03:34:17
mapbox/mapbox-gl-native
https://api.github.com/repos/mapbox/mapbox-gl-native
closed
Add snapshot classes to jazzy table of contents
documentation iOS macOS starter-task
MGLMapSnapshotOptions and MGLMapSnapshotter are currently listed under โ€œ[Other Classes](https://www.mapbox.com/ios-sdk/api/3.7.0-alpha.1/Other%20Classes.html)โ€ in the iOS and macOS API referencesโ€™ tables of contents. We should add these classes under โ€œMapsโ€ in both jazzy.yml files so theyโ€™re easier to find. https://github.com/mapbox/mapbox-gl-native/blob/f01588cac78b5e5411385faa451080a74320500b/platform/ios/jazzy.yml#L29-L35 https://github.com/mapbox/mapbox-gl-native/blob/f01588cac78b5e5411385faa451080a74320500b/platform/macos/jazzy.yml#L25-L30 /cc @fabian-guerra
1.0
Add snapshot classes to jazzy table of contents - MGLMapSnapshotOptions and MGLMapSnapshotter are currently listed under โ€œ[Other Classes](https://www.mapbox.com/ios-sdk/api/3.7.0-alpha.1/Other%20Classes.html)โ€ in the iOS and macOS API referencesโ€™ tables of contents. We should add these classes under โ€œMapsโ€ in both jazzy.yml files so theyโ€™re easier to find. https://github.com/mapbox/mapbox-gl-native/blob/f01588cac78b5e5411385faa451080a74320500b/platform/ios/jazzy.yml#L29-L35 https://github.com/mapbox/mapbox-gl-native/blob/f01588cac78b5e5411385faa451080a74320500b/platform/macos/jazzy.yml#L25-L30 /cc @fabian-guerra
non_usab
add snapshot classes to jazzy table of contents mglmapsnapshotoptions and mglmapsnapshotter are currently listed under in the ios and macos api references tables of contents we should add these classes under maps in both jazzy yml files so they re easier to find cc fabian guerra
0
9,016
2,615,120,972
IssuesEvent
2015-03-01 05:47:23
chrsmith/google-api-java-client
https://api.github.com/repos/chrsmith/google-api-java-client
closed
GenericUrl path encode/decode methods causing URL errors
auto-migrated Priority-Medium Type-Defect
``` version: 1.4.1-beta on App Engine 1.5.0.1 As an example of this issue, according to Google's documentation to retrieve the ACL Feed for a Google Docs List Entry, the link specified requires the form: https://docs.google.com/feeds/default/private/full/document%3Adocument_id/acl The %3A is critical as the following url (with %3A replaced by a colon) returns a 404 for us: https://docs.google.com/feeds/default/private/full/document:document_id/acl That automatic decoding from %3A to : happens within the internals of the GenericUrl class. As an attempt to work around this, I replaced the "%" with a "%25" with the hopes that GoogleUrl would properly escape the "%25" back to a "%" and all would be well: https://docs.google.com/feeds/default/private/full/document%253Adocument_id/acl However, this only resulted in the GenericUrl building an unchanged, incorrect url: https://docs.google.com/feeds/default/private/full/document%253Adocument_id/acl Since the methods that encode and decode within GenericUrl are final/private, there does not appear to be a way to properly construct a url to access the Google Docs ACL Feed using the class. ``` Original issue reported on code.google.com by `szieg...@cloudsherpas.com` on 14 Jun 2011 at 6:11
1.0
GenericUrl path encode/decode methods causing URL errors - ``` version: 1.4.1-beta on App Engine 1.5.0.1 As an example of this issue, according to Google's documentation to retrieve the ACL Feed for a Google Docs List Entry, the link specified requires the form: https://docs.google.com/feeds/default/private/full/document%3Adocument_id/acl The %3A is critical as the following url (with %3A replaced by a colon) returns a 404 for us: https://docs.google.com/feeds/default/private/full/document:document_id/acl That automatic decoding from %3A to : happens within the internals of the GenericUrl class. As an attempt to work around this, I replaced the "%" with a "%25" with the hopes that GoogleUrl would properly escape the "%25" back to a "%" and all would be well: https://docs.google.com/feeds/default/private/full/document%253Adocument_id/acl However, this only resulted in the GenericUrl building an unchanged, incorrect url: https://docs.google.com/feeds/default/private/full/document%253Adocument_id/acl Since the methods that encode and decode within GenericUrl are final/private, there does not appear to be a way to properly construct a url to access the Google Docs ACL Feed using the class. ``` Original issue reported on code.google.com by `szieg...@cloudsherpas.com` on 14 Jun 2011 at 6:11
non_usab
genericurl path encode decode methods causing url errors version beta on app engine as an example of this issue according to google s documentation to retrieve the acl feed for a google docs list entry the link specified requires the form the is critical as the following url with replaced by a colon returns a for us that automatic decoding from to happens within the internals of the genericurl class as an attempt to work around this i replaced the with a with the hopes that googleurl would properly escape the back to a and all would be well however this only resulted in the genericurl building an unchanged incorrect url since the methods that encode and decode within genericurl are final private there does not appear to be a way to properly construct a url to access the google docs acl feed using the class original issue reported on code google com by szieg cloudsherpas com on jun at
0
24,956
24,520,252,333
IssuesEvent
2022-10-11 08:58:39
ClickHouse/ClickHouse
https://api.github.com/repos/ClickHouse/ClickHouse
opened
Dashboard: title not displayed
usability
For js method `replaceAll` used here: https://github.com/ClickHouse/ClickHouse/blob/7c8e540dea0d1c357d34fb2394ca9a988c3ef8a1/programs/server/dashboard.html#L823 is not supported in chrome version under 85, see https://stackoverflow.com/questions/62825358/javascript-replaceall-is-not-a-function-type-error the charts in dashboard will not display titles.
True
Dashboard: title not displayed - For js method `replaceAll` used here: https://github.com/ClickHouse/ClickHouse/blob/7c8e540dea0d1c357d34fb2394ca9a988c3ef8a1/programs/server/dashboard.html#L823 is not supported in chrome version under 85, see https://stackoverflow.com/questions/62825358/javascript-replaceall-is-not-a-function-type-error the charts in dashboard will not display titles.
usab
dashboard title not displayed for js method replaceall used here is not supported in chrome version under see the charts in dashboard will not display titles
1
19,060
13,536,130,521
IssuesEvent
2020-09-16 08:36:25
topcoder-platform/qa-fun
https://api.github.com/repos/topcoder-platform/qa-fun
closed
[Chrome] Data is twice on the CASE STUDY SUNSHOT CATALYST page
UX/Usability
Bug title - Data is repeated on the SUNSHOT CATALYST page and also some data is missing, which disturbs the user experience Steps To Reproduce - 1. Go to https://www.topcoder.com/case-studies/sunshot-catalyst/ 2. Scroll down and observe the data on page 3. you will see the THE BUSINESS IMPACT block is twice on the page Actual Result - the data is repeated on the page Expected Result - data should only available once on the page because if user is reading the page he will read it twice, it will consume user time Device/OS/Browser Information: Laptop HP, Windows10 (64Bit) , ChromeVersion 81.0.4044.129 ![Bug 16(A)](https://user-images.githubusercontent.com/42939505/81041653-fc435d80-8ecb-11ea-8ffe-a07e1e1f775e.png) ![Bug 16(B)](https://user-images.githubusercontent.com/42939505/81041657-fe0d2100-8ecb-11ea-80dd-ca0c938a7bdb.png)
True
[Chrome] Data is twice on the CASE STUDY SUNSHOT CATALYST page - Bug title - Data is repeated on the SUNSHOT CATALYST page and also some data is missing, which disturbs the user experience Steps To Reproduce - 1. Go to https://www.topcoder.com/case-studies/sunshot-catalyst/ 2. Scroll down and observe the data on page 3. you will see the THE BUSINESS IMPACT block is twice on the page Actual Result - the data is repeated on the page Expected Result - data should only available once on the page because if user is reading the page he will read it twice, it will consume user time Device/OS/Browser Information: Laptop HP, Windows10 (64Bit) , ChromeVersion 81.0.4044.129 ![Bug 16(A)](https://user-images.githubusercontent.com/42939505/81041653-fc435d80-8ecb-11ea-8ffe-a07e1e1f775e.png) ![Bug 16(B)](https://user-images.githubusercontent.com/42939505/81041657-fe0d2100-8ecb-11ea-80dd-ca0c938a7bdb.png)
usab
data is twice on the case study sunshot catalyst page bug title data is repeated on the sunshot catalyst page and also some data is missing which disturbs the user experience steps to reproduce go to scroll down and observe the data on page you will see the the business impact block is twice on the page actual result the data is repeated on the page expected result data should only available once on the page because if user is reading the page he will read it twice it will consume user time device os browser information laptop hp chromeversion
1
16,961
11,517,153,057
IssuesEvent
2020-02-14 07:34:54
godotengine/godot
https://api.github.com/repos/godotengine/godot
closed
Have in editor options to simulate common forms of color vision deficiency (color blindness)
feature proposal topic:editor usability
A decent number of people suffer from being colorblind and as a result a lot of games are considerably harder or even impossible to play unless the devs specifically go out of their way to include colorblind modes. My suggestion is to have toggles in the editor that lets the developer see what their game looks like if they had deuteranomaly, protanomaly, or tritanomly. It'll help the devs who want to create color blind modes by letting them see what those users would see.
True
Have in editor options to simulate common forms of color vision deficiency (color blindness) - A decent number of people suffer from being colorblind and as a result a lot of games are considerably harder or even impossible to play unless the devs specifically go out of their way to include colorblind modes. My suggestion is to have toggles in the editor that lets the developer see what their game looks like if they had deuteranomaly, protanomaly, or tritanomly. It'll help the devs who want to create color blind modes by letting them see what those users would see.
usab
have in editor options to simulate common forms of color vision deficiency color blindness a decent number of people suffer from being colorblind and as a result a lot of games are considerably harder or even impossible to play unless the devs specifically go out of their way to include colorblind modes my suggestion is to have toggles in the editor that lets the developer see what their game looks like if they had deuteranomaly protanomaly or tritanomly it ll help the devs who want to create color blind modes by letting them see what those users would see
1
8,852
6,000,401,096
IssuesEvent
2017-06-05 04:43:40
streaka/plantguard-issues
https://api.github.com/repos/streaka/plantguard-issues
closed
Add new checklist field types: Single Select, Multi Select and Switch
Type: Usability
_Please fill out this template thoroughly, for the sake of our sanity._ ## **What did you expect to happen?** <!--- If you're describing a bug, tell us what should happen --> <!--- If you're suggesting a change/improvement, tell us how it should work --> ## **What actually happened?** <!--- If describing a bug, tell us what happens instead of the expected behavior --> <!--- If suggesting a change/improvement, explain the difference from current behavior --> ## **Steps to Reproduce Issue** (preferably video in Chrome with inspector open with console visible) ## **What platform was this on?** (Browser/iOS/Android)
True
Add new checklist field types: Single Select, Multi Select and Switch - _Please fill out this template thoroughly, for the sake of our sanity._ ## **What did you expect to happen?** <!--- If you're describing a bug, tell us what should happen --> <!--- If you're suggesting a change/improvement, tell us how it should work --> ## **What actually happened?** <!--- If describing a bug, tell us what happens instead of the expected behavior --> <!--- If suggesting a change/improvement, explain the difference from current behavior --> ## **Steps to Reproduce Issue** (preferably video in Chrome with inspector open with console visible) ## **What platform was this on?** (Browser/iOS/Android)
usab
add new checklist field types single select multi select and switch please fill out this template thoroughly for the sake of our sanity what did you expect to happen what actually happened steps to reproduce issue preferably video in chrome with inspector open with console visible what platform was this on browser ios android
1
6,065
4,147,567,758
IssuesEvent
2016-06-15 07:39:35
Virtual-Labs/computer-organization-iiith
https://api.github.com/repos/Virtual-Labs/computer-organization-iiith
closed
QA_Representation of Floating Point Numbers and their Arithmetic_Prerequisites_p1
Category: Usability Developed By: VLEAD Release Number: Production Severity: S2 Status : Differed Status: Open
Defect Description : In the "Representation of Floating Point Numbers and their Arithmetic " experiment, the minimum requirement to run the experiment is not displayed in the page instead a page or Scrolling should appear providing information on minimum requirement to run this experiment, information like Bandwidth,Device Resolution,Hardware Configuration and Software Required. Actual Result : In the "Representation of Floating Point Numbers and their Arithmetic " experiment, the minimum requirement to run the experiment is not displayed in the page. Environment : OS: Windows 7, Ubuntu-16.04,Centos-6 Browsers: Firefox-42.0,Chrome-47.0,chromium-45.0Bandwidth : 100Mbps Bandwidth : 100Mbps Hardware Configuration:8GBRAM , Processor:i5 Test Step Link: https://github.com/Virtual-Labs/computer-organization-iiith/blob/master/test-cases/integration_test-cases/Representation%20of%20Floating%20Point%20Numbers%20and%20their%20Arithmetic/Representation%20of%20Floating%20Point%20Numbers%20and%20their%20Arithmetic_18_Prerequisites_p1.org Attachment: ![cso12](https://cloud.githubusercontent.com/assets/14869397/14778999/a26ab386-0af3-11e6-8f54-8474c58c2fe8.png)
True
QA_Representation of Floating Point Numbers and their Arithmetic_Prerequisites_p1 - Defect Description : In the "Representation of Floating Point Numbers and their Arithmetic " experiment, the minimum requirement to run the experiment is not displayed in the page instead a page or Scrolling should appear providing information on minimum requirement to run this experiment, information like Bandwidth,Device Resolution,Hardware Configuration and Software Required. Actual Result : In the "Representation of Floating Point Numbers and their Arithmetic " experiment, the minimum requirement to run the experiment is not displayed in the page. Environment : OS: Windows 7, Ubuntu-16.04,Centos-6 Browsers: Firefox-42.0,Chrome-47.0,chromium-45.0Bandwidth : 100Mbps Bandwidth : 100Mbps Hardware Configuration:8GBRAM , Processor:i5 Test Step Link: https://github.com/Virtual-Labs/computer-organization-iiith/blob/master/test-cases/integration_test-cases/Representation%20of%20Floating%20Point%20Numbers%20and%20their%20Arithmetic/Representation%20of%20Floating%20Point%20Numbers%20and%20their%20Arithmetic_18_Prerequisites_p1.org Attachment: ![cso12](https://cloud.githubusercontent.com/assets/14869397/14778999/a26ab386-0af3-11e6-8f54-8474c58c2fe8.png)
usab
qa representation of floating point numbers and their arithmetic prerequisites defect description in the representation of floating point numbers and their arithmetic experiment the minimum requirement to run the experiment is not displayed in the page instead a page or scrolling should appear providing information on minimum requirement to run this experiment information like bandwidth device resolution hardware configuration and software required actual result in the representation of floating point numbers and their arithmetic experiment the minimum requirement to run the experiment is not displayed in the page environment os windows ubuntu centos browsers firefox chrome chromium bandwidth hardware configuration processor test step link attachment
1
8,816
7,453,007,300
IssuesEvent
2018-03-29 10:18:50
subutai-io/cdn
https://api.github.com/repos/subutai-io/cdn
closed
PGP signing of APT repositories
DONE critical security
The apt repository needs to be signed by PGP key pair whose public part preloaded in the target system, so that apt on target system can authenticate the packages, resisting from MITM attacks. Ordinary secured APT repository consists of several components: - **Release** files: required, PGP signed, contains checksums of *Packages* and other indices. - **Packages** indices: required, index of binary packages, contains checksums of packages - **Sources** indices: optional, index of source packages, contains checksums of sources - **Contents** indices: optional, index of package contents - **Translation** indices: optional, localization of package descriptions Every time the repository maintenance software publishes a new set of packages (called a _dinstall_ run in dak), the *Packages* indices are refreshed and *Release* files are regenerated and detached signed (inline clear sign is supported, but for compatibility it's better to provide a Release.gpg that's detached signature alongside the unsigned Release file). All of the above meta files can be served as XZ compressed to speed up the process (with .xz suffix), and apt will fallback to .gz or .bz2 and finally plain text if they are not found (HTTP 404). For the checksums, SHA256 is recommended, and MD5Sum and SHA1 should present for compatibility reasons.
True
PGP signing of APT repositories - The apt repository needs to be signed by PGP key pair whose public part preloaded in the target system, so that apt on target system can authenticate the packages, resisting from MITM attacks. Ordinary secured APT repository consists of several components: - **Release** files: required, PGP signed, contains checksums of *Packages* and other indices. - **Packages** indices: required, index of binary packages, contains checksums of packages - **Sources** indices: optional, index of source packages, contains checksums of sources - **Contents** indices: optional, index of package contents - **Translation** indices: optional, localization of package descriptions Every time the repository maintenance software publishes a new set of packages (called a _dinstall_ run in dak), the *Packages* indices are refreshed and *Release* files are regenerated and detached signed (inline clear sign is supported, but for compatibility it's better to provide a Release.gpg that's detached signature alongside the unsigned Release file). All of the above meta files can be served as XZ compressed to speed up the process (with .xz suffix), and apt will fallback to .gz or .bz2 and finally plain text if they are not found (HTTP 404). For the checksums, SHA256 is recommended, and MD5Sum and SHA1 should present for compatibility reasons.
non_usab
pgp signing of apt repositories the apt repository needs to be signed by pgp key pair whose public part preloaded in the target system so that apt on target system can authenticate the packages resisting from mitm attacks ordinary secured apt repository consists of several components release files required pgp signed contains checksums of packages and other indices packages indices required index of binary packages contains checksums of packages sources indices optional index of source packages contains checksums of sources contents indices optional index of package contents translation indices optional localization of package descriptions every time the repository maintenance software publishes a new set of packages called a dinstall run in dak the packages indices are refreshed and release files are regenerated and detached signed inline clear sign is supported but for compatibility it s better to provide a release gpg that s detached signature alongside the unsigned release file all of the above meta files can be served as xz compressed to speed up the process with xz suffix and apt will fallback to gz or and finally plain text if they are not found http for the checksums is recommended and and should present for compatibility reasons
0
16,575
11,092,143,706
IssuesEvent
2019-12-15 17:00:44
visulate/visulate-for-oracle
https://api.github.com/repos/visulate/visulate-for-oracle
closed
Router links from main body should scroll to top of page
good first issue usability
Following a link in an object report e.g. FK or dependency should scroll to the top of the page.
True
Router links from main body should scroll to top of page - Following a link in an object report e.g. FK or dependency should scroll to the top of the page.
usab
router links from main body should scroll to top of page following a link in an object report e g fk or dependency should scroll to the top of the page
1
11,993
7,606,449,285
IssuesEvent
2018-04-30 13:25:14
scala/bug
https://api.github.com/repos/scala/bug
closed
Casts inserted when pattern matching for equality on singletons are unsound
has PR patmat should not compile usability
I've added a test to pending/run/castsingleton.scala: ```scala object Test extends Application { case class L(); object N extends L(); def empty(xs : L) : Unit = xs match { case x@N => println(x); println(x); } empty(L()) } ``` The problem is that the compiler inserts a cast of xs to N.type, which is unsound: The pattern match will succeed for any L, because N == L().
True
Casts inserted when pattern matching for equality on singletons are unsound - I've added a test to pending/run/castsingleton.scala: ```scala object Test extends Application { case class L(); object N extends L(); def empty(xs : L) : Unit = xs match { case x@N => println(x); println(x); } empty(L()) } ``` The problem is that the compiler inserts a cast of xs to N.type, which is unsound: The pattern match will succeed for any L, because N == L().
usab
casts inserted when pattern matching for equality on singletons are unsound i ve added a test to pending run castsingleton scala scala object test extends application case class l object n extends l def empty xs l unit xs match case x n println x println x empty l the problem is that the compiler inserts a cast of xs to n type which is unsound the pattern match will succeed for any l because n l
1
10,291
7,134,691,929
IssuesEvent
2018-01-22 21:45:31
tensorflow/models
https://api.github.com/repos/tensorflow/models
closed
from six.moves import xrange
stat:contributions welcome type:bug/performance
The following files are incompatible with Python 3 at least in part because they use xrange() yet they lack the import: * __from six.moves import xrange__ ``` ./research/compression/entropy_coder/dataset/gen_synthetic_dataset.py ./research/compression/entropy_coder/dataset/synthetic_model.py ./research/compression/entropy_coder/lib/blocks_masked_conv2d.py ./research/compression/entropy_coder/lib/blocks_masked_conv2d_test.py ./research/compression/entropy_coder/lib/blocks_std_test.py ./research/delf/delf/python/feature_io.py ./research/differential_privacy/dp_sgd/dp_mnist/dp_mnist.py ./research/differential_privacy/dp_sgd/per_example_gradients/per_example_gradients.py ./research/differential_privacy/multiple_teachers/aggregation.py ./research/differential_privacy/multiple_teachers/deep_cnn.py ./research/differential_privacy/multiple_teachers/input.py ./research/differential_privacy/multiple_teachers/train_student.py ./research/domain_adaptation/domain_separation/dsn_eval.py ./research/gan/cifar/util.py ./research/gan/image_compression/networks_test.py ./research/gan/mnist/util.py ./research/gan/mnist_estimator/train.py ./research/im2txt/im2txt/data/build_mscoco_data.py ./research/learned_optimizer/metaopt.py ./research/learning_to_remember_rare_events/data_utils.py ./research/learning_to_remember_rare_events/memory.py ./research/learning_to_remember_rare_events/train.py ./research/lfads/synth_data/generate_itb_data.py ./research/lfads/synth_data/generate_labeled_rnn_data.py ./research/neural_gpu/data_utils.py ./research/next_frame_prediction/cross_conv/eval.py ./research/next_frame_prediction/cross_conv/example_gen.py ./research/next_frame_prediction/cross_conv/model.py ./research/next_frame_prediction/cross_conv/reader.py ./research/next_frame_prediction/cross_conv/sprites_gen.py ./research/object_detection/dataset_tools/oid_tfrecord_creation.py ./research/object_detection/utils/np_box_list_ops.py ./research/pcl_rl/baseline.py ./research/pcl_rl/controller.py 
./research/pcl_rl/env_spec.py ./research/pcl_rl/expert_paths.py ./research/pcl_rl/gym_wrapper.py ./research/pcl_rl/optimizers.py ./research/pcl_rl/replay_buffer.py ./research/pcl_rl/trainer.py ./research/pcl_rl/trust_region.py ./research/ptn/metrics.py ./research/ptn/model_ptn.py ./research/ptn/model_rotator.py ./research/ptn/model_voxel_generation.py ./research/ptn/pretrain_rotator.py ./research/ptn/utils.py ./research/real_nvp/real_nvp_utils.py ./research/slim/datasets/build_imagenet_data.py ./research/slim/datasets/preprocess_imagenet_validation_data.py ./research/slim/datasets/process_bounding_boxes.py ./research/slim/nets/cyclegan.py ./research/slim/nets/dcgan.py ./research/slim/nets/dcgan_test.py ./research/street/python/decoder.py ./research/street/python/shapes.py ./research/street/python/vgslspecs.py ./research/syntaxnet/dragnn/python/graph_builder_test.py ./research/syntaxnet/dragnn/python/network_units.py ./research/syntaxnet/dragnn/python/spec_builder.py ./research/syntaxnet/dragnn/python/trainer_lib.py ./research/tcn/labeled_eval.py ```
True
from six.moves import xrange - The following files are incompatible with Python 3 at least in part because they use xrange() yet they lack the import: * __from six.moves import xrange__ ``` ./research/compression/entropy_coder/dataset/gen_synthetic_dataset.py ./research/compression/entropy_coder/dataset/synthetic_model.py ./research/compression/entropy_coder/lib/blocks_masked_conv2d.py ./research/compression/entropy_coder/lib/blocks_masked_conv2d_test.py ./research/compression/entropy_coder/lib/blocks_std_test.py ./research/delf/delf/python/feature_io.py ./research/differential_privacy/dp_sgd/dp_mnist/dp_mnist.py ./research/differential_privacy/dp_sgd/per_example_gradients/per_example_gradients.py ./research/differential_privacy/multiple_teachers/aggregation.py ./research/differential_privacy/multiple_teachers/deep_cnn.py ./research/differential_privacy/multiple_teachers/input.py ./research/differential_privacy/multiple_teachers/train_student.py ./research/domain_adaptation/domain_separation/dsn_eval.py ./research/gan/cifar/util.py ./research/gan/image_compression/networks_test.py ./research/gan/mnist/util.py ./research/gan/mnist_estimator/train.py ./research/im2txt/im2txt/data/build_mscoco_data.py ./research/learned_optimizer/metaopt.py ./research/learning_to_remember_rare_events/data_utils.py ./research/learning_to_remember_rare_events/memory.py ./research/learning_to_remember_rare_events/train.py ./research/lfads/synth_data/generate_itb_data.py ./research/lfads/synth_data/generate_labeled_rnn_data.py ./research/neural_gpu/data_utils.py ./research/next_frame_prediction/cross_conv/eval.py ./research/next_frame_prediction/cross_conv/example_gen.py ./research/next_frame_prediction/cross_conv/model.py ./research/next_frame_prediction/cross_conv/reader.py ./research/next_frame_prediction/cross_conv/sprites_gen.py ./research/object_detection/dataset_tools/oid_tfrecord_creation.py ./research/object_detection/utils/np_box_list_ops.py ./research/pcl_rl/baseline.py 
./research/pcl_rl/controller.py ./research/pcl_rl/env_spec.py ./research/pcl_rl/expert_paths.py ./research/pcl_rl/gym_wrapper.py ./research/pcl_rl/optimizers.py ./research/pcl_rl/replay_buffer.py ./research/pcl_rl/trainer.py ./research/pcl_rl/trust_region.py ./research/ptn/metrics.py ./research/ptn/model_ptn.py ./research/ptn/model_rotator.py ./research/ptn/model_voxel_generation.py ./research/ptn/pretrain_rotator.py ./research/ptn/utils.py ./research/real_nvp/real_nvp_utils.py ./research/slim/datasets/build_imagenet_data.py ./research/slim/datasets/preprocess_imagenet_validation_data.py ./research/slim/datasets/process_bounding_boxes.py ./research/slim/nets/cyclegan.py ./research/slim/nets/dcgan.py ./research/slim/nets/dcgan_test.py ./research/street/python/decoder.py ./research/street/python/shapes.py ./research/street/python/vgslspecs.py ./research/syntaxnet/dragnn/python/graph_builder_test.py ./research/syntaxnet/dragnn/python/network_units.py ./research/syntaxnet/dragnn/python/spec_builder.py ./research/syntaxnet/dragnn/python/trainer_lib.py ./research/tcn/labeled_eval.py ```
non_usab
from six moves import xrange the following files are incompatible with python at least in part because they use xrange yet they lack the import from six moves import xrange research compression entropy coder dataset gen synthetic dataset py research compression entropy coder dataset synthetic model py research compression entropy coder lib blocks masked py research compression entropy coder lib blocks masked test py research compression entropy coder lib blocks std test py research delf delf python feature io py research differential privacy dp sgd dp mnist dp mnist py research differential privacy dp sgd per example gradients per example gradients py research differential privacy multiple teachers aggregation py research differential privacy multiple teachers deep cnn py research differential privacy multiple teachers input py research differential privacy multiple teachers train student py research domain adaptation domain separation dsn eval py research gan cifar util py research gan image compression networks test py research gan mnist util py research gan mnist estimator train py research data build mscoco data py research learned optimizer metaopt py research learning to remember rare events data utils py research learning to remember rare events memory py research learning to remember rare events train py research lfads synth data generate itb data py research lfads synth data generate labeled rnn data py research neural gpu data utils py research next frame prediction cross conv eval py research next frame prediction cross conv example gen py research next frame prediction cross conv model py research next frame prediction cross conv reader py research next frame prediction cross conv sprites gen py research object detection dataset tools oid tfrecord creation py research object detection utils np box list ops py research pcl rl baseline py research pcl rl controller py research pcl rl env spec py research pcl rl expert paths py research pcl rl gym wrapper 
py research pcl rl optimizers py research pcl rl replay buffer py research pcl rl trainer py research pcl rl trust region py research ptn metrics py research ptn model ptn py research ptn model rotator py research ptn model voxel generation py research ptn pretrain rotator py research ptn utils py research real nvp real nvp utils py research slim datasets build imagenet data py research slim datasets preprocess imagenet validation data py research slim datasets process bounding boxes py research slim nets cyclegan py research slim nets dcgan py research slim nets dcgan test py research street python decoder py research street python shapes py research street python vgslspecs py research syntaxnet dragnn python graph builder test py research syntaxnet dragnn python network units py research syntaxnet dragnn python spec builder py research syntaxnet dragnn python trainer lib py research tcn labeled eval py
0
11,586
7,309,411,317
IssuesEvent
2018-02-28 11:40:52
kubernetes/kops
https://api.github.com/repos/kubernetes/kops
closed
Validate User Permissions
area/usability lifecycle/rotten
In order to provide a better user experience, kops should do a pre-flight check that the user has permissions to create a cluster. 1. verify that user has permissions matching https://gist.github.com/chrislovecnm/7c5b57e675e41fefc172f5dc8e9287c9 for AWS 2. verify that the user has correct permissions for GCE 3. verify that user has correct permissions for vSphere 4. help user with the capability to create a role that has correct permissions
True
Validate User Permissions - In order to provide a better user experience, kops should do a pre-flight check that the user has permissions to create a cluster. 1. verify that user has permissions matching https://gist.github.com/chrislovecnm/7c5b57e675e41fefc172f5dc8e9287c9 for AWS 2. verify that the user has correct permissions for GCE 3. verify that user has correct permissions for vSphere 4. help user with the capability to create a role that has correct permissions
usab
validate user permissions in order to provide a better user experience kops should do a pre flight check that the user has permissions to create a cluster verify that user has permissions matching for aws verify that the user has correct permissions for gce verify that user has correct permissions for vsphere help user with the capability to create a role that has correct permissions
1
23,884
7,430,683,842
IssuesEvent
2018-03-25 05:29:34
Semantic-Org/Semantic-UI
https://api.github.com/repos/Semantic-Org/Semantic-UI
closed
Run gulp build multiple times in a Custom Pipeline
Build Tools Discussion Evaluating Bug / Change stale
Hi there, thanks for your awesome work on SemanticUI! I was following the instructions to integrate the gulp build process into my [own gulp pipeline](https://semantic-ui.com/introduction/advanced-usage.html) and had the need to execute the build task multiple times for multiple themes. This failed because the build function pushes the tasks onto an [array](https://github.com/Semantic-Org/Semantic-UI/blob/next/tasks/build.js#L15) every time the function gets executed and thus runSequence stops with `Task build-javascript is listed more than once. This is probably a typo.` To fix this I moved the `tasks` array into the actual build function. Is there another way of doing this? If not should I open a pull request with [the change](https://github.com/stoically/Semantic-UI/commit/a73d0e47bfb8ec8387dcf8031e929de58f66a1c2)? Best regards
1.0
Run gulp build multiple times in a Custom Pipeline - Hi there, thanks for your awesome work on SemanticUI! I was following the instructions to integrate the gulp build process into my [own gulp pipeline](https://semantic-ui.com/introduction/advanced-usage.html) and had the need to execute the build task multiple times for multiple themes. This failed because the build function pushes the tasks onto an [array](https://github.com/Semantic-Org/Semantic-UI/blob/next/tasks/build.js#L15) every time the function gets executed and thus runSequence stops with `Task build-javascript is listed more than once. This is probably a typo.` To fix this I moved the `tasks` array into the actual build function. Is there another way of doing this? If not should I open a pull request with [the change](https://github.com/stoically/Semantic-UI/commit/a73d0e47bfb8ec8387dcf8031e929de58f66a1c2)? Best regards
non_usab
run gulp build multiple times in a custom pipeline hi there thanks for your awesome work on semanticui i was following the instructions to integrate the gulp build process into my and had the need to execute the build task multiple times for multiple themes this failed because the build function pushes the tasks onto an every time the function gets executed and thus runsequence stops with task build javascript is listed more than once this is probably a typo to fix this i moved the tasks array into the actual build function is there another way of doing this if not should i open a pull request with best regards
0
7,001
3,073,058,341
IssuesEvent
2015-08-19 20:01:01
NETponents/octoduino
https://api.github.com/repos/NETponents/octoduino
opened
GSM module
c++ documentation enhancement hard help wanted long_term octoduino_core parser
If this has enough support, will add it through PB. Output channel seems a little too much for SMS.
1.0
GSM module - If this has enough support, will add it through PB. Output channel seems a little too much for SMS.
non_usab
gsm module if this has enough support will add it through pb output channel seems a little too much for sms
0
44,096
2,899,135,377
IssuesEvent
2015-06-17 09:25:18
greenlion/PHP-SQL-Parser
https://api.github.com/repos/greenlion/PHP-SQL-Parser
closed
Working with escaped values
bug imported Priority-Medium
_From [lucian.t...@gmail.com](https://code.google.com/u/100584422996267738909/) on March 30, 2012 15:55:06_ What steps will reproduce the problem? 1. The following two queries won't be exploded consistently: $q1 = "select a from t where x = \"a'b\\cd\" and y = 'ef\"gh'"; $q2 = "select a from t where x = \"abcd\" and y = 'efgh'"; $parser = new CPHPSQLParser(); print_r($parser->parse($q1)); print_r($parser->parse($q2)); What is the expected output? What do you see instead? $q1 and $q2 should have in the [WHERE] part two colref/operator/const parts, with only different constant/colref values. Instead, the [WHERE] for q1 has only one colref/operator/const part, like this: colref = "x", operator = "=" , const "a'b\cd" and y = 'ef"gh' What version of the product are you using? On what operating system? tags/2012-03-23 / win7 Please provide any additional information below. The parser doesn't correctly handle values with escaped chars inside (no matter if the value is single or double quoted inside the query). _Original issue: http://code.google.com/p/php-sql-parser/issues/detail?id=40_
1.0
Working with escaped values - _From [lucian.t...@gmail.com](https://code.google.com/u/100584422996267738909/) on March 30, 2012 15:55:06_ What steps will reproduce the problem? 1. The following two queries won't be exploded consistently: $q1 = "select a from t where x = \"a'b\\cd\" and y = 'ef\"gh'"; $q2 = "select a from t where x = \"abcd\" and y = 'efgh'"; $parser = new CPHPSQLParser(); print_r($parser->parse($q1)); print_r($parser->parse($q2)); What is the expected output? What do you see instead? $q1 and $q2 should have in the [WHERE] part two colref/operator/const parts, with only different constant/colref values. Instead, the [WHERE] for q1 has only one colref/operator/const part, like this: colref = "x", operator = "=" , const "a'b\cd" and y = 'ef"gh' What version of the product are you using? On what operating system? tags/2012-03-23 / win7 Please provide any additional information below. The parser doesn't correctly handle values with escaped chars inside (no matter if the value is single or double quoted inside the query). _Original issue: http://code.google.com/p/php-sql-parser/issues/detail?id=40_
non_usab
working with escaped values from on march what steps will reproduce the problem the following two queries won t be exploded consistently select a from t where x a b cd and y ef gh select a from t where x abcd and y efgh parser new cphpsqlparser print r parser parse print r parser parse what is the expected output what do you see instead and should have in the part two colref operator const parts with only different constant colref values instead the for has only one colref operator const part like this colref x operator const a b cd and y ef gh what version of the product are you using on what operating system tags please provide any additional information below the parser doesn t correctly handle values with escaped chars inside no matter if the value is single or double quoted inside the query original issue
0
267,178
20,192,949,341
IssuesEvent
2022-02-11 07:55:15
cengage/react-magma
https://api.github.com/repos/cengage/react-magma
opened
Documentation> Dropzone> "View Design Guidelines" button is not appearing to navigate back from API page to Guidelines page.
documentation
**Describe the bug** Button "View Design Guidelines" is not apearing when clicks on button "View Component API". **To Reproduce** Steps to reproduce the behavior: 1. Go to https://react-magma.cengage.com/version/2.5.7/design/dropzone/ 2. Click on "View Component API" button 3. Verify that "View Design Guidelines" button is not appearing to navigate back from API page to Guidelines page **Expected behavior** Button "View Design Guidelines" should appear when clicks on button "View Component API" **Screenshots** https://somup.com/c3nQXVZN4k **Desktop (please complete the following information):** - OS: [Win 10] - Browser [chrome] - Version [Chrome version: Version 98.0.4758.81 (Official Build) (64-bit)] **Additional context** We have observed that "Dropzone" is appearing under Design guidelines and API both. When we clicks on "View Component API" button, the API page appears. However, user is unable to navigate back to Guidelines page as "View Design Guidelines" button is missing.
1.0
Documentation> Dropzone> "View Design Guidelines" button is not appearing to navigate back from API page to Guidelines page. - **Describe the bug** Button "View Design Guidelines" is not apearing when clicks on button "View Component API". **To Reproduce** Steps to reproduce the behavior: 1. Go to https://react-magma.cengage.com/version/2.5.7/design/dropzone/ 2. Click on "View Component API" button 3. Verify that "View Design Guidelines" button is not appearing to navigate back from API page to Guidelines page **Expected behavior** Button "View Design Guidelines" should appear when clicks on button "View Component API" **Screenshots** https://somup.com/c3nQXVZN4k **Desktop (please complete the following information):** - OS: [Win 10] - Browser [chrome] - Version [Chrome version: Version 98.0.4758.81 (Official Build) (64-bit)] **Additional context** We have observed that "Dropzone" is appearing under Design guidelines and API both. When we clicks on "View Component API" button, the API page appears. However, user is unable to navigate back to Guidelines page as "View Design Guidelines" button is missing.
non_usab
documentation dropzone view design guidelines button is not appearing to navigate back from api page to guidelines page describe the bug button view design guidelines is not apearing when clicks on button view component api to reproduce steps to reproduce the behavior go to click on view component api button verify that view design guidelines button is not appearing to navigate back from api page to guidelines page expected behavior button view design guidelines should appear when clicks on button view component api screenshots desktop please complete the following information os browser version additional context we have observed that dropzone is appearing under design guidelines and api both when we clicks on view component api button the api page appears however user is unable to navigate back to guidelines page as view design guidelines button is missing
0
19,096
13,536,134,157
IssuesEvent
2020-09-16 08:36:45
topcoder-platform/qa-fun
https://api.github.com/repos/topcoder-platform/qa-fun
closed
The menu title at the top navbar not highlighted when hovering the mouse
UX/Usability
**Steps to Reproduce:** 1. Go to URL https://www.topcoder.com/ 2. Hovering the mouse on the menu title at the top navbar 3. Observe **Expected Result:** The menu title should be highlighted when hovering the mouse over the texts just like happen on the profile tab **Actual Result:** The menu title not highlighted when hovering the mouse over the texts **Screenshots or screencast:** ![Bug3](https://user-images.githubusercontent.com/39216496/81036126-7b3a9500-8ed0-11ea-8ba1-660dd15fc3b9.png) [Bug3.zip](https://github.com/topcoder-platform/qa-fun/files/4578974/Bug3.zip) **Device/OS/Browser Information:** Laptop/macOS-Catalina/Chrome Version 81.0.4044.129
True
The menu title at the top navbar not highlighted when hovering the mouse - **Steps to Reproduce:** 1. Go to URL https://www.topcoder.com/ 2. Hovering the mouse on the menu title at the top navbar 3. Observe **Expected Result:** The menu title should be highlighted when hovering the mouse over the texts just like happen on the profile tab **Actual Result:** The menu title not highlighted when hovering the mouse over the texts **Screenshots or screencast:** ![Bug3](https://user-images.githubusercontent.com/39216496/81036126-7b3a9500-8ed0-11ea-8ba1-660dd15fc3b9.png) [Bug3.zip](https://github.com/topcoder-platform/qa-fun/files/4578974/Bug3.zip) **Device/OS/Browser Information:** Laptop/macOS-Catalina/Chrome Version 81.0.4044.129
usab
the menu title at the top navbar not highlighted when hovering the mouse steps to reproduce go to url hovering the mouse on the menu title at the top navbar observe expected result the menu title should be highlighted when hovering the mouse over the texts just like happen on the profile tab actual result the menu title not highlighted when hovering the mouse over the texts screenshots or screencast device os browser information laptop macos catalina chrome version
1
5,232
3,906,053,034
IssuesEvent
2016-04-19 07:06:20
kolliSuman/issues
https://api.github.com/repos/kolliSuman/issues
closed
QA_Millikans Experiment_Prerequisites_p1
Category: Usability Developed By: VLEAD Release Number: Production Severity: S2 Status: Open
Defect Description : In the "Millikans Experiment " experiment, the minimum requirement to run the experiment is not displayed in the page instead a page or Scrolling should appear providing information on minimum requirement to run this experiment, information like Bandwidth,Device Resolution,Hardware Configuration and Software Required.. Actual Result : In the "Millikans Experiment " experiment, the minimum requirement to run the experiment is not displayed in the page. Environment : OS: Windows 7, Linux Browsers: Firefox,Chrome Bandwidth : 100Mbps Hardware Configuration:8GBRAM , Processor:i5 Test Step Link: https://github.com/Virtual-Labs/physical-sciences-iiith/blob/master/test-cases/integration_test-cases/Millikans%20Experiment/Millikans%20Experiment_16_Prerequisites_p1.org
True
QA_Millikans Experiment_Prerequisites_p1 - Defect Description : In the "Millikans Experiment " experiment, the minimum requirement to run the experiment is not displayed in the page instead a page or Scrolling should appear providing information on minimum requirement to run this experiment, information like Bandwidth,Device Resolution,Hardware Configuration and Software Required.. Actual Result : In the "Millikans Experiment " experiment, the minimum requirement to run the experiment is not displayed in the page. Environment : OS: Windows 7, Linux Browsers: Firefox,Chrome Bandwidth : 100Mbps Hardware Configuration:8GBRAM , Processor:i5 Test Step Link: https://github.com/Virtual-Labs/physical-sciences-iiith/blob/master/test-cases/integration_test-cases/Millikans%20Experiment/Millikans%20Experiment_16_Prerequisites_p1.org
usab
qa millikans experiment prerequisites defect description in the millikans experiment experiment the minimum requirement to run the experiment is not displayed in the page instead a page or scrolling should appear providing information on minimum requirement to run this experiment information like bandwidth device resolution hardware configuration and software required actual result in the millikans experiment experiment the minimum requirement to run the experiment is not displayed in the page environment os windows linux browsers firefox chrome bandwidth hardware configuration processor test step link
1
50,856
3,007,465,308
IssuesEvent
2015-07-27 16:11:37
Ombridride/minetest-minetestforfun-server
https://api.github.com/repos/Ombridride/minetest-minetestforfun-server
closed
Mobs bugs/tweaks
Modding@BugFix Priority@Medium
#### Bugs/tweaks about monsters/animals - [X] The cow animlal doesn't become agressive anymore when you attack them => the mobs .lua code seems to be ok https://github.com/Ombridride/minetest-minetestforfun-server/blob/master/mods/mobs/cow.lua#L7 => **Cows become aggressive, the current lag just delays the action a bit** (Mg) - [X] The warthog animal doesn't become agressive anymore when you attack them => the mobs .lua code seems to be ok https://github.com/Ombridride/minetest-minetestforfun-server/blob/master/mods/mobs/warthog.lua#L7 => **Same problem as the cows** (Mg) - [X] The goat animal doesn't become agressive anymore when you attack them => the mobs .lua code seems to be ok https://github.com/Ombridride/minetest-minetestforfun-server/blob/master/mods/mobs/goat.lua#L7 => **Same problem as the warthog** (Mg) - [X] The Big/medium/small Green Slime monster have a knockback, we need to remove it https://github.com/Ombridride/minetest-minetestforfun-server/blob/master/mods/mobs/greenslimes.lua#L140 => The knockback is already set to 0, but the jump of the mob is again "stopped" when we attack him... => **See below** (Mg) - [X] The Big/medium/small Lava Slime monster have a knockback, we need to remove it AND add a lava_bucket drop for the Small Lava Slime https://github.com/Ombridride/minetest-minetestforfun-server/blob/master/mods/mobs/lavaslimes.lua#L147 => The knockback is already set to 0, but the jump of the mob is again "stopped" when we attack him... => **Same for all the slimes, and all monsters with little jumps. 
We can't do anything, it's from the engine.** (Mg) - [x] The Creeper monster have a too bigknockback, we need to reduce it => the default knockback was modified accidently to 3, now re-set to 1, so this tweak isn't needed https://github.com/Ombridride/minetest-minetestforfun-server/commit/07f78821fe85888a7d35314b041c241298c2cdbd - [x] Add super_apple drops to the Tree Monster https://github.com/Ombridride/minetest-minetestforfun-server/commit/07f78821fe85888a7d35314b041c241298c2cdbd - [X] 27/07/15 => the hitbox of the __cow__ animal isn't good anymore, need to make it bigger and rectangular => https://github.com/Ombridride/minetest-minetestforfun-server/commit/f0ace0faad4a87f6efe6bf1e60bece933ebd9291
1.0
Mobs bugs/tweaks - #### Bugs/tweaks about monsters/animals - [X] The cow animlal doesn't become agressive anymore when you attack them => the mobs .lua code seems to be ok https://github.com/Ombridride/minetest-minetestforfun-server/blob/master/mods/mobs/cow.lua#L7 => **Cows become aggressive, the current lag just delays the action a bit** (Mg) - [X] The warthog animal doesn't become agressive anymore when you attack them => the mobs .lua code seems to be ok https://github.com/Ombridride/minetest-minetestforfun-server/blob/master/mods/mobs/warthog.lua#L7 => **Same problem as the cows** (Mg) - [X] The goat animal doesn't become agressive anymore when you attack them => the mobs .lua code seems to be ok https://github.com/Ombridride/minetest-minetestforfun-server/blob/master/mods/mobs/goat.lua#L7 => **Same problem as the warthog** (Mg) - [X] The Big/medium/small Green Slime monster have a knockback, we need to remove it https://github.com/Ombridride/minetest-minetestforfun-server/blob/master/mods/mobs/greenslimes.lua#L140 => The knockback is already set to 0, but the jump of the mob is again "stopped" when we attack him... => **See below** (Mg) - [X] The Big/medium/small Lava Slime monster have a knockback, we need to remove it AND add a lava_bucket drop for the Small Lava Slime https://github.com/Ombridride/minetest-minetestforfun-server/blob/master/mods/mobs/lavaslimes.lua#L147 => The knockback is already set to 0, but the jump of the mob is again "stopped" when we attack him... => **Same for all the slimes, and all monsters with little jumps. 
We can't do anything, it's from the engine.** (Mg) - [x] The Creeper monster have a too bigknockback, we need to reduce it => the default knockback was modified accidently to 3, now re-set to 1, so this tweak isn't needed https://github.com/Ombridride/minetest-minetestforfun-server/commit/07f78821fe85888a7d35314b041c241298c2cdbd - [x] Add super_apple drops to the Tree Monster https://github.com/Ombridride/minetest-minetestforfun-server/commit/07f78821fe85888a7d35314b041c241298c2cdbd - [X] 27/07/15 => the hitbox of the __cow__ animal isn't good anymore, need to make it bigger and rectangular => https://github.com/Ombridride/minetest-minetestforfun-server/commit/f0ace0faad4a87f6efe6bf1e60bece933ebd9291
non_usab
mobs bugs tweaks bugs tweaks about monsters animals the cow animlal doesn t become agressive anymore when you attack them the mobs lua code seems to be ok cows become aggressive the current lag just delays the action a bit mg the warthog animal doesn t become agressive anymore when you attack them the mobs lua code seems to be ok same problem as the cows mg the goat animal doesn t become agressive anymore when you attack them the mobs lua code seems to be ok same problem as the warthog mg the big medium small green slime monster have a knockback we need to remove it the knockback is already set to but the jump of the mob is again stopped when we attack him see below mg the big medium small lava slime monster have a knockback we need to remove it and add a lava bucket drop for the small lava slime the knockback is already set to but the jump of the mob is again stopped when we attack him same for all the slimes and all monsters with little jumps we can t do anything it s from the engine mg the creeper monster have a too bigknockback we need to reduce it the default knockback was modified accidently to now re set to so this tweak isn t needed add super apple drops to the tree monster the hitbox of the cow animal isn t good anymore need to make it bigger and rectangular
0
200,345
7,006,144,542
IssuesEvent
2017-12-19 07:01:11
wso2/product-is
https://api.github.com/repos/wso2/product-is
closed
[Chaos]During the Graceful restart ldap error is printed eventhough the user store is JDBC
Affected/5.4.0 Priority/High
[Chaos]During the Graceful restart ldap error is printed eventhough the user store is JDBC. In 2 node cluster while the script is run to login with 100 threads, 1 loop and 100 s rampup, when I click on the graceful restart button in management console the node that was actively serving was shutting down and restarting. But the below exception is printed in the backend. User store is JDBC (MySQL). The below error is misleading and cannot understand how it is printed. `TID: [-1234] [] [2017-12-19 06:15:32,312] INFO {org.wso2.carbon.ldap.server.configuration.LDAPConfigurationBuilder} - KDC server is disabled. TID: [-1234] [] [2017-12-19 06:15:32,511] ERROR {org.apache.directory.shared.ldap.schema.loader.ldif.LdifSchemaLoader} - ERR_10004 Expecting to find a schema.ldif file in provided baseDirectory path '/home/ubuntu/packs/ga/wso2is-5.4.0_cluster/repository/data/org.wso2.carbon.directory/schema/ou=schema.ldif' but no such file found. TID: [-1234] [] [2017-12-19 06:15:32,511] ERROR {org.apache.directory.shared.ldap.schema.loader.ldif.LdifSchemaLoader} - ERR_10004 Expecting to find a schema.ldif file in provided baseDirectory path '/home/ubuntu/packs/ga/wso2is-5.4.0_cluster/repository/data/org.wso2.carbon.directory/schema/ou=schema.ldif' but no such file found. TID: [-1234] [] [2017-12-19 06:15:32,512] ERROR {org.wso2.carbon.apacheds.impl.ApacheLDAPServer} - LDAP server initialization failed. 
org.wso2.carbon.ldap.server.exception.DirectoryServerException: Can not start the Default apacheds service at org.wso2.carbon.apacheds.impl.ApacheLDAPServer.initializeDefaultDirectoryService(ApacheLDAPServer.java:165) at org.wso2.carbon.apacheds.impl.ApacheLDAPServer.init(ApacheLDAPServer.java:94) at org.wso2.carbon.ldap.server.DirectoryActivator.startLdapServer(DirectoryActivator.java:192) at org.wso2.carbon.ldap.server.DirectoryActivator.start(DirectoryActivator.java:78) at org.eclipse.osgi.framework.internal.core.BundleContextImpl$1.run(BundleContextImpl.java:711) at java.security.AccessController.doPrivileged(Native Method) at org.eclipse.osgi.framework.internal.core.BundleContextImpl.startActivator(BundleContextImpl.java:702) at org.eclipse.osgi.framework.internal.core.BundleContextImpl.start(BundleContextImpl.java:683) at org.eclipse.osgi.framework.internal.core.BundleHost.startWorker(BundleHost.java:381) at org.eclipse.osgi.framework.internal.core.AbstractBundle.resume(AbstractBundle.java:390) at org.eclipse.osgi.framework.internal.core.Framework.resumeBundle(Framework.java:1176) at org.eclipse.osgi.framework.internal.core.StartLevelManager.resumeBundles(StartLevelManager.java:559) at org.eclipse.osgi.framework.internal.core.StartLevelManager.resumeBundles(StartLevelManager.java:544) at org.eclipse.osgi.framework.internal.core.StartLevelManager.incFWSL(StartLevelManager.java:457) at org.eclipse.osgi.framework.internal.core.StartLevelManager.doSetStartLevel(StartLevelManager.java:243) at org.eclipse.osgi.framework.internal.core.StartLevelManager.dispatchEvent(StartLevelManager.java:438) at org.eclipse.osgi.framework.internal.core.StartLevelManager.dispatchEvent(StartLevelManager.java:1) at org.eclipse.osgi.framework.eventmgr.EventManager.dispatchEvent(EventManager.java:230) at org.eclipse.osgi.framework.eventmgr.EventManager$EventThread.run(EventManager.java:340) Caused by: java.io.FileNotFoundException: ERR_10004 Expecting to find a schema.ldif file in 
provided baseDirectory path '/home/ubuntu/packs/ga/wso2is-5.4.0_cluster/repository/data/org.wso2.carbon.directory/schema/ou=schema.ldif' but no such file found. at org.apache.directory.shared.ldap.schema.loader.ldif.LdifSchemaLoader.<init>(LdifSchemaLoader.java:113) at org.wso2.carbon.apacheds.impl.CarbonDirectoryServiceFactory.initSchema(CarbonDirectoryServiceFactory.java:163) at org.wso2.carbon.apacheds.impl.CarbonDirectoryServiceFactory.build(CarbonDirectoryServiceFactory.java:217) at org.wso2.carbon.apacheds.impl.CarbonDirectoryServiceFactory.init(CarbonDirectoryServiceFactory.java:119) at org.wso2.carbon.apacheds.impl.ApacheLDAPServer.initializeDefaultDirectoryService(ApacheLDAPServer.java:162) ... 18 more TID: [-1234] [] [2017-12-19 06:15:32,516] ERROR {org.wso2.carbon.ldap.server.DirectoryActivator} - Could not start the embedded-ldap. org.wso2.carbon.ldap.server.exception.DirectoryServerException: Error initializing ApacheLDAPServer. at org.wso2.carbon.apacheds.impl.ApacheLDAPServer.init(ApacheLDAPServer.java:102) at org.wso2.carbon.ldap.server.DirectoryActivator.startLdapServer(DirectoryActivator.java:192) at org.wso2.carbon.ldap.server.DirectoryActivator.start(DirectoryActivator.java:78) at org.eclipse.osgi.framework.internal.core.BundleContextImpl$1.run(BundleContextImpl.java:711) at java.security.AccessController.doPrivileged(Native Method) at org.eclipse.osgi.framework.internal.core.BundleContextImpl.startActivator(BundleContextImpl.java:702) at org.eclipse.osgi.framework.internal.core.BundleContextImpl.start(BundleContextImpl.java:683) at org.eclipse.osgi.framework.internal.core.BundleHost.startWorker(BundleHost.java:381) at org.eclipse.osgi.framework.internal.core.AbstractBundle.resume(AbstractBundle.java:390) at org.eclipse.osgi.framework.internal.core.Framework.resumeBundle(Framework.java:1176) at org.eclipse.osgi.framework.internal.core.StartLevelManager.resumeBundles(StartLevelManager.java:559) at 
org.eclipse.osgi.framework.internal.core.StartLevelManager.resumeBundles(StartLevelManager.java:544) at org.eclipse.osgi.framework.internal.core.StartLevelManager.incFWSL(StartLevelManager.java:457) at org.eclipse.osgi.framework.internal.core.StartLevelManager.doSetStartLevel(StartLevelManager.java:243) at org.eclipse.osgi.framework.internal.core.StartLevelManager.dispatchEvent(StartLevelManager.java:438) at org.eclipse.osgi.framework.internal.core.StartLevelManager.dispatchEvent(StartLevelManager.java:1) at org.eclipse.osgi.framework.eventmgr.EventManager.dispatchEvent(EventManager.java:230) at org.eclipse.osgi.framework.eventmgr.EventManager$EventThread.run(EventManager.java:340) Caused by: org.wso2.carbon.ldap.server.exception.DirectoryServerException: Can not start the Default apacheds service at org.wso2.carbon.apacheds.impl.ApacheLDAPServer.initializeDefaultDirectoryService(ApacheLDAPServer.java:165) at org.wso2.carbon.apacheds.impl.ApacheLDAPServer.init(ApacheLDAPServer.java:94) ... 17 more Caused by: java.io.FileNotFoundException: ERR_10004 Expecting to find a schema.ldif file in provided baseDirectory path '/home/ubuntu/packs/ga/wso2is-5.4.0_cluster/repository/data/org.wso2.carbon.directory/schema/ou=schema.ldif' but no such file found. at org.apache.directory.shared.ldap.schema.loader.ldif.LdifSchemaLoader.<init>(LdifSchemaLoader.java:113) at org.wso2.carbon.apacheds.impl.CarbonDirectoryServiceFactory.initSchema(CarbonDirectoryServiceFactory.java:163) at org.wso2.carbon.apacheds.impl.CarbonDirectoryServiceFactory.build(CarbonDirectoryServiceFactory.java:217) at org.wso2.carbon.apacheds.impl.CarbonDirectoryServiceFactory.init(CarbonDirectoryServiceFactory.java:119) at org.wso2.carbon.apacheds.impl.ApacheLDAPServer.initializeDefaultDirectoryService(ApacheLDAPServer.java:162) ... 
18 more TID: [-1] [] [2017-12-19 06:15:32,552] INFO {org.wso2.carbon.mex.internal.Office365SupportMexComponent} - Office365Support MexServiceComponent bundle activated successfully.. TID: [-1] [] [2017-12-19 06:15:32,579] INFO {org.wso2.carbon.mex2.internal.DynamicCRMCustomMexComponent} - DynamicCRMSupport MexServiceComponent bundle activated successfully. `
1.0
[Chaos]During the Graceful restart ldap error is printed eventhough the user store is JDBC - [Chaos]During the Graceful restart ldap error is printed eventhough the user store is JDBC. In 2 node cluster while the script is run to login with 100 threads, 1 loop and 100 s rampup, when I click on the graceful restart button in management console the node that was actively serving was shutting down and restarting. But the below exception is printed in the backend. User store is JDBC (MySQL). The below error is misleading and cannot understand how it is printed. `TID: [-1234] [] [2017-12-19 06:15:32,312] INFO {org.wso2.carbon.ldap.server.configuration.LDAPConfigurationBuilder} - KDC server is disabled. TID: [-1234] [] [2017-12-19 06:15:32,511] ERROR {org.apache.directory.shared.ldap.schema.loader.ldif.LdifSchemaLoader} - ERR_10004 Expecting to find a schema.ldif file in provided baseDirectory path '/home/ubuntu/packs/ga/wso2is-5.4.0_cluster/repository/data/org.wso2.carbon.directory/schema/ou=schema.ldif' but no such file found. TID: [-1234] [] [2017-12-19 06:15:32,511] ERROR {org.apache.directory.shared.ldap.schema.loader.ldif.LdifSchemaLoader} - ERR_10004 Expecting to find a schema.ldif file in provided baseDirectory path '/home/ubuntu/packs/ga/wso2is-5.4.0_cluster/repository/data/org.wso2.carbon.directory/schema/ou=schema.ldif' but no such file found. TID: [-1234] [] [2017-12-19 06:15:32,512] ERROR {org.wso2.carbon.apacheds.impl.ApacheLDAPServer} - LDAP server initialization failed. 
org.wso2.carbon.ldap.server.exception.DirectoryServerException: Can not start the Default apacheds service at org.wso2.carbon.apacheds.impl.ApacheLDAPServer.initializeDefaultDirectoryService(ApacheLDAPServer.java:165) at org.wso2.carbon.apacheds.impl.ApacheLDAPServer.init(ApacheLDAPServer.java:94) at org.wso2.carbon.ldap.server.DirectoryActivator.startLdapServer(DirectoryActivator.java:192) at org.wso2.carbon.ldap.server.DirectoryActivator.start(DirectoryActivator.java:78) at org.eclipse.osgi.framework.internal.core.BundleContextImpl$1.run(BundleContextImpl.java:711) at java.security.AccessController.doPrivileged(Native Method) at org.eclipse.osgi.framework.internal.core.BundleContextImpl.startActivator(BundleContextImpl.java:702) at org.eclipse.osgi.framework.internal.core.BundleContextImpl.start(BundleContextImpl.java:683) at org.eclipse.osgi.framework.internal.core.BundleHost.startWorker(BundleHost.java:381) at org.eclipse.osgi.framework.internal.core.AbstractBundle.resume(AbstractBundle.java:390) at org.eclipse.osgi.framework.internal.core.Framework.resumeBundle(Framework.java:1176) at org.eclipse.osgi.framework.internal.core.StartLevelManager.resumeBundles(StartLevelManager.java:559) at org.eclipse.osgi.framework.internal.core.StartLevelManager.resumeBundles(StartLevelManager.java:544) at org.eclipse.osgi.framework.internal.core.StartLevelManager.incFWSL(StartLevelManager.java:457) at org.eclipse.osgi.framework.internal.core.StartLevelManager.doSetStartLevel(StartLevelManager.java:243) at org.eclipse.osgi.framework.internal.core.StartLevelManager.dispatchEvent(StartLevelManager.java:438) at org.eclipse.osgi.framework.internal.core.StartLevelManager.dispatchEvent(StartLevelManager.java:1) at org.eclipse.osgi.framework.eventmgr.EventManager.dispatchEvent(EventManager.java:230) at org.eclipse.osgi.framework.eventmgr.EventManager$EventThread.run(EventManager.java:340) Caused by: java.io.FileNotFoundException: ERR_10004 Expecting to find a schema.ldif file in 
provided baseDirectory path '/home/ubuntu/packs/ga/wso2is-5.4.0_cluster/repository/data/org.wso2.carbon.directory/schema/ou=schema.ldif' but no such file found. at org.apache.directory.shared.ldap.schema.loader.ldif.LdifSchemaLoader.<init>(LdifSchemaLoader.java:113) at org.wso2.carbon.apacheds.impl.CarbonDirectoryServiceFactory.initSchema(CarbonDirectoryServiceFactory.java:163) at org.wso2.carbon.apacheds.impl.CarbonDirectoryServiceFactory.build(CarbonDirectoryServiceFactory.java:217) at org.wso2.carbon.apacheds.impl.CarbonDirectoryServiceFactory.init(CarbonDirectoryServiceFactory.java:119) at org.wso2.carbon.apacheds.impl.ApacheLDAPServer.initializeDefaultDirectoryService(ApacheLDAPServer.java:162) ... 18 more TID: [-1234] [] [2017-12-19 06:15:32,516] ERROR {org.wso2.carbon.ldap.server.DirectoryActivator} - Could not start the embedded-ldap. org.wso2.carbon.ldap.server.exception.DirectoryServerException: Error initializing ApacheLDAPServer. at org.wso2.carbon.apacheds.impl.ApacheLDAPServer.init(ApacheLDAPServer.java:102) at org.wso2.carbon.ldap.server.DirectoryActivator.startLdapServer(DirectoryActivator.java:192) at org.wso2.carbon.ldap.server.DirectoryActivator.start(DirectoryActivator.java:78) at org.eclipse.osgi.framework.internal.core.BundleContextImpl$1.run(BundleContextImpl.java:711) at java.security.AccessController.doPrivileged(Native Method) at org.eclipse.osgi.framework.internal.core.BundleContextImpl.startActivator(BundleContextImpl.java:702) at org.eclipse.osgi.framework.internal.core.BundleContextImpl.start(BundleContextImpl.java:683) at org.eclipse.osgi.framework.internal.core.BundleHost.startWorker(BundleHost.java:381) at org.eclipse.osgi.framework.internal.core.AbstractBundle.resume(AbstractBundle.java:390) at org.eclipse.osgi.framework.internal.core.Framework.resumeBundle(Framework.java:1176) at org.eclipse.osgi.framework.internal.core.StartLevelManager.resumeBundles(StartLevelManager.java:559) at 
org.eclipse.osgi.framework.internal.core.StartLevelManager.resumeBundles(StartLevelManager.java:544) at org.eclipse.osgi.framework.internal.core.StartLevelManager.incFWSL(StartLevelManager.java:457) at org.eclipse.osgi.framework.internal.core.StartLevelManager.doSetStartLevel(StartLevelManager.java:243) at org.eclipse.osgi.framework.internal.core.StartLevelManager.dispatchEvent(StartLevelManager.java:438) at org.eclipse.osgi.framework.internal.core.StartLevelManager.dispatchEvent(StartLevelManager.java:1) at org.eclipse.osgi.framework.eventmgr.EventManager.dispatchEvent(EventManager.java:230) at org.eclipse.osgi.framework.eventmgr.EventManager$EventThread.run(EventManager.java:340) Caused by: org.wso2.carbon.ldap.server.exception.DirectoryServerException: Can not start the Default apacheds service at org.wso2.carbon.apacheds.impl.ApacheLDAPServer.initializeDefaultDirectoryService(ApacheLDAPServer.java:165) at org.wso2.carbon.apacheds.impl.ApacheLDAPServer.init(ApacheLDAPServer.java:94) ... 17 more Caused by: java.io.FileNotFoundException: ERR_10004 Expecting to find a schema.ldif file in provided baseDirectory path '/home/ubuntu/packs/ga/wso2is-5.4.0_cluster/repository/data/org.wso2.carbon.directory/schema/ou=schema.ldif' but no such file found. at org.apache.directory.shared.ldap.schema.loader.ldif.LdifSchemaLoader.<init>(LdifSchemaLoader.java:113) at org.wso2.carbon.apacheds.impl.CarbonDirectoryServiceFactory.initSchema(CarbonDirectoryServiceFactory.java:163) at org.wso2.carbon.apacheds.impl.CarbonDirectoryServiceFactory.build(CarbonDirectoryServiceFactory.java:217) at org.wso2.carbon.apacheds.impl.CarbonDirectoryServiceFactory.init(CarbonDirectoryServiceFactory.java:119) at org.wso2.carbon.apacheds.impl.ApacheLDAPServer.initializeDefaultDirectoryService(ApacheLDAPServer.java:162) ... 
18 more TID: [-1] [] [2017-12-19 06:15:32,552] INFO {org.wso2.carbon.mex.internal.Office365SupportMexComponent} - Office365Support MexServiceComponent bundle activated successfully.. TID: [-1] [] [2017-12-19 06:15:32,579] INFO {org.wso2.carbon.mex2.internal.DynamicCRMCustomMexComponent} - DynamicCRMSupport MexServiceComponent bundle activated successfully. `
non_usab
during the graceful restart ldap error is printed eventhough the user store is jdbc during the graceful restart ldap error is printed eventhough the user store is jdbc in node cluster while the script is run to login with threads loop and s rampup when i click on the graceful restart button in management console the node that was actively serving was shutting down and restarting but the below exception is printed in the backend user store is jdbc mysql the below error is misleading and cannot understand how it is printed tid info org carbon ldap server configuration ldapconfigurationbuilder kdc server is disabled tid error org apache directory shared ldap schema loader ldif ldifschemaloader err expecting to find a schema ldif file in provided basedirectory path home ubuntu packs ga cluster repository data org carbon directory schema ou schema ldif but no such file found tid error org apache directory shared ldap schema loader ldif ldifschemaloader err expecting to find a schema ldif file in provided basedirectory path home ubuntu packs ga cluster repository data org carbon directory schema ou schema ldif but no such file found tid error org carbon apacheds impl apacheldapserver ldap server initialization failed org carbon ldap server exception directoryserverexception can not start the default apacheds service at org carbon apacheds impl apacheldapserver initializedefaultdirectoryservice apacheldapserver java at org carbon apacheds impl apacheldapserver init apacheldapserver java at org carbon ldap server directoryactivator startldapserver directoryactivator java at org carbon ldap server directoryactivator start directoryactivator java at org eclipse osgi framework internal core bundlecontextimpl run bundlecontextimpl java at java security accesscontroller doprivileged native method at org eclipse osgi framework internal core bundlecontextimpl startactivator bundlecontextimpl java at org eclipse osgi framework internal core bundlecontextimpl start 
bundlecontextimpl java at org eclipse osgi framework internal core bundlehost startworker bundlehost java at org eclipse osgi framework internal core abstractbundle resume abstractbundle java at org eclipse osgi framework internal core framework resumebundle framework java at org eclipse osgi framework internal core startlevelmanager resumebundles startlevelmanager java at org eclipse osgi framework internal core startlevelmanager resumebundles startlevelmanager java at org eclipse osgi framework internal core startlevelmanager incfwsl startlevelmanager java at org eclipse osgi framework internal core startlevelmanager dosetstartlevel startlevelmanager java at org eclipse osgi framework internal core startlevelmanager dispatchevent startlevelmanager java at org eclipse osgi framework internal core startlevelmanager dispatchevent startlevelmanager java at org eclipse osgi framework eventmgr eventmanager dispatchevent eventmanager java at org eclipse osgi framework eventmgr eventmanager eventthread run eventmanager java caused by java io filenotfoundexception err expecting to find a schema ldif file in provided basedirectory path home ubuntu packs ga cluster repository data org carbon directory schema ou schema ldif but no such file found at org apache directory shared ldap schema loader ldif ldifschemaloader ldifschemaloader java at org carbon apacheds impl carbondirectoryservicefactory initschema carbondirectoryservicefactory java at org carbon apacheds impl carbondirectoryservicefactory build carbondirectoryservicefactory java at org carbon apacheds impl carbondirectoryservicefactory init carbondirectoryservicefactory java at org carbon apacheds impl apacheldapserver initializedefaultdirectoryservice apacheldapserver java more tid error org carbon ldap server directoryactivator could not start the embedded ldap org carbon ldap server exception directoryserverexception error initializing apacheldapserver at org carbon apacheds impl apacheldapserver init 
apacheldapserver java at org carbon ldap server directoryactivator startldapserver directoryactivator java at org carbon ldap server directoryactivator start directoryactivator java at org eclipse osgi framework internal core bundlecontextimpl run bundlecontextimpl java at java security accesscontroller doprivileged native method at org eclipse osgi framework internal core bundlecontextimpl startactivator bundlecontextimpl java at org eclipse osgi framework internal core bundlecontextimpl start bundlecontextimpl java at org eclipse osgi framework internal core bundlehost startworker bundlehost java at org eclipse osgi framework internal core abstractbundle resume abstractbundle java at org eclipse osgi framework internal core framework resumebundle framework java at org eclipse osgi framework internal core startlevelmanager resumebundles startlevelmanager java at org eclipse osgi framework internal core startlevelmanager resumebundles startlevelmanager java at org eclipse osgi framework internal core startlevelmanager incfwsl startlevelmanager java at org eclipse osgi framework internal core startlevelmanager dosetstartlevel startlevelmanager java at org eclipse osgi framework internal core startlevelmanager dispatchevent startlevelmanager java at org eclipse osgi framework internal core startlevelmanager dispatchevent startlevelmanager java at org eclipse osgi framework eventmgr eventmanager dispatchevent eventmanager java at org eclipse osgi framework eventmgr eventmanager eventthread run eventmanager java caused by org carbon ldap server exception directoryserverexception can not start the default apacheds service at org carbon apacheds impl apacheldapserver initializedefaultdirectoryservice apacheldapserver java at org carbon apacheds impl apacheldapserver init apacheldapserver java more caused by java io filenotfoundexception err expecting to find a schema ldif file in provided basedirectory path home ubuntu packs ga cluster repository data org carbon 
directory schema ou schema ldif but no such file found at org apache directory shared ldap schema loader ldif ldifschemaloader ldifschemaloader java at org carbon apacheds impl carbondirectoryservicefactory initschema carbondirectoryservicefactory java at org carbon apacheds impl carbondirectoryservicefactory build carbondirectoryservicefactory java at org carbon apacheds impl carbondirectoryservicefactory init carbondirectoryservicefactory java at org carbon apacheds impl apacheldapserver initializedefaultdirectoryservice apacheldapserver java more tid info org carbon mex internal mexservicecomponent bundle activated successfully tid info org carbon internal dynamiccrmcustommexcomponent dynamiccrmsupport mexservicecomponent bundle activated successfully
0
31,712
7,434,858,206
IssuesEvent
2018-03-26 12:32:47
zeebe-io/zeebe
https://api.github.com/repos/zeebe-io/zeebe
closed
SegFault in MembershipDecoder
bug code gossip
In executing GossipClusteringTests repeatedly I saw some seg faults like the following: ```java Stack: [0x00007f482cfe3000,0x00007f482d0e4000], sp=0x00007f482d0e2290, free space=1020k Native frames: (J=compiled Java code, j=interpreted, Vv=VM code, C=native code) J 2582 C1 org.agrona.concurrent.UnsafeBuffer.getByte(I)B (29 bytes) @ 0x00007f486976f97a [0x00007f486976f8c0+0xba] J 2649 C1 io.zeebe.clustering.gossip.GossipEventDecoder$MembershipEventsDecoder.eventType()Lio/zeebe/clustering/gossip/MembershipEventType; (24 bytes) @ 0x00007f486979ed0c [0x00007f486979ec00+0x10c] j io.zeebe.gossip.protocol.GossipEvent.wrap(Lorg/agrona/DirectBuffer;II)V+142 j io.zeebe.gossip.protocol.GossipRequestHandler.onRequest(Lio/zeebe/transport/ServerOutput;Lio/zeebe/transport/RemoteAddress;Lorg/agrona/DirectBuffer;IIJ)Z+50 j io.zeebe.transport.impl.ServerReceiveHandler.onFragment(Lorg/agrona/DirectBuffer;IIIZ)I+125 j io.zeebe.dispatcher.Subscription.pollFragments(Lio/zeebe/dispatcher/impl/log/LogBufferPartition;Lio/zeebe/dispatcher/FragmentHandler;IIIJZ)I+134 J 2215 C2 io.zeebe.gossip.failuredetection.SubscriptionController$PollSubscriptionState.doWork(Lio/zeebe/util/state/StateMachineContext;)I (9 bytes) @ 0x00007f48696907e0 [0x00007f4869690680+0x160] J 1441 C2 io.zeebe.gossip.Gossip.doWork()I (74 bytes) @ 0x00007f4869478dfc [0x00007f4869478cc0+0x13c] J 2717 C2 io.zeebe.util.actor.ActorRunner.doWork()I (183 bytes) @ 0x00007f48697df324 [0x00007f48697df0c0+0x264] J 1254% C2 io.zeebe.util.actor.ActorRunner.doWorkUntilClose()V (35 bytes) @ 0x00007f48693f5b34 [0x00007f48693f5a80+0xb4] j io.zeebe.util.actor.ActorRunner$$Lambda$7.run()V+4 j io.zeebe.util.LogUtil.doWithMDC(Ljava/util/Map;Ljava/lang/Runnable;)V+9 j io.zeebe.util.actor.ActorRunner.run()V+10 j java.util.concurrent.ThreadPoolExecutor.runWorker(Ljava/util/concurrent/ThreadPoolExecutor$Worker;)V+95 j java.util.concurrent.ThreadPoolExecutor$Worker.run()V+5 j java.lang.Thread.run()V+11 v ~StubRoutines::call_stub V 
[libjvm.so+0x67d0db] V [libjvm.so+0x67a79b] V [libjvm.so+0x67ad87] V [libjvm.so+0x6bd70b] V [libjvm.so+0xa415c3] V [libjvm.so+0xa41928] V [libjvm.so+0x8de152] C [libpthread.so.0+0x708c] start_thread+0xdc ```
1.0
SegFault in MembershipDecoder - In executing GossipClusteringTests repeatedly I saw some seg faults like the following: ```java Stack: [0x00007f482cfe3000,0x00007f482d0e4000], sp=0x00007f482d0e2290, free space=1020k Native frames: (J=compiled Java code, j=interpreted, Vv=VM code, C=native code) J 2582 C1 org.agrona.concurrent.UnsafeBuffer.getByte(I)B (29 bytes) @ 0x00007f486976f97a [0x00007f486976f8c0+0xba] J 2649 C1 io.zeebe.clustering.gossip.GossipEventDecoder$MembershipEventsDecoder.eventType()Lio/zeebe/clustering/gossip/MembershipEventType; (24 bytes) @ 0x00007f486979ed0c [0x00007f486979ec00+0x10c] j io.zeebe.gossip.protocol.GossipEvent.wrap(Lorg/agrona/DirectBuffer;II)V+142 j io.zeebe.gossip.protocol.GossipRequestHandler.onRequest(Lio/zeebe/transport/ServerOutput;Lio/zeebe/transport/RemoteAddress;Lorg/agrona/DirectBuffer;IIJ)Z+50 j io.zeebe.transport.impl.ServerReceiveHandler.onFragment(Lorg/agrona/DirectBuffer;IIIZ)I+125 j io.zeebe.dispatcher.Subscription.pollFragments(Lio/zeebe/dispatcher/impl/log/LogBufferPartition;Lio/zeebe/dispatcher/FragmentHandler;IIIJZ)I+134 J 2215 C2 io.zeebe.gossip.failuredetection.SubscriptionController$PollSubscriptionState.doWork(Lio/zeebe/util/state/StateMachineContext;)I (9 bytes) @ 0x00007f48696907e0 [0x00007f4869690680+0x160] J 1441 C2 io.zeebe.gossip.Gossip.doWork()I (74 bytes) @ 0x00007f4869478dfc [0x00007f4869478cc0+0x13c] J 2717 C2 io.zeebe.util.actor.ActorRunner.doWork()I (183 bytes) @ 0x00007f48697df324 [0x00007f48697df0c0+0x264] J 1254% C2 io.zeebe.util.actor.ActorRunner.doWorkUntilClose()V (35 bytes) @ 0x00007f48693f5b34 [0x00007f48693f5a80+0xb4] j io.zeebe.util.actor.ActorRunner$$Lambda$7.run()V+4 j io.zeebe.util.LogUtil.doWithMDC(Ljava/util/Map;Ljava/lang/Runnable;)V+9 j io.zeebe.util.actor.ActorRunner.run()V+10 j java.util.concurrent.ThreadPoolExecutor.runWorker(Ljava/util/concurrent/ThreadPoolExecutor$Worker;)V+95 j java.util.concurrent.ThreadPoolExecutor$Worker.run()V+5 j java.lang.Thread.run()V+11 v 
~StubRoutines::call_stub V [libjvm.so+0x67d0db] V [libjvm.so+0x67a79b] V [libjvm.so+0x67ad87] V [libjvm.so+0x6bd70b] V [libjvm.so+0xa415c3] V [libjvm.so+0xa41928] V [libjvm.so+0x8de152] C [libpthread.so.0+0x708c] start_thread+0xdc ```
non_usab
segfault in membershipdecoder in executing gossipclusteringtests repeatedly i saw some seg faults like the following java stack sp free space native frames j compiled java code j interpreted vv vm code c native code j org agrona concurrent unsafebuffer getbyte i b bytes j io zeebe clustering gossip gossipeventdecoder membershipeventsdecoder eventtype lio zeebe clustering gossip membershipeventtype bytes j io zeebe gossip protocol gossipevent wrap lorg agrona directbuffer ii v j io zeebe gossip protocol gossiprequesthandler onrequest lio zeebe transport serveroutput lio zeebe transport remoteaddress lorg agrona directbuffer iij z j io zeebe transport impl serverreceivehandler onfragment lorg agrona directbuffer iiiz i j io zeebe dispatcher subscription pollfragments lio zeebe dispatcher impl log logbufferpartition lio zeebe dispatcher fragmenthandler iiijz i j io zeebe gossip failuredetection subscriptioncontroller pollsubscriptionstate dowork lio zeebe util state statemachinecontext i bytes j io zeebe gossip gossip dowork i bytes j io zeebe util actor actorrunner dowork i bytes j io zeebe util actor actorrunner doworkuntilclose v bytes j io zeebe util actor actorrunner lambda run v j io zeebe util logutil dowithmdc ljava util map ljava lang runnable v j io zeebe util actor actorrunner run v j java util concurrent threadpoolexecutor runworker ljava util concurrent threadpoolexecutor worker v j java util concurrent threadpoolexecutor worker run v j java lang thread run v v stubroutines call stub v v v v v v v c start thread
0
2,818
3,200,976,748
IssuesEvent
2015-10-02 01:28:13
piwik/piwik
https://api.github.com/repos/piwik/piwik
closed
remove extra line break and indentation in Admin UI
Bug c: Usability
See attached files, the text should not wrap. 1) Settings > Geolocate, the text wraps on next line after this radio: ![extra_wrapping_on_geolocate_button](https://cloud.githubusercontent.com/assets/466765/4605529/cc51fc70-51e7-11e4-9e7f-40015f5c3f1c.png) 2) In Scheduled report, scheduled to mobile phone, when having defined 2+ phone numbers, they appear as wrapped: ![wrapping](https://cloud.githubusercontent.com/assets/466765/4605530/e398a64a-51e7-11e4-9587-c15620ba543c.png)
True
remove extra line break and indentation in Admin UI - See attached files, the text should not wrap. 1) Settings > Geolocate, the text wraps on next line after this radio: ![extra_wrapping_on_geolocate_button](https://cloud.githubusercontent.com/assets/466765/4605529/cc51fc70-51e7-11e4-9e7f-40015f5c3f1c.png) 2) In Scheduled report, scheduled to mobile phone, when having defined 2+ phone numbers, they appear as wrapped: ![wrapping](https://cloud.githubusercontent.com/assets/466765/4605530/e398a64a-51e7-11e4-9587-c15620ba543c.png)
usab
remove extra line break and indentation in admin ui see attached files the text should not wrap settings geolocate the text wraps on next line after this radio in scheduled report scheduled to mobile phone when having defined phone numbers they appear as wrapped
1
5,299
3,917,126,333
IssuesEvent
2016-04-21 06:44:05
kolliSuman/issues
https://api.github.com/repos/kolliSuman/issues
closed
QA_Counters_Back to experiment_smk
Category: Usability Developed By: VLEAD Release Number: Production Severity: S2 Status: Open
Defect Description : In "Counters " experiment,there is no back to experiments link instead back to experiments link should be there inorder to view the list of experiments Actual Result : In "Counters " experiment,the back to experiments link is not present Environment : OS: Windows 7, Linux Browsers: Firefox,Chrome Bandwidth : 100Mbps Hardware Configuration:8GBRAM , Processor:i5 Test Step Link: https://github.com/Virtual-Labs/digital-logic-design-iiith/blob/master/test-cases/integration_test-cases/Counters/Counters_28_Back%20to%20experiment_smk.org
True
QA_Counters_Back to experiment_smk - Defect Description : In "Counters " experiment,there is no back to experiments link instead back to experiments link should be there inorder to view the list of experiments Actual Result : In "Counters " experiment,the back to experiments link is not present Environment : OS: Windows 7, Linux Browsers: Firefox,Chrome Bandwidth : 100Mbps Hardware Configuration:8GBRAM , Processor:i5 Test Step Link: https://github.com/Virtual-Labs/digital-logic-design-iiith/blob/master/test-cases/integration_test-cases/Counters/Counters_28_Back%20to%20experiment_smk.org
usab
qa counters back to experiment smk defect description in counters experiment there is no back to experiments link instead back to experiments link should be there inorder to view the list of experiments actual result in counters experiment the back to experiments link is not present environment os windows linux browsers firefox chrome bandwidth hardware configuration processor test step link
1
1,892
6,894,596,379
IssuesEvent
2017-11-23 10:35:21
openshiftio/openshift.io
https://api.github.com/repos/openshiftio/openshift.io
closed
fabric8.io pipelines stability
area/architecture area/pipelines kind/bug SEV1-urgent team/build-cd team/service-delivery
The build pipelines, running on fabric8.io, go down too often and do not have enough support to keep them running for devs around the world.
1.0
fabric8.io pipelines stability - The build pipelines, running on fabric8.io, go down too often and do not have enough support to keep them running for devs around the world.
non_usab
io pipelines stability the build pipelines running on io go down too often and do not have enough support to keep them running for devs around the world
0
25,982
26,222,413,489
IssuesEvent
2023-01-04 15:49:14
solo-io/gloo
https://api.github.com/repos/solo-io/gloo
closed
Feat: Expose envoy's`config.filter.accesslog.v3.AccessLogFilter`
Type: Enhancement Area: Usability Good First Issue
### Use-case I want to enable Access Logs for tracking but the number of logs is overwhelming. I'd like to use `config.filter.accesslog.v3.RuntimeFilter` to log just a percentage of the logs https://www.envoyproxy.io/docs/envoy/latest/api-v3/config/accesslog/v3/accesslog.proto#envoy-v3-api-msg-config-accesslog-v3-runtimefilter ### Feature request Expose envoy's `config.filter.accesslog.v3.RuntimeFilter` in Gloo's Gateway CRD Something like this: ```diff apiVersion: gateway.solo.io/v1 kind: Gateway metadata: annotations: origin: default name: gateway namespace: gloo-system spec: bindAddress: '::' bindPort: 8080 proxyNames: - gateway-proxy httpGateway: {} useProxyProto: false options: accessLoggingService: accessLog: - fileSink: path: /dev/stdout + filter: + runtime_filter: + runtime_key: somekey + percent_sampled: + numerator: "..." + denominator: "..." + use_independent_randomness": true stringFormat: "" ```
True
Feat: Expose envoy's`config.filter.accesslog.v3.AccessLogFilter` - ### Use-case I want to enable Access Logs for tracking but the number of logs is overwhelming. I'd like to use `config.filter.accesslog.v3.RuntimeFilter` to log just a percentage of the logs https://www.envoyproxy.io/docs/envoy/latest/api-v3/config/accesslog/v3/accesslog.proto#envoy-v3-api-msg-config-accesslog-v3-runtimefilter ### Feature request Expose envoy's `config.filter.accesslog.v3.RuntimeFilter` in Gloo's Gateway CRD Something like this: ```diff apiVersion: gateway.solo.io/v1 kind: Gateway metadata: annotations: origin: default name: gateway namespace: gloo-system spec: bindAddress: '::' bindPort: 8080 proxyNames: - gateway-proxy httpGateway: {} useProxyProto: false options: accessLoggingService: accessLog: - fileSink: path: /dev/stdout + filter: + runtime_filter: + runtime_key: somekey + percent_sampled: + numerator: "..." + denominator: "..." + use_independent_randomness": true stringFormat: "" ```
usab
feat expose envoy s config filter accesslog accesslogfilter use case i want to enable access logs for tracking but the number of logs is overwhelming i d like to use config filter accesslog runtimefilter to log just a percentage of the logs feature request expose envoy s config filter accesslog runtimefilter in gloo s gateway crd something like this diff apiversion gateway solo io kind gateway metadata annotations origin default name gateway namespace gloo system spec bindaddress bindport proxynames gateway proxy httpgateway useproxyproto false options accessloggingservice accesslog filesink path dev stdout filter runtime filter runtime key somekey percent sampled numerator denominator use independent randomness true stringformat
1
9,339
6,229,557,024
IssuesEvent
2017-07-11 04:31:51
godotengine/godot
https://api.github.com/repos/godotengine/godot
closed
Using Freelook and Zoom at the same time changes zoom max and min values
discussion topic:editor usability
**Operating system or device - Godot version:** Linux Mint 18.1 - Godot 3.0 custom build **Issue description:** <!-- What happened, and what was expected. --> When you hold down the RMB and Scroll, the extent that you can zoom changes. This also changes the speed at which you can zoom. **Steps to reproduce:** 1) Go to a 3d scene with something to look at as a reference point 2) hold down the right mouse button and scroll until you are to the minimum value for zoom the very helpful scroll bar in the scene view should not be filled 3) let go of the right mouse button 4) keep using the scroll wheel
True
Using Freelook and Zoom at the same time changes zoom max and min values - **Operating system or device - Godot version:** Linux Mint 18.1 - Godot 3.0 custom build **Issue description:** <!-- What happened, and what was expected. --> When you hold down the RMB and Scroll, the extent that you can zoom changes. This also changes the speed at which you can zoom. **Steps to reproduce:** 1) Go to a 3d scene with something to look at as a reference point 2) hold down the right mouse button and scroll until you are to the minimum value for zoom the very helpful scroll bar in the scene view should not be filled 3) let go of the right mouse button 4) keep using the scroll wheel
usab
using freelook and zoom at the same time changes zoom max and min values operating system or device godot version linux mint godot custom build issue description when you hold down the rmb and scroll the extent that you can zoom changes this also changes the speed at which you can zoom steps to reproduce go to a scene with something to look at as a reference point hold down the right mouse button and scroll until you are to the minimum value for zoom the very helpful scroll bar in the scene view should not be filled let go of the right mouse button keep using the scroll wheel
1
4,491
3,870,171,168
IssuesEvent
2016-04-11 01:01:43
lionheart/openradar-mirror
https://api.github.com/repos/lionheart/openradar-mirror
opened
23012666: Add icon option to UITableViewRowAction
classification:ui/usability reproducible:always status:open
#### Description Mail includes icons for itโ€™s actions. However, the equivalent UI in UIKit, UITableViewRowAction only has an option for a title, not an icon. In many cases, an icon could add to the context of the title, or even replace it. - Product Version: 9.1.0 Created: 2015-10-07T17:21:23.121050 Originated: 2015-10-07T10:21:00 Open Radar Link: http://www.openradar.me/23012666
True
23012666: Add icon option to UITableViewRowAction - #### Description Mail includes icons for itโ€™s actions. However, the equivalent UI in UIKit, UITableViewRowAction only has an option for a title, not an icon. In many cases, an icon could add to the context of the title, or even replace it. - Product Version: 9.1.0 Created: 2015-10-07T17:21:23.121050 Originated: 2015-10-07T10:21:00 Open Radar Link: http://www.openradar.me/23012666
usab
add icon option to uitableviewrowaction description mail includes icons for itโ€™s actions however the equivalent ui in uikit uitableviewrowaction only has an option for a title not an icon in many cases an icon could add to the context of the title or even replace it product version created originated open radar link
1
745,304
25,979,173,452
IssuesEvent
2022-12-19 17:12:55
dotnet/aspnetcore
https://api.github.com/repos/dotnet/aspnetcore
closed
Document how to configure HttpClient base address in Blazor Server using IHttpClientFactory
Docs area-blazor affected-few severity-major Priority:1 blazor-server
I am trying to configure `HttpClient`'s base address in a Blazor Server using `IHttpClientFactory` but I am getting a runtime exception: services.AddHttpClient("ApiClient", (provider, client) => { var uriHelper = provider.GetRequiredService<NavigationManager>(); client.BaseAddress = new Uri(uriHelper.BaseUri); }); Create an instance of this HttpClient using IHttpClientFactory and the following exception will be thrown: `_httpClient = httpClientFactory.CreateClient("ApiClient");` `System.InvalidOperationException: 'Cannot resolve scoped service 'Microsoft.AspNetCore.Components.NavigationManager' from root provider.'` ![image](https://user-images.githubusercontent.com/549272/92745245-dd2d6e80-f38a-11ea-8f6e-e0cb6d787506.png) ``` NET SDK (reflecting any global.json): Version: 5.0.100-preview.8.20417.9 Commit: fc62663a35 Runtime Environment: OS Name: Windows OS Version: 10.0.18363 OS Platform: Windows RID: win10-x64 Base Path: C:\Program Files\dotnet\sdk\5.0.100-preview.8.20417.9\ Host (useful for support): Version: 5.0.0-preview.8.20407.11 Commit: bf456654f9 .NET SDKs installed: 3.1.401 [C:\Program Files\dotnet\sdk] 5.0.100-preview.8.20417.9 [C:\Program Files\dotnet\sdk] .NET runtimes installed: Microsoft.AspNetCore.All 2.1.21 [C:\Program Files\dotnet\shared\Microsoft.AspNetCore.All] Microsoft.AspNetCore.App 2.1.21 [C:\Program Files\dotnet\shared\Microsoft.AspNetCore.App] Microsoft.AspNetCore.App 3.1.7 [C:\Program Files\dotnet\shared\Microsoft.AspNetCore.App] Microsoft.AspNetCore.App 5.0.0-preview.8.20414.8 [C:\Program Files\dotnet\shared\Microsoft.AspNetCore.App] Microsoft.NETCore.App 2.1.21 [C:\Program Files\dotnet\shared\Microsoft.NETCore.App] Microsoft.NETCore.App 3.1.7 [C:\Program Files\dotnet\shared\Microsoft.NETCore.App] Microsoft.NETCore.App 5.0.0-preview.8.20407.11 [C:\Program Files\dotnet\shared\Microsoft.NETCore.App] Microsoft.WindowsDesktop.App 3.1.7 [C:\Program Files\dotnet\shared\Microsoft.WindowsDesktop.App] Microsoft.WindowsDesktop.App 5.0.0-preview.8.20411.6 [C:\Program Files\dotnet\shared\Microsoft.WindowsDesktop.App] To install additional .NET runtimes or SDKs: https://aka.ms/dotnet-download ``` Anyone knows what might be the issue here?
1.0
Document how to configure HttpClient base address in Blazor Server using IHttpClientFactory - I am trying to configure `HttpClient`'s base address in a Blazor Server using `IHttpClientFactory` but I am getting a runtime exception: services.AddHttpClient("ApiClient", (provider, client) => { var uriHelper = provider.GetRequiredService<NavigationManager>(); client.BaseAddress = new Uri(uriHelper.BaseUri); }); Create an instance of this HttpClient using IHttpClientFactory and the following exception will be thrown: `_httpClient = httpClientFactory.CreateClient("ApiClient");` `System.InvalidOperationException: 'Cannot resolve scoped service 'Microsoft.AspNetCore.Components.NavigationManager' from root provider.'` ![image](https://user-images.githubusercontent.com/549272/92745245-dd2d6e80-f38a-11ea-8f6e-e0cb6d787506.png) ``` NET SDK (reflecting any global.json): Version: 5.0.100-preview.8.20417.9 Commit: fc62663a35 Runtime Environment: OS Name: Windows OS Version: 10.0.18363 OS Platform: Windows RID: win10-x64 Base Path: C:\Program Files\dotnet\sdk\5.0.100-preview.8.20417.9\ Host (useful for support): Version: 5.0.0-preview.8.20407.11 Commit: bf456654f9 .NET SDKs installed: 3.1.401 [C:\Program Files\dotnet\sdk] 5.0.100-preview.8.20417.9 [C:\Program Files\dotnet\sdk] .NET runtimes installed: Microsoft.AspNetCore.All 2.1.21 [C:\Program Files\dotnet\shared\Microsoft.AspNetCore.All] Microsoft.AspNetCore.App 2.1.21 [C:\Program Files\dotnet\shared\Microsoft.AspNetCore.App] Microsoft.AspNetCore.App 3.1.7 [C:\Program Files\dotnet\shared\Microsoft.AspNetCore.App] Microsoft.AspNetCore.App 5.0.0-preview.8.20414.8 [C:\Program Files\dotnet\shared\Microsoft.AspNetCore.App] Microsoft.NETCore.App 2.1.21 [C:\Program Files\dotnet\shared\Microsoft.NETCore.App] Microsoft.NETCore.App 3.1.7 [C:\Program Files\dotnet\shared\Microsoft.NETCore.App] Microsoft.NETCore.App 5.0.0-preview.8.20407.11 [C:\Program Files\dotnet\shared\Microsoft.NETCore.App] Microsoft.WindowsDesktop.App 3.1.7 [C:\Program Files\dotnet\shared\Microsoft.WindowsDesktop.App] Microsoft.WindowsDesktop.App 5.0.0-preview.8.20411.6 [C:\Program Files\dotnet\shared\Microsoft.WindowsDesktop.App] To install additional .NET runtimes or SDKs: https://aka.ms/dotnet-download ``` Anyone knows what might be the issue here?
non_usab
document how to configure httpclient base address in blazor server using ihttpclientfactory i am trying to configure httpclient s base address in a blazor server using ihttpclientfactory but i am getting a runtime exception services addhttpclient apiclient provider client var urihelper provider getrequiredservice client baseaddress new uri urihelper baseuri create an instance of this httpclient using ihttpclientfactory and the following exception will be thrown httpclient httpclientfactory createclient apiclient system invalidoperationexception cannot resolve scoped service microsoft aspnetcore components navigationmanager from root provider net sdk reflecting any global json version preview commit runtime environment os name windows os version os platform windows rid base path c program files dotnet sdk preview host useful for support version preview commit net sdks installed preview net runtimes installed microsoft aspnetcore all microsoft aspnetcore app microsoft aspnetcore app microsoft aspnetcore app preview microsoft netcore app microsoft netcore app microsoft netcore app preview microsoft windowsdesktop app microsoft windowsdesktop app preview to install additional net runtimes or sdks anyone knows what might be the issue here
0
7,138
16,669,882,611
IssuesEvent
2021-06-07 09:31:17
mbecker12/surface-rl-decoder
https://api.github.com/repos/mbecker12/surface-rl-decoder
closed
Implement Transfer Learning for 3D Conv Networks
infrastructure network architecture q learning
We want to investigate if transfer learning from smaller systems to larger systems is beneficial. For that, implement the functionality to load a pretrained d=5 3D Conv model and apply it to a model capable of decoding at d=7,9,....
1.0
Implement Transfer Learning for 3D Conv Networks - We want to investigate if transfer learning from smaller systems to larger systems is beneficial. For that, implement the functionality to load a pretrained d=5 3D Conv model and apply it to a model capable of decoding at d=7,9,....
non_usab
implement transfer learning for conv networks we want to investigate if transfer learning from smaller systems to larger systems is beneficial for that implement the functionality to load a pretrained d conv model and apply it to a model capable of decoding at d
0
467,546
13,450,357,341
IssuesEvent
2020-09-08 18:23:55
Plant-for-the-Planet/planet-webapp
https://api.github.com/repos/Plant-for-the-Planet/planet-webapp
closed
Load Zoom 2 from URL Parameters
Priority enhancement
Zoom1 -Zoom2; all links take users to project on flyto Zoom2 append` /?p=slug-name` Can include site id as well if it exists; it should be ignored if siteID does not match
1.0
Load Zoom 2 from URL Parameters - Zoom1 -Zoom2; all links take users to project on flyto Zoom2 append` /?p=slug-name` Can include site id as well if it exists; it should be ignored if siteID does not match
non_usab
load zoom from url parameters all links take users to project on flyto append p slug name can include site id as well if it exists it should be ignored if siteid does not match
0
8,774
5,961,505,021
IssuesEvent
2017-05-29 17:43:40
twidi/github-issues-manager
https://api.github.com/repos/twidi/github-issues-manager
closed
Reduce visibility of ws connection alerts
component:usability component:websocket speed:1:easy-picking type:enhancement workflow:2:working
At least when everything is OK. For example nothing if connection happen in the first 5 seconds. Also manage the disconnect / reconnect when we are leaving the page
True
Reduce visibility of ws connection alerts - At least when everything is OK. For example nothing if connection happen in the first 5 seconds. Also manage the disconnect / reconnect when we are leaving the page
usab
reduce visibility of ws connection alerts at least when everything is ok for example nothing if connection happen in the first seconds also manage the disconnect reconnect when we are leaving the page
1
27,272
27,987,192,428
IssuesEvent
2023-03-26 20:27:05
julianmichael/debate
https://api.github.com/repos/julianmichael/debate
closed
Improved judge interface for seeing quotes consolidated
usability
We could give the judge a view of the story which _only_ shows the quotes that have been used during the debate, and gives a sense of their relative positions in the story (with <snip>s, surrogate black rectangles for redacted text, etc. in between the quotes). This would make it easier for the judge to see all of the evidence consolidated together.
True
Improved judge interface for seeing quotes consolidated - We could give the judge a view of the story which _only_ shows the quotes that have been used during the debate, and gives a sense of their relative positions in the story (with <snip>s, surrogate black rectangles for redacted text, etc. in between the quotes). This would make it easier for the judge to see all of the evidence consolidated together.
usab
improved judge interface for seeing quotes consolidated we could give the judge a view of the story which only shows the quotes that have been used during the debate and gives a sense of their relative positions in the story with s surrogate black rectangles for redacted text etc in between the quotes this would make it easier for the judge to see all of the evidence consolidated together
1
11,801
7,464,985,110
IssuesEvent
2018-04-02 00:42:50
pypa/warehouse
https://api.github.com/repos/pypa/warehouse
closed
Improved error message when attempting upload to Warehouse via twine with an invalid trove classifier
APIs/feeds help needed usability
I just now uploaded a new release to PyPI. Initially, I had made a typo in one of my trove classifiers (`Topic :: Testing` instead of `Topic :: Software Development :: Testing`). The error returned by `twine` when I attempted to upload to Warehouse was mostly unhelpful, other than indicating something was wrong with my classifiers: ``` HTTPError: 400 Client Error: ['License :: OSI Approved :: MIT License', 'Natural Language :: English', 'Intended Audience :: Developers', 'Operating System :: OS Independent', 'Programming Language :: Python :: 3 :: Only', 'Programming Language :: Python :: 3.4', 'Programming Languag for url: https://upload.pypi.org/legacy/ ``` The error returned by `twine` when I attempted an upload to `testpypi`, though, pointed me (eventually) to the problematic classifier: ``` HTTPError: 400 Client Error: ['License :: OSI Approved :: MIT License', 'Natural Language :: English', 'Intended Audience :: Developers', 'Operating System :: OS Independent', 'Programming Language :: Python :: 3 :: Only', 'Programming Language :: Python :: 3.4', 'Programming Language :: Python :: 3.5', 'Programming Language :: Python :: 3.6', 'Topic :: Software Development :: Libraries :: Python Modules', 'Topic :: Testing', 'Development Status :: 4 - Beta'] is an invalid value for Classifier. Error: 'Topic :: Testing' is not a valid choice for this field see https://packaging.python.org/specifications/core-metadata for url: https://test.pypi.org/legacy/ ``` A more succinct, targeted error message here would be hugely helpful. FWIW, it almost looks like Warehouse tried to generate the same message as legacy, but it got clipped?
True
Improved error message when attempting upload to Warehouse via twine with an invalid trove classifier - I just now uploaded a new release to PyPI. Initially, I had made a typo in one of my trove classifiers (`Topic :: Testing` instead of `Topic :: Software Development :: Testing`). The error returned by `twine` when I attempted to upload to Warehouse was mostly unhelpful, other than indicating something was wrong with my classifiers: ``` HTTPError: 400 Client Error: ['License :: OSI Approved :: MIT License', 'Natural Language :: English', 'Intended Audience :: Developers', 'Operating System :: OS Independent', 'Programming Language :: Python :: 3 :: Only', 'Programming Language :: Python :: 3.4', 'Programming Languag for url: https://upload.pypi.org/legacy/ ``` The error returned by `twine` when I attempted an upload to `testpypi`, though, pointed me (eventually) to the problematic classifier: ``` HTTPError: 400 Client Error: ['License :: OSI Approved :: MIT License', 'Natural Language :: English', 'Intended Audience :: Developers', 'Operating System :: OS Independent', 'Programming Language :: Python :: 3 :: Only', 'Programming Language :: Python :: 3.4', 'Programming Language :: Python :: 3.5', 'Programming Language :: Python :: 3.6', 'Topic :: Software Development :: Libraries :: Python Modules', 'Topic :: Testing', 'Development Status :: 4 - Beta'] is an invalid value for Classifier. Error: 'Topic :: Testing' is not a valid choice for this field see https://packaging.python.org/specifications/core-metadata for url: https://test.pypi.org/legacy/ ``` A more succinct, targeted error message here would be hugely helpful. FWIW, it almost looks like Warehouse tried to generate the same message as legacy, but it got clipped?
usab
improved error message when attempting upload to warehouse via twine with an invalid trove classifier i just now uploaded a new release to pypi initially i had made a typo in one of my trove classifiers topic testing instead of topic software development testing the error returned by twine when i attempted to upload to warehouse was mostly unhelpful other than indicating something was wrong with my classifiers httperror client error license osi approved mit license natural language english intended audience developers operating system os independent programming language python only programming language python programming languag for url the error returned by twine when i attempted an upload to testpypi though pointed me eventually to the problematic classifier httperror client error is an invalid value for classifier error topic testing is not a valid choice for this field see for url a more succinct targeted error message here would be hugely helpful fwiw it almost looks like warehouse tried to generate the same message as legacy but it got clipped
1
61,883
6,760,998,782
IssuesEvent
2017-10-24 22:59:12
kubernetes/kubernetes
https://api.github.com/repos/kubernetes/kubernetes
closed
[e2e test failure] [sig-storage] [Serial] Volume metrics should create prometheus metrics for volume provisioning and attach/detach
kind/bug kind/e2e-test-failure milestone/needs-approval priority/critical-urgent sig/storage
The test added in https://github.com/kubernetes/kubernetes/pull/52807 has been extremely flaky since it's introduction, and is messing with our e2e test signal. https://k8s-testgrid.appspot.com/release-master-blocking#gke-serial https://k8s-testgrid.appspot.com/release-master-blocking#gci-gce-serial https://k8s-testgrid.appspot.com/release-master-blocking#gce-serial https://k8s-testgrid.appspot.com/release-master-blocking#gci-gke-serial Since the test was only added yesterday would is be okay to revert it for now? cc @gnufied @jeffvance @nikhiljindal cc @kubernetes/sig-storage-test-failures cc @kubernetes/kubernetes-release-managers
1.0
[e2e test failure] [sig-storage] [Serial] Volume metrics should create prometheus metrics for volume provisioning and attach/detach - The test added in https://github.com/kubernetes/kubernetes/pull/52807 has been extremely flaky since it's introduction, and is messing with our e2e test signal. https://k8s-testgrid.appspot.com/release-master-blocking#gke-serial https://k8s-testgrid.appspot.com/release-master-blocking#gci-gce-serial https://k8s-testgrid.appspot.com/release-master-blocking#gce-serial https://k8s-testgrid.appspot.com/release-master-blocking#gci-gke-serial Since the test was only added yesterday would is be okay to revert it for now? cc @gnufied @jeffvance @nikhiljindal cc @kubernetes/sig-storage-test-failures cc @kubernetes/kubernetes-release-managers
non_usab
volume metrics should create prometheus metrics for volume provisioning and attach detach the test added in has been extremely flaky since it s introduction and is messing with our test signal since the test was only added yesterday would is be okay to revert it for now cc gnufied jeffvance nikhiljindal cc kubernetes sig storage test failures cc kubernetes kubernetes release managers
0
4,303
3,810,194,665
IssuesEvent
2016-03-26 00:19:41
sandstorm-io/sandstorm
https://api.github.com/repos/sandstorm-io/sandstorm
closed
auto sign-out
usability
https://youtu.be/OH7iaiYLeeU ![image](https://cloud.githubusercontent.com/assets/1584904/9538765/a89837a2-4cfc-11e5-87a3-469f83160f91.png) If I sign out while a grain is open, then sign back in immediately, I get immediately signed out.
True
auto sign-out - https://youtu.be/OH7iaiYLeeU ![image](https://cloud.githubusercontent.com/assets/1584904/9538765/a89837a2-4cfc-11e5-87a3-469f83160f91.png) If I sign out while a grain is open, then sign back in immediately, I get immediately signed out.
usab
auto sign out if i sign out while a grain is open then sign back in immediately i get immediately signed out
1
75,992
9,912,865,144
IssuesEvent
2019-06-28 10:05:03
bjorn/tiled
https://api.github.com/repos/bjorn/tiled
closed
Add the Castle Game Engine to frameworks list that support tmx format
documentation
Add please a Castle Game Engine (Object Pascal language) to frameworks list. https://castle-engine.io
1.0
Add the Castle Game Engine to frameworks list that support tmx format - Add please a Castle Game Engine (Object Pascal language) to frameworks list. https://castle-engine.io
non_usab
add the castle game engine to frameworks list that support tmx format add please a castle game engine object pascal language to frameworks list
0
27,101
27,657,346,918
IssuesEvent
2023-03-12 04:57:53
lgarron/first-world
https://api.github.com/repos/lgarron/first-world
reopened
PROMISE Pegasus32 R8 causes ridiculous system-wide freezes on macOS
Type: Bug Reason: Usability Level: Severe Workaround: None Hardware: Promise Pegasus32 R8
(Using the PROMISE Pegasus2 RAID array on macOS 12 requires a kernel extension. Those are deprecated, and for good reason.) EDIT: The PROMISE Pegasus32 RAID array uses a dext. This is better, but I still consider it borderline unacceptable for a drive that advertises itself as Thunderbolt 3 and USB 3.2 compatible. I don't want a custom high-touch racecar, I want a reliable station wagon full of tapes. When I have my RAID array plugged in, some file system operation will freeze for *minutes* at a time and seemingly halt every program that is accessing any disk on the file system. Even if that program is only accessing files on the internal SSD. So basically everything except the cursor (and some occasional UI elements) freeze up, for minutes at a time. If I'm lucky, it unfreezes in 1 or maybe 10 minutes. Otherwise, I have to pull the cable manually (and sometimes hard reboot the RAID array itself). I also have to guess whether the cause was purely the RAID array or possibly related to something else, like our corporate spyware or other bad macOS behaviour when connected to slow/SMB drives. Combined with #70, that doesn't inspire a lot of confidence in reliability for getting back my data in the future. (I keep multiple backups, but I still expect a RAID array PROMISE to not be this broken, given how much Apple seems to position them as the premium "just works" option for Apple computers.)
True
PROMISE Pegasus32 R8 causes ridiculous system-wide freezes on macOS - (Using the PROMISE Pegasus2 RAID array on macOS 12 requires a kernel extension. Those are deprecated, and for good reason.) EDIT: The PROMISE Pegasus32 RAID array uses a dext. This is better, but I still consider it borderline unacceptable for a drive that advertises itself as Thunderbolt 3 and USB 3.2 compatible. I don't want a custom high-touch racecar, I want a reliable station wagon full of tapes. When I have my RAID array plugged in, some file system operation will freeze for *minutes* at a time and seemingly halt every program that is accessing any disk on the file system. Even if that program is only accessing files on the internal SSD. So basically everything except the cursor (and some occasional UI elements) freeze up, for minutes at a time. If I'm lucky, it unfreezes in 1 or maybe 10 minutes. Otherwise, I have to pull the cable manually (and sometimes hard reboot the RAID array itself). I also have to guess whether the cause was purely the RAID array or possibly related to something else, like our corporate spyware or other bad macOS behaviour when connected to slow/SMB drives. Combined with #70, that doesn't inspire a lot of confidence in reliability for getting back my data in the future. (I keep multiple backups, but I still expect a RAID array PROMISE to not be this broken, given how much Apple seems to position them as the premium "just works" option for Apple computers.)
usab
promise causes ridiculous system wide freezes on macos using the promise raid array on macos requires a kernel extension those are deprecated and for good reason edit the promise raid array uses a dext this is better but i still consider it borderline unacceptable for a drive that advertises itself as thunderbolt and usb compatible i don t want a custom high touch racecar i want a reliable station wagon full of tapes when i have my raid array plugged in some file system operation will freeze for minutes at a time and seemingly halt every program that is accessing any disk on the file system even if that program is only accessing files on the internal ssd so basically everything except the cursor and some occasional ui elements freeze up for minutes at a time if i m lucky it unfreezes in or maybe minutes otherwise i have to pull the cable manually and sometimes hard reboot the raid array itself i also have to guess whether the cause was purely the raid array or possibly related to something else like our corporate spyware or other bad macos behaviour when connected to slow smb drives combined with that doesn t inspire a lot of confidence in reliability for getting back my data in the future i keep multiple backups but i still expect a raid array promise to not be this broken given how much apple seems to position them as the premium just works option for apple computers
1
10,164
3,088,444,931
IssuesEvent
2015-08-25 16:35:40
rancher/rancher
https://api.github.com/repos/rancher/rancher
closed
Support rancher behind an ELB in AWS with SSL
status/resolved status/to-test
We support running rancher in HA mode behind a number of different proxies. However, our current implementation does not support ELB. This is because Rancher uses websockets and ELB, when in HTTP/HTTPS mode, does not support websockets. ELB does support websockets if it is configured in TCP mode and [Proxy Protocol](http://www.haproxy.org/download/1.5/doc/proxy-protocol.txt) is explicitly enabled. But we had to make some changes on the rancher side to support Proxy Protocol and this issue represents that work. So now, if you configure your ELB to be in Proxy Protocol mode, Rancher will work with no additional configuration. High level steps to test (assuming SSL): 1. Get a certificate for your domain (For example, I use craig.rancher.io). I can help you with this step. 2. Create an instance in amazon and run rancher/server on it. 3. In the AWS UI, create an ELB and configure it for TCP mode.* 4. From the AWS CLI, configure the above ELB to use Proxy protocol* 5. Verify basic functionality of rancher works: deploying a container, viewing stats, logs, exec. \* Detailed steps for setting up a Proxy Protocol ELB: http://docs.aws.amazon.com/ElasticLoadBalancing/latest/DeveloperGuide/enable-proxy-protocol.html \* Your ELB should be configured with TLS/SSL for the frontend and TCP for the backend. This causes SSL to be terminated at the load balancer.
1.0
Support rancher behind an ELB in AWS with SSL - We support running rancher in HA mode behind a number of different proxies. However, our current implementation does not support ELB. This is because Rancher uses websockets and ELB, when in HTTP/HTTPS mode, does not support websockets. ELB does support websockets if it is configured in TCP mode and [Proxy Protocol](http://www.haproxy.org/download/1.5/doc/proxy-protocol.txt) is explicitly enabled. But we had to make some changes on the rancher side to support Proxy Protocol and this issue represents that work. So now, if you configure your ELB to be in Proxy Protocol mode, Rancher will work with no additional configuration. High level steps to test (assuming SSL): 1. Get a certificate for your domain (For example, I use craig.rancher.io). I can help you with this step. 2. Create an instance in amazon and run rancher/server on it. 3. In the AWS UI, create an ELB and configure it for TCP mode.* 4. From the AWS CLI, configure the above ELB to use Proxy protocol* 5. Verify basic functionality of rancher works: deploying a container, viewing stats, logs, exec. \* Detailed steps for setting up a Proxy Protocol ELB: http://docs.aws.amazon.com/ElasticLoadBalancing/latest/DeveloperGuide/enable-proxy-protocol.html \* Your ELB should be configured with TLS/SSL for the frontend and TCP for the backend. This causes SSL to be terminated at the load balancer.
non_usab
support rancher behind an elb in aws with ssl we support running rancher in ha mode behind a number of different proxies however our current implementation does not support elb this is because rancher uses websockets and elb when in http https mode does not support websockets elb does support websockets if it is configured in tcp mode and is explicitly enabled but we had to make some changes on the rancher side to support proxy protocol and this issue represents that work so now if you configure your elb to be in proxy protocol mode rancher will work with no additional configuration high level steps to test assuming ssl get a certificate for your domain for example i use craig rancher io i can help you with this step create an instance in amazon and run rancher server on it in the aws ui create an elb and configure it for tcp mode from the aws cli configure the above elb to use proxy protocol verify basic functionality of rancher works deploying a container viewing stats logs exec detailed steps for setting up a proxy protocol elb your elb should be configured with tls ssl for the frontend and tcp for the backend this causes ssl to be terminated at the load balancer
0
24,862
24,395,225,145
IssuesEvent
2022-10-04 18:37:44
Leafwing-Studios/leafwing_input_playback
https://api.github.com/repos/Leafwing-Studios/leafwing_input_playback
closed
Optionally exit the app when the serialized input file is complete
usability
## What problem does this solve? When using this library for testing, you are almost always going to want to quit the app once all of the serialized input has been played back. ## What solution would you like? Users can configure a resource to control whether or not this occurs (off by default). ## Related work Requires #11.
True
Optionally exit the app when the serialized input file is complete - ## What problem does this solve? When using this library for testing, you are almost always going to want to quit the app once all of the serialized input has been played back. ## What solution would you like? Users can configure a resource to control whether or not this occurs (off by default). ## Related work Requires #11.
usab
optionally exit the app when the serialized input file is complete what problem does this solve when using this library for testing you are almost always going to want to quit the app once all of the serialized input has been played back what solution would you like users can configure a resource to control whether or not this occurs off by default related work requires
1
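The Leafwing request above (quit the app once all serialized input has been played back, off by default) targets a Rust/Bevy resource, but the control flow it asks for is simple to sketch. The following hypothetical Python version uses a settings object in place of the Bevy resource; the names are invented, not from the crate:

```python
from dataclasses import dataclass


@dataclass
class PlaybackSettings:
    # Mirrors the proposed resource: exiting on completion is off by default.
    exit_when_complete: bool = False


def run_playback(events, settings: PlaybackSettings, app_running: bool = True):
    """Feed recorded events to the app; optionally stop the app once the
    recording is exhausted. Returns (events_played, app_running)."""
    played = 0
    for event in events:
        played += 1  # stand-in for dispatching the event to the app
    if settings.exit_when_complete:
        app_running = False
    return played, app_running
```

With the default settings the app keeps running after playback, which matches the issue's "off by default" requirement; a test harness would construct `PlaybackSettings(exit_when_complete=True)` instead.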
26,057
26,370,346,976
IssuesEvent
2023-01-11 20:06:04
opensearch-project/documentation-website
https://api.github.com/repos/opensearch-project/documentation-website
closed
[RFC] [DOC] Add feedback mechanism
Usability ux / ui Closed - Complete
**What do you want to do?** - [x] Request a change to existing documentation - [ ] Add new documentation - [ ] Report a technical problem with the documentation - [ ] Other **Tell us about your request.** Provide a summary of the request and all versions that are affected. Currently, the only way for users to leave feedback on documentation is to enter a Github issue. While this works fairly well, we have no quantitative way to measure if a user is happy or unhappy with the content. Therefore, we should add a feedback widget to each doc page, whether it's a star rating (https://www.twilio.com/docs/autopilot/guides/how-to-build-a-chatbot), or a thumbs up, thumbs down mechanism. When a user selects the feedback mechanism, a modal would pop up allowing them to explain their rating. To add this mechanism, it would require: 1. Changing the default template in Jekyll. 2. Creating a liquid include or some other form to add interaction to the rating. 3. Work with design to determine how the mechanism should look [Nice-to-have] We can use Google Analytics to capture the data. **What other resources are available?** Provide links to related issues, POCs, steps for testing, etc.
True
[RFC] [DOC] Add feedback mechanism - **What do you want to do?** - [x] Request a change to existing documentation - [ ] Add new documentation - [ ] Report a technical problem with the documentation - [ ] Other **Tell us about your request.** Provide a summary of the request and all versions that are affected. Currently, the only way for users to leave feedback on documentation is to enter a Github issue. While this works fairly well, we have no quantitative way to measure if a user is happy or unhappy with the content. Therefore, we should add a feedback widget to each doc page, whether it's a star rating (https://www.twilio.com/docs/autopilot/guides/how-to-build-a-chatbot), or a thumbs up, thumbs down mechanism. When a user selects the feedback mechanism, a modal would pop up allowing them to explain their rating. To add this mechanism, it would require: 1. Changing the default template in Jekyll. 2. Creating a liquid include or some other form to add interaction to the rating. 3. Work with design to determine how the mechanism should look [Nice-to-have] We can use Google Analytics to capture the data. **What other resources are available?** Provide links to related issues, POCs, steps for testing, etc.
usab
add feedback mechanism what do you want to do request a change to existing documentation add new documentation report a technical problem with the documentation other tell us about your request provide a summary of the request and all versions that are affected currently the only way for users to leave feedback on documentation is to enter a github issue while this works fairly well we have no quantitative way to measure if a user is happy or unhappy with the content therefore we should add a feedback widget to each doc page whether its a star rating or a thumbs up thumbs down mechanism when a user selects the feedback mechanism a modal would pop up allowing them to explain their rating to add this mechanism it would require changing the default template in jekyll creating a liquid include or some other form to add interaction to the rating work with design to determine how the mechanism should look we can use google analytics to capture the data what other resources are available provide links to related issues pocs steps for testing etc
1
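The feedback-widget RFC above describes a thumbs-up/thumbs-down rating with an optional free-text modal. A minimal sketch of the server-side tally such a widget would feed — the class and page names here are invented, not from the OpenSearch documentation codebase — could look like:

```python
from collections import defaultdict


class FeedbackStore:
    """Minimal in-memory tally of thumbs-up / thumbs-down votes per doc page."""

    def __init__(self):
        self._votes = defaultdict(lambda: {"up": 0, "down": 0})
        self._comments = defaultdict(list)

    def record(self, page: str, helpful: bool, comment: str = ""):
        self._votes[page]["up" if helpful else "down"] += 1
        if comment:
            # The modal described in the RFC would supply this free-text field.
            self._comments[page].append(comment)

    def score(self, page: str) -> float:
        """Fraction of positive votes for a page (0.0 if no votes yet)."""
        votes = self._votes[page]
        total = votes["up"] + votes["down"]
        return votes["up"] / total if total else 0.0
```

In the RFC's actual plan the data would likely flow to Google Analytics rather than a custom store; this sketch just makes the quantitative measure the RFC asks for ("is a user happy or unhappy with the content") concrete.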
392,784
11,595,795,288
IssuesEvent
2020-02-24 17:40:54
nmstate/nmstate
https://api.github.com/repos/nmstate/nmstate
opened
Show name of Ovs Bridge link-aggregation ports first
Low priority good first issue
nmstatectl should show the name of link-aggregation ports first in the output, for example:

```yaml
- name: ovs-br-lagg0
  type: ovs-bridge
  state: up
  bridge:
    port:
    - name: veth0_ep
    - name: ovs-lagg0
      link-aggregation:
        mode: balance-slb
        slaves:
        - name: veth1_ep
        - name: veth2_ep
```

This needs adjustments to order the state recursively (or at least specifically for ovs-bridges): https://github.com/nmstate/nmstate/blob/master/libnmstate/prettystate.py#L91
1.0
Show name of Ovs Bridge link-aggregation ports first - nmstatectl should show the name of link-aggregation ports first in the output, for example:

```yaml
- name: ovs-br-lagg0
  type: ovs-bridge
  state: up
  bridge:
    port:
    - name: veth0_ep
    - name: ovs-lagg0
      link-aggregation:
        mode: balance-slb
        slaves:
        - name: veth1_ep
        - name: veth2_ep
```

This needs adjustments to order the state recursively (or at least specifically for ovs-bridges): https://github.com/nmstate/nmstate/blob/master/libnmstate/prettystate.py#L91
non_usab
show name of ovs bridge link aggregation ports first nmstatectl should show the name of link aggregation ports first in the output for example yaml name ovs br type ovs bridge state up bridge port name ep name ovs link aggregation mode balance slb slaves name ep name ep this needs adjustments to order the state recursively or at least specifically for ovs bridges
0
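The nmstate record above asks for a recursive key-ordering change when pretty-printing state. A hypothetical recursive reorder — not the actual `prettystate.py` code — that bubbles the `name` key to the front of every mapping could look like:

```python
def name_first(obj):
    """Recursively reorder mappings so the 'name' key is emitted first,
    leaving all other keys in their original order. Relies on dicts
    preserving insertion order (Python 3.7+)."""
    if isinstance(obj, dict):
        ordered = {}
        if "name" in obj:
            ordered["name"] = name_first(obj["name"])
        for key, value in obj.items():
            if key != "name":
                ordered[key] = name_first(value)
        return ordered
    if isinstance(obj, list):
        return [name_first(item) for item in obj]
    return obj
```

Applied before YAML serialization, this would make the link-aggregation port entries in the example render with `name` first, as the issue requests.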
188,094
15,116,182,576
IssuesEvent
2021-02-09 06:14:13
avargas20/git_web_practice
https://api.github.com/repos/avargas20/git_web_practice
closed
A commit that does not follow the code convention, or a FIX to make
documentation
The last commit has the following message: `Correcion Issue1` This issue is just a reminder of the commit-message comment convention and can be closed.
1.0
A commit that does not follow the code convention, or a FIX to make - The last commit has the following message: `Correcion Issue1` This issue is just a reminder of the commit-message comment convention and can be closed.
non_usab
a commit that does not follow the code convention or a fix to make the last commit has the following message correcion this issue is just a reminder of the commit message comment convention and can be closed
0
15,360
3,953,379,754
IssuesEvent
2016-04-29 13:16:51
trivago/gollum
https://api.github.com/repos/trivago/gollum
closed
Kinesis Producer StreamMapping incorrectly documented?
documentation
The examples and documentation for the Kinesis producer seem to be slightly wrong. I think it should be 'StreamMapping' and not 'StreamMap'.
1.0
Kinesis Producer StreamMapping incorrectly documented? - The examples and documentation for the Kinesis producer seem to be slightly wrong. I think it should be 'StreamMapping' and not 'StreamMap'.
non_usab
kinesis producer streammapping incorrectly documented the examples and documentation for the kinesis producer seem to be slightly wrong i think it should be streammapping and not streammap
0