| column | dtype | min – max / classes |
| --- | --- | --- |
| Unnamed: 0 | int64 | 0 – 832k |
| id | float64 | 2.49B – 32.1B |
| type | stringclasses | 1 value |
| created_at | stringlengths | 19 – 19 |
| repo | stringlengths | 7 – 112 |
| repo_url | stringlengths | 36 – 141 |
| action | stringclasses | 3 values |
| title | stringlengths | 1 – 744 |
| labels | stringlengths | 4 – 574 |
| body | stringlengths | 9 – 211k |
| index | stringclasses | 10 values |
| text_combine | stringlengths | 96 – 211k |
| label | stringclasses | 2 values |
| text | stringlengths | 96 – 188k |
| binary_label | int64 | 0 – 1 |
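The column summary above describes a GitHub-issues classification dataset in which a two-class string column `label` ("process" / "non_process") is mirrored by an integer `binary_label` (0 / 1). As a minimal, hedged sketch (the rows below are re-typed from the samples shown; the dataset's actual file name and loading mechanism are not given here), the consistency of the two label columns can be checked with pandas:

```python
import pandas as pd

# Hypothetical mini-frame built from a few sample rows shown below;
# only the columns needed for the check are included.
rows = [
    {"title": "Use of ethabi should be removed", "label": "process", "binary_label": 1},
    {"title": "Display tag values in the main document repository", "label": "non_process", "binary_label": 0},
    {"title": "Error occurred while analyzing APK", "label": "process", "binary_label": 1},
]
df = pd.DataFrame(rows)

# binary_label appears to encode label: 1 for "process", 0 for "non_process".
derived = (df["label"] == "process").astype(int)
assert (df["binary_label"] == derived).all()

# Class distribution of this small sample.
print(df["label"].value_counts().to_dict())  # → {'process': 2, 'non_process': 1}
```

The same check scales to the full table once it is loaded (e.g. via `pd.read_csv` on whatever file backs this dump).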

Unnamed: 0: 290,152
id: 8,882,979,643
type: IssuesEvent
created_at: 2019-01-14 14:39:19
repo: webcompat/web-bugs
repo_url: https://api.github.com/repos/webcompat/web-bugs
action: closed
title: m-in.gearbest.com - site is not usable
labels: browser-firefox priority-important
body:
<!-- @browser: Firefox 65.0 --> <!-- @ua_header: Mozilla/5.0 (Windows NT 6.1; WOW64; rv:65.0) Gecko/20100101 Firefox/65.0 --> <!-- @reported_with: desktop-reporter --> **URL**: https://m-in.gearbest.com/money-bag.html?lkid=18124852&cid=108621863625691136 **Browser / Version**: Firefox 65.0 **Operating System**: Windows 7 **Tested Another Browser**: Yes **Problem type**: Site is not usable **Description**: some virus attach **Steps to Reproduce**: no need [![Screenshot Description](https://webcompat.com/uploads/2019/1/06515b05-176b-4b32-b6c6-956a20071e0e-thumb.jpeg)](https://webcompat.com/uploads/2019/1/06515b05-176b-4b32-b6c6-956a20071e0e.jpeg) <details> <summary>Browser Configuration</summary> <ul> <li>mixed active content blocked: false</li><li>image.mem.shared: true</li><li>buildID: 20190110221328</li><li>tracking content blocked: false</li><li>gfx.webrender.blob-images: true</li><li>hasTouchScreen: false</li><li>mixed passive content blocked: false</li><li>gfx.webrender.enabled: false</li><li>gfx.webrender.all: false</li><li>channel: beta</li> </ul> <p>Console Messages:</p> <pre> [u'[JavaScript Warning: "Loading failed for the <script> with source https://m-in.gearbest.com/akam/10/5b5bf3b." {file: "https://m-in.gearbest.com/money-bag.html?lkid=18124852&cid=108621863625691136" line: 27}]', u'[JavaScript Error: "Failed to register/update a ServiceWorker for scope https://m-in.gearbest.com/: Storage access is restricted in this context due to user settings or private browsing mode." {file: "https://m-in.gearbest.com/money-bag.html?lkid=18124852&cid=108621863625691136" line: 163}]'] </pre> </details> _From [webcompat.com](https://webcompat.com/) with ❤️_
index: 1.0
text_combine:
m-in.gearbest.com - site is not usable - <!-- @browser: Firefox 65.0 --> <!-- @ua_header: Mozilla/5.0 (Windows NT 6.1; WOW64; rv:65.0) Gecko/20100101 Firefox/65.0 --> <!-- @reported_with: desktop-reporter --> **URL**: https://m-in.gearbest.com/money-bag.html?lkid=18124852&cid=108621863625691136 **Browser / Version**: Firefox 65.0 **Operating System**: Windows 7 **Tested Another Browser**: Yes **Problem type**: Site is not usable **Description**: some virus attach **Steps to Reproduce**: no need [![Screenshot Description](https://webcompat.com/uploads/2019/1/06515b05-176b-4b32-b6c6-956a20071e0e-thumb.jpeg)](https://webcompat.com/uploads/2019/1/06515b05-176b-4b32-b6c6-956a20071e0e.jpeg) <details> <summary>Browser Configuration</summary> <ul> <li>mixed active content blocked: false</li><li>image.mem.shared: true</li><li>buildID: 20190110221328</li><li>tracking content blocked: false</li><li>gfx.webrender.blob-images: true</li><li>hasTouchScreen: false</li><li>mixed passive content blocked: false</li><li>gfx.webrender.enabled: false</li><li>gfx.webrender.all: false</li><li>channel: beta</li> </ul> <p>Console Messages:</p> <pre> [u'[JavaScript Warning: "Loading failed for the <script> with source https://m-in.gearbest.com/akam/10/5b5bf3b." {file: "https://m-in.gearbest.com/money-bag.html?lkid=18124852&cid=108621863625691136" line: 27}]', u'[JavaScript Error: "Failed to register/update a ServiceWorker for scope https://m-in.gearbest.com/: Storage access is restricted in this context due to user settings or private browsing mode." {file: "https://m-in.gearbest.com/money-bag.html?lkid=18124852&cid=108621863625691136" line: 163}]'] </pre> </details> _From [webcompat.com](https://webcompat.com/) with ❤️_
label: non_process
text:
m in gearbest com site is not usable url browser version firefox operating system windows tested another browser yes problem type site is not usable description some virus attach steps to reproduce no need browser configuration mixed active content blocked false image mem shared true buildid tracking content blocked false gfx webrender blob images true hastouchscreen false mixed passive content blocked false gfx webrender enabled false gfx webrender all false channel beta console messages u from with ❤️
binary_label: 0

Unnamed: 0: 14,628
id: 25,288,019,028
type: IssuesEvent
created_at: 2022-11-16 21:05:35
repo: NASA-PDS/validate
repo_url: https://api.github.com/repos/NASA-PDS/validate
action: opened
title: As a use, I want to validate the DOI referenced in the PDS4 label
labels: requirement needs:triage
body:
<!-- For more information on how to populate this new feature request, see the PDS Wiki on User Story Development: https://github.com/NASA-PDS/nasa-pds.github.io/wiki/Issue-Tracking#user-story-development --> ## 🧑‍🔬 User Persona(s) <!-- Ideally this would be consistent within a repo and documented with the requirements. --> Node data manager. ## 💪 Motivation ...so that I can <!-- why do you want to do this? --> Requested during Face 2 face Management Council meeting in November 2022. ## 📖 Additional Details <!-- Please prove any additional details or information that could help provide some context for the user story. --> To be completed ## ⚖️ Acceptance Criteria **Given** <!-- a condition --> **When I perform** <!-- an action --> **Then I expect** <!-- the result --> <!-- For Internal Dev Team Use --> ## ⚙️ Engineering Details <!-- Provide some design / implementation details and/or a sub-task checklist as needed. Convert issue to Epic if estimate is outside the scope of 1 sprint. -->
index: 1.0
text_combine:
As a use, I want to validate the DOI referenced in the PDS4 label - <!-- For more information on how to populate this new feature request, see the PDS Wiki on User Story Development: https://github.com/NASA-PDS/nasa-pds.github.io/wiki/Issue-Tracking#user-story-development --> ## 🧑‍🔬 User Persona(s) <!-- Ideally this would be consistent within a repo and documented with the requirements. --> Node data manager. ## 💪 Motivation ...so that I can <!-- why do you want to do this? --> Requested during Face 2 face Management Council meeting in November 2022. ## 📖 Additional Details <!-- Please prove any additional details or information that could help provide some context for the user story. --> To be completed ## ⚖️ Acceptance Criteria **Given** <!-- a condition --> **When I perform** <!-- an action --> **Then I expect** <!-- the result --> <!-- For Internal Dev Team Use --> ## ⚙️ Engineering Details <!-- Provide some design / implementation details and/or a sub-task checklist as needed. Convert issue to Epic if estimate is outside the scope of 1 sprint. -->
label: non_process
text:
as a use i want to validate the doi referenced in the label for more information on how to populate this new feature request see the pds wiki on user story development 🧑‍🔬 user persona s node data manager 💪 motivation so that i can requested during face face management council meeting in november 📖 additional details to be completed ⚖️ acceptance criteria given when i perform then i expect ⚙️ engineering details provide some design implementation details and or a sub task checklist as needed convert issue to epic if estimate is outside the scope of sprint
binary_label: 0

Unnamed: 0: 13,435
id: 8,461,159,638
type: IssuesEvent
created_at: 2018-10-22 20:57:14
repo: burtonator/polar-bookshelf
repo_url: https://api.github.com/repos/burtonator/polar-bookshelf
action: closed
title: Display tag values in the main document repository
labels: good first issue usability
body:
It would be really nice to be able to see what tags have been applied to a document without clicking on the tag icon. Is it possible to have the tag values visible in a column?
index: True
text_combine:
Display tag values in the main document repository - It would be really nice to be able to see what tags have been applied to a document without clicking on the tag icon. Is it possible to have the tag values visible in a column?
label: non_process
text:
display tag values in the main document repository it would be really nice to be able to see what tags have been applied to a document without clicking on the tag icon is it possible to have the tag values visible in a column
binary_label: 0

Unnamed: 0: 356,132
id: 10,589,148,744
type: IssuesEvent
created_at: 2019-10-09 04:59:50
repo: webcompat/web-bugs
repo_url: https://api.github.com/repos/webcompat/web-bugs
action: closed
title: www.nytimes.com - see bug description
labels: browser-focus-geckoview engine-gecko priority-important
body:
<!-- @browser: Firefox Mobile 68.0 --> <!-- @ua_header: Mozilla/5.0 (Android 9; Mobile; rv:68.0) Gecko/68.0 Firefox/68.0 --> <!-- @reported_with: --> <!-- @extra_labels: browser-focus-geckoview --> **URL**: https://www.nytimes.com/2017/06/29/automobiles/wheels/why-fog-lamps-are-starting-to-disappear.html **Browser / Version**: Firefox Mobile 68.0 **Operating System**: Android **Tested Another Browser**: Yes **Problem type**: Something else **Description**: Indicates I am in private mode. Unable to read article. We all should have freedom to read articles in private mode **Steps to Reproduce**: Tried to request desktop site. Didn't work. Still shows the pop-up message that I am unable to dismiss. I couldn't get the desktop site to show. <details> <summary>Browser Configuration</summary> <ul> <li>None</li> </ul> </details> _From [webcompat.com](https://webcompat.com/) with ❤️_
index: 1.0
text_combine:
www.nytimes.com - see bug description - <!-- @browser: Firefox Mobile 68.0 --> <!-- @ua_header: Mozilla/5.0 (Android 9; Mobile; rv:68.0) Gecko/68.0 Firefox/68.0 --> <!-- @reported_with: --> <!-- @extra_labels: browser-focus-geckoview --> **URL**: https://www.nytimes.com/2017/06/29/automobiles/wheels/why-fog-lamps-are-starting-to-disappear.html **Browser / Version**: Firefox Mobile 68.0 **Operating System**: Android **Tested Another Browser**: Yes **Problem type**: Something else **Description**: Indicates I am in private mode. Unable to read article. We all should have freedom to read articles in private mode **Steps to Reproduce**: Tried to request desktop site. Didn't work. Still shows the pop-up message that I am unable to dismiss. I couldn't get the desktop site to show. <details> <summary>Browser Configuration</summary> <ul> <li>None</li> </ul> </details> _From [webcompat.com](https://webcompat.com/) with ❤️_
label: non_process
text:
see bug description url browser version firefox mobile operating system android tested another browser yes problem type something else description indicates i am in private mode unable to read article we all should have freedom to read articles in private mode steps to reproduce tried to request desktop site didn t work still shows the pop up message that i am unable to dismiss i couldn t get the desktop site to show browser configuration none from with ❤️
binary_label: 0

Unnamed: 0: 64,939
id: 26,922,332,339
type: IssuesEvent
created_at: 2023-02-07 11:21:10
repo: MicrosoftDocs/azure-docs
repo_url: https://api.github.com/repos/MicrosoftDocs/azure-docs
action: closed
title: Operating system is also a restriction to moving
labels: app-service/svc triaged cxp doc-enhancement Pri1 escalated-content-team
body:
You also can't move an app service to an app service plan that is of a different operating system, but that isnt specified here. --- #### Document Details ⚠ *Do not edit this section. It is required for learn.microsoft.com ➟ GitHub issue linking.* * ID: 25ec88c2-d5ff-e727-cbe1-69bac94141ae * Version Independent ID: 0b6d31a9-7767-73cb-857b-ad8aa81cdce5 * Content: [Manage App Service plan - Azure App Service](https://learn.microsoft.com/en-us/azure/app-service/app-service-plan-manage) * Content Source: [articles/app-service/app-service-plan-manage.md](https://github.com/MicrosoftDocs/azure-docs/blob/main/articles/app-service/app-service-plan-manage.md) * Service: **app-service** * GitHub Login: @cephalin * Microsoft Alias: **cephalin**
index: 1.0
text_combine:
Operating system is also a restriction to moving - You also can't move an app service to an app service plan that is of a different operating system, but that isnt specified here. --- #### Document Details ⚠ *Do not edit this section. It is required for learn.microsoft.com ➟ GitHub issue linking.* * ID: 25ec88c2-d5ff-e727-cbe1-69bac94141ae * Version Independent ID: 0b6d31a9-7767-73cb-857b-ad8aa81cdce5 * Content: [Manage App Service plan - Azure App Service](https://learn.microsoft.com/en-us/azure/app-service/app-service-plan-manage) * Content Source: [articles/app-service/app-service-plan-manage.md](https://github.com/MicrosoftDocs/azure-docs/blob/main/articles/app-service/app-service-plan-manage.md) * Service: **app-service** * GitHub Login: @cephalin * Microsoft Alias: **cephalin**
label: non_process
text:
operating system is also a restriction to moving you also can t move an app service to an app service plan that is of a different operating system but that isnt specified here document details ⚠ do not edit this section it is required for learn microsoft com ➟ github issue linking id version independent id content content source service app service github login cephalin microsoft alias cephalin
binary_label: 0

Unnamed: 0: 3,774
id: 6,743,651,736
type: IssuesEvent
created_at: 2017-10-20 13:00:10
repo: Great-Hill-Corporation/quickBlocks
repo_url: https://api.github.com/repos/Great-Hill-Corporation/quickBlocks
action: closed
title: Use of ethabi should be removed
labels: libs-etherlib status-inprocess type-bug
body:
In the function **getEncoding** in the file ./etherlib/abi.cpp, there is code that looks for and tries to use an external program called `ethabi`. This almost certainly does not work and should be removed.
index: 1.0
text_combine:
Use of ethabi should be removed - In the function **getEncoding** in the file ./etherlib/abi.cpp, there is code that looks for and tries to use an external program called `ethabi`. This almost certainly does not work and should be removed.
label: process
text:
use of ethabi should be removed in the function getencoding in the file etherlib abi cpp there is code that looks for and tries to use an external program called ethabi this almost certainly does not work and should be removed
binary_label: 1

Unnamed: 0: 14,842
id: 18,236,769,989
type: IssuesEvent
created_at: 2021-10-01 07:59:55
repo: quark-engine/quark-engine
repo_url: https://api.github.com/repos/quark-engine/quark-engine
action: closed
title: Error occurred while analyzing APK
labels: issue-processing-state-01
body:
## Description Error occurred while analyzing APK with the rule. ## Rule content ``` { "crime": "Steal Sensitive Information - Get directory info of the SD cards and active network info", "x1_permission": [ "android.permission.WRITE_EXTERNAL_STORAGE", "android.permission.ACCESS_NETWORK_STATE" ], "x2n3n4_comb": [ { "class": "Landroid/os/Environment", "method": "getExternalStorageDirectory" }, { "class": "Landroid/net/ConnectivityManager", "method": "getActiveNetworkInfo" } ], "yscore": 1, "sample": [ "8acfbc89fa1c760c4cffc6178863e5529d93347c29fa972af7072cb520f49956d55b7f88155671c1d02b54905ec09abd744314d3d4a13c790116583e18f8ebe6", "83057e262e794c2c25a59c65d8604885b9dbe9f0880c38518752677508493e21fe63c27708347ae26b76bfed38992f5e0d67da8b273892a52339bba58c661105", "014b4e2e536d0f23dfefea6226932e25e76eaf0afa9ee7692f33a0573d6215080a74fabaad35daf0a5b639de554214290c67c802ac5aefd32469042f7ff069bf", "3b167d339e5a2027e39a8bfc55b4897ea55312e0ceee6b40903a02e33fecbd10e4c4d43ef1af36a262d01e78f352a3d6a13ecce3ed85d92c815201db8c8a7926", "d3623a24648cdadcab189b2c7bec118a2aa63ec5f84ef002eeef44ad5789481c4927bcd76be38bb9920120d6093321b208ea443598edd5c099e3526eefcf1553", "c53fe1a510e3003fb4b2743c045efda8b6fdb05e0a05ceb71682080b9476acdf8fec0ae1f8863614400521a743ace0165f4675dbf01e67a88980f321e9683444", "f4579b81e8cb6d19a5456bdad9215dfd01d06f24fc45b9474fff9264fcad2feddc9ebba4a02a858b26016faebd28e5623458578d0d24ed0312b75b5c2f516590", "94df58b777a539fa0574ab95fd4aa84c65c90977a0b9dfe21bf82544d9dd2a935b813ff2a3f85a9e749293309b6ce5459a3ec423870b737509eab0eaee50a5bf", "7deee7286601890557a318986cf9642ed8426131147c4bbde8d5dc7651843a452b1bf3a7aef37048c110d4bee46e76fd4381fa0c29835350d2eeecc05fa60bd7", "a13228c19303304968541917fad261b12a00646e49a37eb61c5232f37aa18d91d2af913e66f2785067c9910ae74d51aa0d146523c6c6419bc027ea3dbb81288b", "da44a5bd5d6ddb74eff1751dd8ca5bef36c7246c5d166284f374328c7de59bb38977e0c3dee694d619bd8e2b28733cd2045b7d63657076c1d40f4a22343f8d73", 
"fd9a6b886e9e30e90ac364446d98934142c24fef525c445ae5ceee169fb7124a2e17bfdf05177ac5d22b8b944c7aa78f8c181845509708e0f0ea60bb2afdbb43", "f8912459991773c9e28061251265aaf908c1edd960fc1c60ae7f141fc2c95eb2a92d3a3891d47f8f2c8d73596626d29c0409ed66a248d6f37c0f623a16d72ed9", "73804c796f7ae285fdba3c08d0742c46d758024020cb6614f0a75182c5af3b721a952f6b436226317631b163701f18ed2a8b9df2694a48cf9ee8a321234bacb4", "bfbd18234345807597d828842a2f1afadaaca5994337a1f728e3c8736e6ebd787907c9b4e79f43fd95c169351ec5e52f0f45542f6c6a363806d86072b5118e5d", "22eb834fc059bd28d112e9eea74c328ed5b7e92059e65c97bdd203a767b51eed640d733682ef8012f1a08c4c4aa60d88bb28d6bb9147cb0d1fbd8e8d599fa041", "954f6df7b4ce7f56232a797f7c5bc911a885e526ef92ca072060227ba97280fc4df3b8b605b99cfd24e1415718a630c174607e46ce0112eadf3c88151dd063d8", "8612bbb29dc17e584d817b90475b098646ea756f159249af410071c400afa911865cd55cf9cec9f9c0db95b00734885863ab75b1bdfd0e0c73f60b4ac680a483", "622821fb262951a2275c20b8fca4eac7c33a8044f0d984e054c3df4acf4da8a914058dc563d96f8f6985b1ac64d8cf93b1c54581e0c909a8011d69c6a740c739", "37290d8ba66e0b9f4f7398123c55c5a834e498aa030edb1109cb26b28d85ca2663a876c6cf7510a6ef1174b861905df76617fe7e2c7ab037d0f1539f26cf278e", "5751cfdf656f2a5ee021940c5448a77e5b921d1510d2abfa520a57d02c74821e0f5c2e4935bea2554c440072d32fc22bb8317a85dabbbc7c9cca9d1c077793c2", "398acd50a4931316d4c5270cfd4fbf56032565a5e4d7265eb29deb1595ccb5cf748667ab39fa02364669d217d871d5308b6a2ad7a9b7fafdda6abe709012d191", "2d48ff0bbea51fc61aeb7dbae99f4252f65f8d37b0c949cec40bb788ebc6dfe6f8aa02ee12804d7fb34fdd5def7b28e2de9e66d282acf8f85750edfd3b90e9f2", "31c57d6d321a0affff1e2c880c6f5da7d82ab24f0ac2dae6876d17b1f6cc7266a0b046048ee493ffdbc5313ecc05d560b0971dfd0fce9ee36483e18e3e5f1283", "934573215329244f74ad7090604d4410732559ea0c981c466d6dee13a2ba6b8533d9d7171af6f5fb7ae07377a495cee8b77e5da1dc1eca68158b3c76395f8b1c", "4f60446b97ce44b297e17e334de6d870f5de38ef1d74b2557f3bf684178363d319a5d0e4147f4cc55784e79a8f4eb2de0b6448fc103f5266b8c78b8bd2bca783" ] } ``` ## Error 
message ``` Traceback (most recent call last): File "/home/nick/.local/share/virtualenvs/quark-engine-lddAoX17/bin/quark", line 11, in <module> load_entry_point('quark-engine', 'console_scripts', 'quark')() File "/home/nick/.local/share/virtualenvs/quark-engine-lddAoX17/lib/python3.7/site-packages/click/core.py", line 829, in __call__ return self.main(*args, **kwargs) File "/home/nick/.local/share/virtualenvs/quark-engine-lddAoX17/lib/python3.7/site-packages/click/core.py", line 782, in main rv = self.invoke(ctx) File "/home/nick/.local/share/virtualenvs/quark-engine-lddAoX17/lib/python3.7/site-packages/click/core.py", line 1066, in invoke return ctx.invoke(self.callback, **ctx.params) File "/home/nick/.local/share/virtualenvs/quark-engine-lddAoX17/lib/python3.7/site-packages/click/core.py", line 610, in invoke return callback(*args, **kwargs) File "/home/nick/quark/quark-engine/quark/cli.py", line 93, in entry_point data.run(rule_checker) File "/home/nick/quark/quark-engine/quark/Objects/quark.py", line 304, in run if self.check_parameter(common_method, str(pre_0[1]), str(pre_1[1])): File "/home/nick/quark/quark-engine/quark/Objects/quark.py", line 210, in check_parameter pyeval.eval[instruction[0]](instruction) File "/home/nick/quark/quark-engine/quark/Evaluator/pyeval.py", line 28, in warp func(*args, **kwargs) File "/home/nick/quark/quark-engine/quark/Evaluator/pyeval.py", line 251, in CONST_SIXTEEN self._assign_value(instruction) File "/home/nick/quark/quark-engine/quark/Evaluator/pyeval.py", line 113, in _assign_value self.table_obj.insert(index, variable_object) File "/home/nick/quark/quark-engine/quark/Objects/tableobject.py", line 28, in insert self.hash_table[index].append(var_obj) IndexError: list index out of range ```
index: 1.0
text_combine:
Error occurred while analyzing APK - ## Description Error occurred while analyzing APK with the rule. ## Rule content ``` { "crime": "Steal Sensitive Information - Get directory info of the SD cards and active network info", "x1_permission": [ "android.permission.WRITE_EXTERNAL_STORAGE", "android.permission.ACCESS_NETWORK_STATE" ], "x2n3n4_comb": [ { "class": "Landroid/os/Environment", "method": "getExternalStorageDirectory" }, { "class": "Landroid/net/ConnectivityManager", "method": "getActiveNetworkInfo" } ], "yscore": 1, "sample": [ "8acfbc89fa1c760c4cffc6178863e5529d93347c29fa972af7072cb520f49956d55b7f88155671c1d02b54905ec09abd744314d3d4a13c790116583e18f8ebe6", "83057e262e794c2c25a59c65d8604885b9dbe9f0880c38518752677508493e21fe63c27708347ae26b76bfed38992f5e0d67da8b273892a52339bba58c661105", "014b4e2e536d0f23dfefea6226932e25e76eaf0afa9ee7692f33a0573d6215080a74fabaad35daf0a5b639de554214290c67c802ac5aefd32469042f7ff069bf", "3b167d339e5a2027e39a8bfc55b4897ea55312e0ceee6b40903a02e33fecbd10e4c4d43ef1af36a262d01e78f352a3d6a13ecce3ed85d92c815201db8c8a7926", "d3623a24648cdadcab189b2c7bec118a2aa63ec5f84ef002eeef44ad5789481c4927bcd76be38bb9920120d6093321b208ea443598edd5c099e3526eefcf1553", "c53fe1a510e3003fb4b2743c045efda8b6fdb05e0a05ceb71682080b9476acdf8fec0ae1f8863614400521a743ace0165f4675dbf01e67a88980f321e9683444", "f4579b81e8cb6d19a5456bdad9215dfd01d06f24fc45b9474fff9264fcad2feddc9ebba4a02a858b26016faebd28e5623458578d0d24ed0312b75b5c2f516590", "94df58b777a539fa0574ab95fd4aa84c65c90977a0b9dfe21bf82544d9dd2a935b813ff2a3f85a9e749293309b6ce5459a3ec423870b737509eab0eaee50a5bf", "7deee7286601890557a318986cf9642ed8426131147c4bbde8d5dc7651843a452b1bf3a7aef37048c110d4bee46e76fd4381fa0c29835350d2eeecc05fa60bd7", "a13228c19303304968541917fad261b12a00646e49a37eb61c5232f37aa18d91d2af913e66f2785067c9910ae74d51aa0d146523c6c6419bc027ea3dbb81288b", "da44a5bd5d6ddb74eff1751dd8ca5bef36c7246c5d166284f374328c7de59bb38977e0c3dee694d619bd8e2b28733cd2045b7d63657076c1d40f4a22343f8d73", 
"fd9a6b886e9e30e90ac364446d98934142c24fef525c445ae5ceee169fb7124a2e17bfdf05177ac5d22b8b944c7aa78f8c181845509708e0f0ea60bb2afdbb43", "f8912459991773c9e28061251265aaf908c1edd960fc1c60ae7f141fc2c95eb2a92d3a3891d47f8f2c8d73596626d29c0409ed66a248d6f37c0f623a16d72ed9", "73804c796f7ae285fdba3c08d0742c46d758024020cb6614f0a75182c5af3b721a952f6b436226317631b163701f18ed2a8b9df2694a48cf9ee8a321234bacb4", "bfbd18234345807597d828842a2f1afadaaca5994337a1f728e3c8736e6ebd787907c9b4e79f43fd95c169351ec5e52f0f45542f6c6a363806d86072b5118e5d", "22eb834fc059bd28d112e9eea74c328ed5b7e92059e65c97bdd203a767b51eed640d733682ef8012f1a08c4c4aa60d88bb28d6bb9147cb0d1fbd8e8d599fa041", "954f6df7b4ce7f56232a797f7c5bc911a885e526ef92ca072060227ba97280fc4df3b8b605b99cfd24e1415718a630c174607e46ce0112eadf3c88151dd063d8", "8612bbb29dc17e584d817b90475b098646ea756f159249af410071c400afa911865cd55cf9cec9f9c0db95b00734885863ab75b1bdfd0e0c73f60b4ac680a483", "622821fb262951a2275c20b8fca4eac7c33a8044f0d984e054c3df4acf4da8a914058dc563d96f8f6985b1ac64d8cf93b1c54581e0c909a8011d69c6a740c739", "37290d8ba66e0b9f4f7398123c55c5a834e498aa030edb1109cb26b28d85ca2663a876c6cf7510a6ef1174b861905df76617fe7e2c7ab037d0f1539f26cf278e", "5751cfdf656f2a5ee021940c5448a77e5b921d1510d2abfa520a57d02c74821e0f5c2e4935bea2554c440072d32fc22bb8317a85dabbbc7c9cca9d1c077793c2", "398acd50a4931316d4c5270cfd4fbf56032565a5e4d7265eb29deb1595ccb5cf748667ab39fa02364669d217d871d5308b6a2ad7a9b7fafdda6abe709012d191", "2d48ff0bbea51fc61aeb7dbae99f4252f65f8d37b0c949cec40bb788ebc6dfe6f8aa02ee12804d7fb34fdd5def7b28e2de9e66d282acf8f85750edfd3b90e9f2", "31c57d6d321a0affff1e2c880c6f5da7d82ab24f0ac2dae6876d17b1f6cc7266a0b046048ee493ffdbc5313ecc05d560b0971dfd0fce9ee36483e18e3e5f1283", "934573215329244f74ad7090604d4410732559ea0c981c466d6dee13a2ba6b8533d9d7171af6f5fb7ae07377a495cee8b77e5da1dc1eca68158b3c76395f8b1c", "4f60446b97ce44b297e17e334de6d870f5de38ef1d74b2557f3bf684178363d319a5d0e4147f4cc55784e79a8f4eb2de0b6448fc103f5266b8c78b8bd2bca783" ] } ``` ## Error 
message ``` Traceback (most recent call last): File "/home/nick/.local/share/virtualenvs/quark-engine-lddAoX17/bin/quark", line 11, in <module> load_entry_point('quark-engine', 'console_scripts', 'quark')() File "/home/nick/.local/share/virtualenvs/quark-engine-lddAoX17/lib/python3.7/site-packages/click/core.py", line 829, in __call__ return self.main(*args, **kwargs) File "/home/nick/.local/share/virtualenvs/quark-engine-lddAoX17/lib/python3.7/site-packages/click/core.py", line 782, in main rv = self.invoke(ctx) File "/home/nick/.local/share/virtualenvs/quark-engine-lddAoX17/lib/python3.7/site-packages/click/core.py", line 1066, in invoke return ctx.invoke(self.callback, **ctx.params) File "/home/nick/.local/share/virtualenvs/quark-engine-lddAoX17/lib/python3.7/site-packages/click/core.py", line 610, in invoke return callback(*args, **kwargs) File "/home/nick/quark/quark-engine/quark/cli.py", line 93, in entry_point data.run(rule_checker) File "/home/nick/quark/quark-engine/quark/Objects/quark.py", line 304, in run if self.check_parameter(common_method, str(pre_0[1]), str(pre_1[1])): File "/home/nick/quark/quark-engine/quark/Objects/quark.py", line 210, in check_parameter pyeval.eval[instruction[0]](instruction) File "/home/nick/quark/quark-engine/quark/Evaluator/pyeval.py", line 28, in warp func(*args, **kwargs) File "/home/nick/quark/quark-engine/quark/Evaluator/pyeval.py", line 251, in CONST_SIXTEEN self._assign_value(instruction) File "/home/nick/quark/quark-engine/quark/Evaluator/pyeval.py", line 113, in _assign_value self.table_obj.insert(index, variable_object) File "/home/nick/quark/quark-engine/quark/Objects/tableobject.py", line 28, in insert self.hash_table[index].append(var_obj) IndexError: list index out of range ```
label: process
text:
error occurred while analyzing apk description error occurred while analyzing apk with the rule rule content crime steal sensitive information get directory info of the sd cards and active network info permission android permission write external storage android permission access network state comb class landroid os environment method getexternalstoragedirectory class landroid net connectivitymanager method getactivenetworkinfo yscore sample error message traceback most recent call last file home nick local share virtualenvs quark engine bin quark line in load entry point quark engine console scripts quark file home nick local share virtualenvs quark engine lib site packages click core py line in call return self main args kwargs file home nick local share virtualenvs quark engine lib site packages click core py line in main rv self invoke ctx file home nick local share virtualenvs quark engine lib site packages click core py line in invoke return ctx invoke self callback ctx params file home nick local share virtualenvs quark engine lib site packages click core py line in invoke return callback args kwargs file home nick quark quark engine quark cli py line in entry point data run rule checker file home nick quark quark engine quark objects quark py line in run if self check parameter common method str pre str pre file home nick quark quark engine quark objects quark py line in check parameter pyeval eval instruction file home nick quark quark engine quark evaluator pyeval py line in warp func args kwargs file home nick quark quark engine quark evaluator pyeval py line in const sixteen self assign value instruction file home nick quark quark engine quark evaluator pyeval py line in assign value self table obj insert index variable object file home nick quark quark engine quark objects tableobject py line in insert self hash table append var obj indexerror list index out of range
binary_label: 1

Unnamed: 0: 700,000
id: 24,041,148,644
type: IssuesEvent
created_at: 2022-09-16 02:01:33
repo: googleapis/gapic-generator-ruby
repo_url: https://api.github.com/repos/googleapis/gapic-generator-ruby
action: opened
title: Spurious cross-reference YARD conversions happen when the target is in a gem dependency
labels: type: bug priority: p2
body:
This appears in `google-cloud-firestore-admin-v1/proto_docs/google/firestore/admin/v1/location.rb` It is currently converting `[google.cloud.location.Location.metadata][google.cloud.location.Location.metadata]` to `{::Google::Cloud::Location::Location#metadata google.cloud.location.Location.metadata}`, but the class is defined in a dependency (google-cloud-location); hence YARD tests fail. (We are currently working around this with custom owlbot scripts.)
index: 1.0
text_combine:
Spurious cross-reference YARD conversions happen when the target is in a gem dependency - This appears in `google-cloud-firestore-admin-v1/proto_docs/google/firestore/admin/v1/location.rb` It is currently converting `[google.cloud.location.Location.metadata][google.cloud.location.Location.metadata]` to `{::Google::Cloud::Location::Location#metadata google.cloud.location.Location.metadata}`, but the class is defined in a dependency (google-cloud-location); hence YARD tests fail. (We are currently working around this with custom owlbot scripts.)
label: non_process
text:
spurious cross reference yard conversions happen when the target is in a gem dependency this appears in google cloud firestore admin proto docs google firestore admin location rb it is currently converting to google cloud location location metadata google cloud location location metadata but the class is defined in a dependency google cloud location hence yard tests fail we are currently working around this with custom owlbot scripts
binary_label: 0

Unnamed: 0: 3,897
id: 6,821,572,542
type: IssuesEvent
created_at: 2017-11-07 17:08:37
repo: WormBase/wormbase-pipeline
repo_url: https://api.github.com/repos/WormBase/wormbase-pipeline
action: closed
title: Knocking in human variants into the worm ortholog - Ranjana
labels: MODELS Processing
body:
This is for Tim Schedl’s point, relevant quote below: "However, there are already a number of published examples of knocking in human variants into the worm ortholog. Examples below. (In this case, the allele designation used would just be the existing designation of the worm lab.) The important point is that the disease associated allele is set aside and advertised as such with the use of headings." For now the best way I can think of doing this (without Hinxton getting involved at the mapping between worm and human at the sequence level) is to make use of the simple ?Text with a disease_association tag in the Disease_model_annotation model. In the inline ?Text I can put in text such as “Contains mutation equivalent to the human blah mutation” or “contains human blah variant" After all if authors say they knocked in human variants in the worm, then this data is tied to the paper and the experiment and this is what the Disease_model_annotation class sets out to capture. So in ?Disease_model_annotation, we currently have the following: Modeled_by Strain UNIQUE ?Strain can I please have: Modeled_by Strain UNIQUE ?Strain Strain_disease_association ?Text Similarly add to the Variation, Transgene and Disease_relevant_gene lines, so Modeled_by Variation UNIQUE ?Variation Variation_disease_association ?Text Transgene UNIQUE ?Transgene Transgene_disease_association ?Text Disease_relevant_gene UNIQUE ? Gene_disease_association ?Text Hope this works?
index: 1.0
text_combine:
Knocking in human variants into the worm ortholog - Ranjana - This is for Tim Schedl’s point, relevant quote below: "However, there are already a number of published examples of knocking in human variants into the worm ortholog. Examples below. (In this case, the allele designation used would just be the existing designation of the worm lab.) The important point is that the disease associated allele is set aside and advertised as such with the use of headings." For now the best way I can think of doing this (without Hinxton getting involved at the mapping between worm and human at the sequence level) is to make use of the simple ?Text with a disease_association tag in the Disease_model_annotation model. In the inline ?Text I can put in text such as “Contains mutation equivalent to the human blah mutation” or “contains human blah variant" After all if authors say they knocked in human variants in the worm, then this data is tied to the paper and the experiment and this is what the Disease_model_annotation class sets out to capture. So in ?Disease_model_annotation, we currently have the following: Modeled_by Strain UNIQUE ?Strain can I please have: Modeled_by Strain UNIQUE ?Strain Strain_disease_association ?Text Similarly add to the Variation, Transgene and Disease_relevant_gene lines, so Modeled_by Variation UNIQUE ?Variation Variation_disease_association ?Text Transgene UNIQUE ?Transgene Transgene_disease_association ?Text Disease_relevant_gene UNIQUE ? Gene_disease_association ?Text Hope this works?
process
knocking in human variants into the worm ortholog ranjana this is for tim schedl’s point relevant quote below however there are already a number of published examples of knocking in human variants into the worm ortholog examples below in this case the allele designation used would just be the existing designation of the worm lab the important point is that the disease associated allele is set aside and advertised as such with the use of headings for now the best way i can think of doing this without hinxton getting involved at the mapping between worm and human at the sequence level is to make use of the simple text with a disease association tag in the disease model annotation model in the inline text i can put in text such as “contains mutation equivalent to the human blah mutation” or “contains human blah variant after all if authors say they knocked in human variants in the worm then this data is tied to the paper and the experiment and this is what the disease model annotation class sets out to capture so in disease model annotation we currently have the following modeled by strain unique strain can i please have modeled by strain unique strain strain disease association text similarly add to the variation transgene and disease relevant gene lines so modeled by variation unique variation variation disease association text transgene unique transgene transgene disease association text disease relevant gene unique gene disease association text hope this works
1
2,845
5,808,169,366
IssuesEvent
2017-05-04 09:55:04
Hurence/logisland
https://api.github.com/repos/Hurence/logisland
closed
add Netflow telemetry Processor
cyber-security feature processor
find the right external to collect netflow data into Kafka write ParseNetflowProcessor to handle this data source write Avro schema wich conforms to Spot Open Data Model - code / test / doc - tutorial - kibana dashboard
1.0
add Netflow telemetry Processor - find the right external to collect netflow data into Kafka write ParseNetflowProcessor to handle this data source write Avro schema wich conforms to Spot Open Data Model - code / test / doc - tutorial - kibana dashboard
process
add netflow telemetry processor find the right external to collect netflow data into kafka write parsenetflowprocessor to handle this data source write avro schema wich conforms to spot open data model code test doc tutorial kibana dashboard
1
20,156
26,708,115,229
IssuesEvent
2023-01-27 20:14:10
eosnetworkfoundation/devrel
https://api.github.com/repos/eosnetworkfoundation/devrel
closed
Define create/update/rename process for content using GitHub
Process
Document a process to create/update/rename AC: - All documents need to provide an example of doing this internally and externally - Define the process for creating a new document - Define the process for updating existing documents - Define the process for renaming a document - Place process in devrel repo
1.0
Define create/update/rename process for content using GitHub - Document a process to create/update/rename AC: - All documents need to provide an example of doing this internally and externally - Define the process for creating a new document - Define the process for updating existing documents - Define the process for renaming a document - Place process in devrel repo
process
define create update rename process for content using github document a process to create update rename ac all documents need to provide an example of doing this internally and externally define the process for creating a new document define the process for updating existing documents define the process for renaming a document place process in devrel repo
1
3,330
6,447,624,966
IssuesEvent
2017-08-14 08:17:46
geneontology/go-ontology
https://api.github.com/repos/geneontology/go-ontology
reopened
ntr tRNA 2'-O-methyltransferase
New term request PomBase RNA processes
I could only find tRNA (guanosine-2'-O-)-methyltransferase activity child of GO:0008171 O-methyltransferase activity GO:0008175 tRNA methyltransferase activity http://www.yeastgenome.org/reference/S000120470/overview (I'm annotating the ortholog)
1.0
ntr tRNA 2'-O-methyltransferase - I could only find tRNA (guanosine-2'-O-)-methyltransferase activity child of GO:0008171 O-methyltransferase activity GO:0008175 tRNA methyltransferase activity http://www.yeastgenome.org/reference/S000120470/overview (I'm annotating the ortholog)
process
ntr trna o methyltransferase i could only find trna guanosine o methyltransferase activity child of go o methyltransferase activity go trna methyltransferase activity i m annotating the ortholog
1
7,183
10,323,338,549
IssuesEvent
2019-08-31 20:28:38
rusty-snake/firejailed-tor-browser
https://api.github.com/repos/rusty-snake/firejailed-tor-browser
opened
firejailed-tor-browser.profile backporting
process tracking
# observation of backporting Previous issue: #6 ## 0.9.60 - [ ] seccomp changes ## 0.9.58 - [ ] seccomp changes ## 0.9.56 ALL ## 0.9.52 ALL
1.0
firejailed-tor-browser.profile backporting - # observation of backporting Previous issue: #6 ## 0.9.60 - [ ] seccomp changes ## 0.9.58 - [ ] seccomp changes ## 0.9.56 ALL ## 0.9.52 ALL
process
firejailed tor browser profile backporting observation of backporting previous issue seccomp changes seccomp changes all all
1
461,862
13,237,337,782
IssuesEvent
2020-08-18 21:28:14
Railcraft/Railcraft
https://api.github.com/repos/Railcraft/Railcraft
closed
java.lang.NullPointerException when loading railcraft with forge 14.23.5.2838
bug configuration low priority
**Description of the Bug** minecraft 1.12.2 using forge 14.23.5.2838 crashes an reports the following error while loading net.minecraftforge.fml.common.LoaderExceptionModCrash: Caught exception from Railcraft (railcraft) Caused by: java.lang.NullPointerException Please find the crash report below: ---- Minecraft Crash Report ---- WARNING: coremods are present: llibrary (llibrary-core-1.0.11-1.12.2.jar) IELoadingPlugin (ImmersiveEngineering-core-0.12-89.jar) weaponlib (mw_2.0-0.4.1_mc1.12.2.jar) TransformerLoader (OpenComputers-MC1.12.2-1.7.5.192.jar) AppleCore (AppleCore-mc1.12.2-3.2.0.jar) EnderCorePlugin (EnderCore-1.12.2.jar) TransformLoader (DynamicSurroundings-1.12.2-3.6.0.2.jar) Contact their authors BEFORE contacting forge // Surprise! Haha. Well, this is awkward. Time: 1/24/20 10:01 PM Description: There was a severe problem during mod loading that has caused the game to fail net.minecraftforge.fml.common.LoaderExceptionModCrash: Caught exception from Railcraft (railcraft) Caused by: java.lang.NullPointerException at mods.railcraft.common.fluids.RailcraftFluids$Def.initBlock(RailcraftFluids.java:96) at mods.railcraft.common.fluids.RailcraftFluids$Def.access$500(RailcraftFluids.java:78) at mods.railcraft.common.fluids.RailcraftFluids.init(RailcraftFluids.java:130) at mods.railcraft.common.fluids.RailcraftFluids.preInitFluids(RailcraftFluids.java:118) at mods.railcraft.common.modules.ModuleCore$1.construction(ModuleCore.java:103) at mods.railcraft.common.modules.RailcraftModulePayload$BaseModuleEventHandler.construction(RailcraftModulePayload.java:71) at mods.railcraft.common.modules.RailcraftModuleManager$Stage$1.passToModule(RailcraftModuleManager.java:293) at mods.railcraft.common.modules.RailcraftModuleManager.processStage(RailcraftModuleManager.java:250) at mods.railcraft.common.modules.RailcraftModuleManager.preInit(RailcraftModuleManager.java:198) at mods.railcraft.common.core.Railcraft.preInit(Railcraft.java:116) at 
sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:497) at net.minecraftforge.fml.common.FMLModContainer.handleModStateEvent(FMLModContainer.java:637) at sun.reflect.GeneratedMethodAccessor9.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:497) at com.google.common.eventbus.Subscriber.invokeSubscriberMethod(Subscriber.java:91) at com.google.common.eventbus.Subscriber$SynchronizedSubscriber.invokeSubscriberMethod(Subscriber.java:150) at com.google.common.eventbus.Subscriber$1.run(Subscriber.java:76) at com.google.common.util.concurrent.MoreExecutors$DirectExecutor.execute(MoreExecutors.java:399) at com.google.common.eventbus.Subscriber.dispatchEvent(Subscriber.java:71) at com.google.common.eventbus.Dispatcher$PerThreadQueuedDispatcher.dispatch(Dispatcher.java:116) at com.google.common.eventbus.EventBus.post(EventBus.java:217) at net.minecraftforge.fml.common.LoadController.sendEventToModContainer(LoadController.java:219) at net.minecraftforge.fml.common.LoadController.propogateStateMessage(LoadController.java:197) at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:497) at com.google.common.eventbus.Subscriber.invokeSubscriberMethod(Subscriber.java:91) at com.google.common.eventbus.Subscriber$SynchronizedSubscriber.invokeSubscriberMethod(Subscriber.java:150) at com.google.common.eventbus.Subscriber$1.run(Subscriber.java:76) at 
com.google.common.util.concurrent.MoreExecutors$DirectExecutor.execute(MoreExecutors.java:399) at com.google.common.eventbus.Subscriber.dispatchEvent(Subscriber.java:71) at com.google.common.eventbus.Dispatcher$PerThreadQueuedDispatcher.dispatch(Dispatcher.java:116) at com.google.common.eventbus.EventBus.post(EventBus.java:217) at net.minecraftforge.fml.common.LoadController.distributeStateMessage(LoadController.java:136) at net.minecraftforge.fml.common.Loader.preinitializeMods(Loader.java:627) at net.minecraftforge.fml.client.FMLClientHandler.beginMinecraftLoading(FMLClientHandler.java:252) at net.minecraft.client.Minecraft.func_71384_a(Minecraft.java:467) at net.minecraft.client.Minecraft.func_99999_d(Minecraft.java:378) at net.minecraft.client.main.Main.main(SourceFile:123) at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:497) at net.minecraft.launchwrapper.Launch.launch(Launch.java:135) at net.minecraft.launchwrapper.Launch.main(Launch.java:28) A detailed walkthrough of the error, its code path and all known details is as follows: --------------------------------------------------------------------------------------- -- System Details -- Details: Minecraft Version: 1.12.2 Operating System: Windows 10 (amd64) version 10.0 Java Version: 1.8.0_51, Oracle Corporation Java VM Version: Java HotSpot(TM) 64-Bit Server VM (mixed mode), Oracle Corporation Memory: 4661590680 bytes (4445 MB) / 6878658560 bytes (6560 MB) up to 17179869184 bytes (16384 MB) JVM Flags: 8 total; -XX:HeapDumpPath=MojangTricksIntelDriversForPerformance_javaw.exe_minecraft.exe.heapdump -Xmx16G -XX:+UnlockExperimentalVMOptions -XX:+UseG1GC -XX:G1NewSizePercent=20 -XX:G1ReservePercent=20 -XX:MaxGCPauseMillis=50 -XX:G1HeapRegionSize=16M IntCache: cache: 0, tcache: 0, 
allocated: 0, tallocated: 0 FML: MCP 9.42 Powered by Forge 14.23.5.2838 Optifine OptiFine_1.12.2_HD_U_F4 76 mods loaded, 76 mods active States: 'U' = Unloaded 'L' = Loaded 'C' = Constructed 'H' = Pre-initialized 'I' = Initialized 'J' = Post-initialized 'A' = Available 'D' = Disabled 'E' = Errored | State | ID | Version | Source | Signature | |:----- |:--------------------------------- |:--------------- |:----------------------------------------------- |:---------------------------------------- | | LCH | minecraft | 1.12.2 | minecraft.jar | None | | LCH | mcp | 9.42 | minecraft.jar | None | | LCH | FML | 8.0.99.99 | forge-1.12.2-14.23.5.2838.jar | e3c3d50c7c986df74c645c0ac54639741c90a557 | | LCH | forge | 14.23.5.2838 | forge-1.12.2-14.23.5.2838.jar | e3c3d50c7c986df74c645c0ac54639741c90a557 | | LCH | opencomputers|core | 1.7.5.192 | minecraft.jar | None | | LCH | actuallyadditions | 1.12.2-r151 | ActuallyAdditions-1.12.2-r151.jar | None | | LCH | applecore | 3.2.0 | AppleCore-mc1.12.2-3.2.0.jar | None | | LCH | architecturecraft | @VERSION@ | architecturecraft-1.12-3.98.jar | None | | LCH | autowalk | 1.1 | autowalk-1.12.2-1.0.jar | None | | LCH | backpack | 3.0.2 | backpack-3.0.2-1.12.2.jar | None | | LCH | chameleon | 1.12-4.1.3 | Chameleon-1.12-4.1.3.jar | None | | LCH | codechickenlib | 3.2.3.358 | CodeChickenLib-1.12.2-3.2.3.358-universal.jar | f1850c39b2516232a2108a7bd84d1cb5df93b261 | | LCH | chickenchunks | 2.4.2.74 | ChickenChunks-1.12.2-2.4.2.74-universal.jar | f1850c39b2516232a2108a7bd84d1cb5df93b261 | | LCH | chiselsandbits | 14.22 | Chisels-and-Bits-Mod-1.12.2.jar | None | | LCH | clockhud | 1.4.0 | ClockHUD-1.12-1.4.0.jar | None | | LCH | redstoneflux | 2.1.0 | RedstoneFlux-1.12-2.1.0.6-universal.jar | 8a6abf2cb9e141b866580d369ba6548732eff25f | | LCH | cofhcore | 4.6.3 | CoFHCore-1.12.2-4.6.3.27-universal.jar | None | | LCH | cofhworld | 1.3.1 | CoFHWorld-1.12.2-1.3.1.7-universal.jar | 8a6abf2cb9e141b866580d369ba6548732eff25f | | LCH | comforts | 
1.4.1.2 | comforts-1.12.2-1.4.1.2.jar | 5d5b8aee896a4f5ea3f3114784742662a67ad32f | | LCH | corail_pillar_extension_chisel | 4.0.0 | corail_pillar_extension_chisel-4.0.0-1.12.jar | None | | LCH | jei | 4.15.0.292 | jei_1.12.2-4.15.0.292.jar | None | | LCH | forestry | 5.8.0.311 | Forestry-Mod-1.12.2.jar | None | | LCH | corail_pillar_extension_forestry | 4.0.0 | corail_pillar_extension_forestry-4.0.0-1.12.jar | None | | LCH | extendedrenderer | v1.0 | coroutil-1.12.1-1.2.32.jar | None | | LCH | coroutil | 1.12.1-1.2.32 | coroutil-1.12.1-1.2.32.jar | None | | LCH | configmod | v1.0 | coroutil-1.12.1-1.2.32.jar | None | | LCH | cyclicmagic | 1.19.18 | Cyclic-1.12.2-1.19.18.jar | None | | LCH | movingworld | 1.12-2.337 | movingworld-1.12-2.337-full.jar | None | | LCH | davincisvessels | @DVESSELSVER@ | Davincis-Vessels-Mod-1.12.2.jar | None | | LCH | ptrmodellib | 1.0.4 | PTRLib-1.0.4.jar | None | | LCH | props | 2.6.3 | Decocraft-2.6.3_1.12.2.jar | None | | LCH | llibrary | 1.7.19 | llibrary-1.7.19-1.12.2.jar | b9f30a813bee3b9dd5652c460310cfcd54f6b7ec | | LCH | dragonmounts | 1.12.2-1.4.4 | DragonMounts-1.12.2-1.4.4.jar | None | | LCH | orelib | 3.6.0.1 | OreLib-1.12.2-3.6.0.1.jar | 7a2128d395ad96ceb9d9030fbd41d035b435753a | | LCH | dsurround | 3.6.0.2 | DynamicSurroundings-1.12.2-3.6.0.2.jar | 7a2128d395ad96ceb9d9030fbd41d035b435753a | | LCH | easyretrogen | 5.0.1 | EasyRetrogen-1.12.2-5.0.1-universal.jar | None | | LCH | endercore | 1.12.2-0.5.37 | EnderCore-1.12.2.jar | None | | LCH | thermalfoundation | 2.6.3 | ThermalFoundation-1.12.2-2.6.3.27-universal.jar | 8a6abf2cb9e141b866580d369ba6548732eff25f | | LCH | thermalexpansion | 5.5.4 | ThermalExpansion-1.12.2-5.5.4.43-universal.jar | 8a6abf2cb9e141b866580d369ba6548732eff25f | | LCH | enderio | 5.0.25 | Ender-IO-Mod-1.12.2.jar | None | | LCH | enderiointegrationtic | 5.0.25 | Ender-IO-Mod-1.12.2.jar | None | | LCH | enderiobase | 5.0.25 | Ender-IO-Mod-1.12.2.jar | None | | LCH | enderioconduits | 5.0.25 | 
Ender-IO-Mod-1.12.2.jar | None | | LCH | enderioconduitsappliedenergistics | 5.0.25 | Ender-IO-Mod-1.12.2.jar | None | | LCH | enderioconduitsopencomputers | 5.0.25 | Ender-IO-Mod-1.12.2.jar | None | | LCH | enderioconduitsrefinedstorage | 5.0.25 | Ender-IO-Mod-1.12.2.jar | None | | LCH | enderiointegrationforestry | 5.0.25 | Ender-IO-Mod-1.12.2.jar | None | | LCH | enderiointegrationticlate | 5.0.25 | Ender-IO-Mod-1.12.2.jar | None | | LCH | enderiomachines | 5.0.25 | Ender-IO-Mod-1.12.2.jar | None | | LCH | enderiopowertools | 5.0.25 | Ender-IO-Mod-1.12.2.jar | None | | LCH | opencomputers | 1.7.5.192 | OpenComputers-MC1.12.2-1.7.5.192.jar | None | | LCH | zerocore | 1.12.2-0.1.2.8 | zerocore-1.12.2-0.1.2.8.jar | None | | LCH | bigreactors | 1.12.2-0.4.5.67 | ExtremeReactors-1.12.2-0.4.5.67.jar | None | | LCH | cfm | 6.3.1 | furniture-6.3.1-1.12.2.jar | None | | LCH | healthhungertweaks | 3.0.1 | Health-and-Hunger-Tweaks-Mod-1.12.2.jar | None | | LCH | mts | 15.5.0 | Immersive+Vehicles-1.12.2-15.5.0.jar | None | | LCE | railcraft | 12.0.0-beta-5 | railcraft-12.0.0-beta-5.jar | a0c255ac501b2749537d5824bb0f0588bf0320fa | | LC | immersiveengineering | 0.12-89 | Immersive-Engineering-Mod-1.12.2.jar | 4cb49fcde3b43048c9889e0a3d083225da926334 | | LC | immersivepetroleum | 1.1.9 | Immersive-Petroleum-Mod-1.12.2.jar | None | | LC | trackapi | 1.1 | TrackAPI-1.1_1.12.jar | None | | LC | immersiverailroading | 1.4.0 | Immersive-Railroading-Mod-1.12.2.jar | None | | LC | journeymap | 1.12.2-5.5.3 | JourneyMap-1.12.2.jar | None || LC | landlust | 1.12.2-0.3.6 | Landlust-Furniture-Mod-1.12.2.jar | None | | LC | longerdays | 1.0.4 | longerdays-1.0.4.jar | None | | LC | radixcore | 1.12.x-2.2.1 | RadixCore-1.12.x-2.2.1-universal.jar | None | | LC | mca | 1.12.2-5.3.1 | Minecraft-Comes-Alive-Mod-1.12.2.jar | None | | LC | modernlamps | 1.0.3 | ModernLights-1.0.3_1.12.jar | None | | LC | mtsofficialpack | 12.0.0 | MTS_Official_Pack_V12.jar | None | | LC | mw | 0.4.1 | 
mw_2.0-0.4.1_mc1.12.2.jar | None | | LC | stacksize | 1.1 | stacksize-1.12.2-1.0.jar | None | | LC | storagedrawers | 1.12.2-5.4.0 | StorageDrawers-1.12.2-5.4.0.jar | None | | LC | storagedrawersextra | @VERSION@ | StorageDrawersExtras-1.12-3.1.0.jar | None | | LC | thermaldynamics | 2.5.5 | ThermalDynamics-1.12.2-2.5.5.21-universal.jar | 8a6abf2cb9e141b866580d369ba6548732eff25f | | LC | treechopper | 1.2.4 | TreeChopper-1.12.2-1.2.4.jar | None | | LC | weather2 | 1.12.1-2.6.12 | weather2-1.12.1-2.6.12.jar | None | | LC | corail_pillar | 4.2.1 | corail_pillar-4.2.1-1.12.jar | None | Loaded coremods (and transformers): llibrary (llibrary-core-1.0.11-1.12.2.jar) net.ilexiconn.llibrary.server.core.plugin.LLibraryTransformer net.ilexiconn.llibrary.server.core.patcher.LLibraryRuntimePatcher IELoadingPlugin (ImmersiveEngineering-core-0.12-89.jar) blusunrize.immersiveengineering.common.asm.IEClassTransformer weaponlib (mw_2.0-0.4.1_mc1.12.2.jar) com.vicmatskiv.weaponlib.core.WeaponlibClassTransformer TransformerLoader (OpenComputers-MC1.12.2-1.7.5.192.jar) li.cil.oc.common.asm.ClassTransformer AppleCore (AppleCore-mc1.12.2-3.2.0.jar) squeek.applecore.asm.TransformerModuleHandler EnderCorePlugin (EnderCore-1.12.2.jar) com.enderio.core.common.transform.EnderCoreTransformer com.enderio.core.common.transform.SimpleMixinPatcher TransformLoader (DynamicSurroundings-1.12.2-3.6.0.2.jar) GL info: ' Vendor: 'ATI Technologies Inc.' Version: '4.6.13571 Compatibility Profile Context 19.9.2 26.20.13003.1007' Renderer: 'AMD Radeon (TM) R9 390 Series' OptiFine Version: OptiFine_1.12.2_HD_U_F4 OptiFine Build: 20191023-234904 Render Distance Chunks: 8 Mipmaps: 4 Anisotropic Filtering: 1 Antialiasing: 0 Multitexture: false Shaders: SEUS-v11.0.zip OpenGlVersion: 4.6.13571 Compatibility Profile Context 19.9.2 26.20.13003.1007 OpenGlRenderer: AMD Radeon (TM) R9 390 Series OpenGlVendor: ATI Technologies Inc. 
CpuCount: 4 I understand through some on-line trouble shooting that there may be a resource conflict between minecraft and another mod. But I am not seeing that in the error report. It seems to be an issue with java. Any help would be appreciated, since I can not seem to run minecraft at all with railcraft.
1.0
java.lang.NullPointerException when loading railcraft with forge 14.23.5.2838 - **Description of the Bug** minecraft 1.12.2 using forge 14.23.5.2838 crashes an reports the following error while loading net.minecraftforge.fml.common.LoaderExceptionModCrash: Caught exception from Railcraft (railcraft) Caused by: java.lang.NullPointerException Please find the crash report below: ---- Minecraft Crash Report ---- WARNING: coremods are present: llibrary (llibrary-core-1.0.11-1.12.2.jar) IELoadingPlugin (ImmersiveEngineering-core-0.12-89.jar) weaponlib (mw_2.0-0.4.1_mc1.12.2.jar) TransformerLoader (OpenComputers-MC1.12.2-1.7.5.192.jar) AppleCore (AppleCore-mc1.12.2-3.2.0.jar) EnderCorePlugin (EnderCore-1.12.2.jar) TransformLoader (DynamicSurroundings-1.12.2-3.6.0.2.jar) Contact their authors BEFORE contacting forge // Surprise! Haha. Well, this is awkward. Time: 1/24/20 10:01 PM Description: There was a severe problem during mod loading that has caused the game to fail net.minecraftforge.fml.common.LoaderExceptionModCrash: Caught exception from Railcraft (railcraft) Caused by: java.lang.NullPointerException at mods.railcraft.common.fluids.RailcraftFluids$Def.initBlock(RailcraftFluids.java:96) at mods.railcraft.common.fluids.RailcraftFluids$Def.access$500(RailcraftFluids.java:78) at mods.railcraft.common.fluids.RailcraftFluids.init(RailcraftFluids.java:130) at mods.railcraft.common.fluids.RailcraftFluids.preInitFluids(RailcraftFluids.java:118) at mods.railcraft.common.modules.ModuleCore$1.construction(ModuleCore.java:103) at mods.railcraft.common.modules.RailcraftModulePayload$BaseModuleEventHandler.construction(RailcraftModulePayload.java:71) at mods.railcraft.common.modules.RailcraftModuleManager$Stage$1.passToModule(RailcraftModuleManager.java:293) at mods.railcraft.common.modules.RailcraftModuleManager.processStage(RailcraftModuleManager.java:250) at mods.railcraft.common.modules.RailcraftModuleManager.preInit(RailcraftModuleManager.java:198) at 
mods.railcraft.common.core.Railcraft.preInit(Railcraft.java:116) at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:497) at net.minecraftforge.fml.common.FMLModContainer.handleModStateEvent(FMLModContainer.java:637) at sun.reflect.GeneratedMethodAccessor9.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:497) at com.google.common.eventbus.Subscriber.invokeSubscriberMethod(Subscriber.java:91) at com.google.common.eventbus.Subscriber$SynchronizedSubscriber.invokeSubscriberMethod(Subscriber.java:150) at com.google.common.eventbus.Subscriber$1.run(Subscriber.java:76) at com.google.common.util.concurrent.MoreExecutors$DirectExecutor.execute(MoreExecutors.java:399) at com.google.common.eventbus.Subscriber.dispatchEvent(Subscriber.java:71) at com.google.common.eventbus.Dispatcher$PerThreadQueuedDispatcher.dispatch(Dispatcher.java:116) at com.google.common.eventbus.EventBus.post(EventBus.java:217) at net.minecraftforge.fml.common.LoadController.sendEventToModContainer(LoadController.java:219) at net.minecraftforge.fml.common.LoadController.propogateStateMessage(LoadController.java:197) at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:497) at com.google.common.eventbus.Subscriber.invokeSubscriberMethod(Subscriber.java:91) at com.google.common.eventbus.Subscriber$SynchronizedSubscriber.invokeSubscriberMethod(Subscriber.java:150) at com.google.common.eventbus.Subscriber$1.run(Subscriber.java:76) at 
com.google.common.util.concurrent.MoreExecutors$DirectExecutor.execute(MoreExecutors.java:399) at com.google.common.eventbus.Subscriber.dispatchEvent(Subscriber.java:71) at com.google.common.eventbus.Dispatcher$PerThreadQueuedDispatcher.dispatch(Dispatcher.java:116) at com.google.common.eventbus.EventBus.post(EventBus.java:217) at net.minecraftforge.fml.common.LoadController.distributeStateMessage(LoadController.java:136) at net.minecraftforge.fml.common.Loader.preinitializeMods(Loader.java:627) at net.minecraftforge.fml.client.FMLClientHandler.beginMinecraftLoading(FMLClientHandler.java:252) at net.minecraft.client.Minecraft.func_71384_a(Minecraft.java:467) at net.minecraft.client.Minecraft.func_99999_d(Minecraft.java:378) at net.minecraft.client.main.Main.main(SourceFile:123) at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:497) at net.minecraft.launchwrapper.Launch.launch(Launch.java:135) at net.minecraft.launchwrapper.Launch.main(Launch.java:28) A detailed walkthrough of the error, its code path and all known details is as follows: --------------------------------------------------------------------------------------- -- System Details -- Details: Minecraft Version: 1.12.2 Operating System: Windows 10 (amd64) version 10.0 Java Version: 1.8.0_51, Oracle Corporation Java VM Version: Java HotSpot(TM) 64-Bit Server VM (mixed mode), Oracle Corporation Memory: 4661590680 bytes (4445 MB) / 6878658560 bytes (6560 MB) up to 17179869184 bytes (16384 MB) JVM Flags: 8 total; -XX:HeapDumpPath=MojangTricksIntelDriversForPerformance_javaw.exe_minecraft.exe.heapdump -Xmx16G -XX:+UnlockExperimentalVMOptions -XX:+UseG1GC -XX:G1NewSizePercent=20 -XX:G1ReservePercent=20 -XX:MaxGCPauseMillis=50 -XX:G1HeapRegionSize=16M IntCache: cache: 0, tcache: 0, 
allocated: 0, tallocated: 0 FML: MCP 9.42 Powered by Forge 14.23.5.2838 Optifine OptiFine_1.12.2_HD_U_F4 76 mods loaded, 76 mods active States: 'U' = Unloaded 'L' = Loaded 'C' = Constructed 'H' = Pre-initialized 'I' = Initialized 'J' = Post-initialized 'A' = Available 'D' = Disabled 'E' = Errored | State | ID | Version | Source | Signature | |:----- |:--------------------------------- |:--------------- |:----------------------------------------------- |:---------------------------------------- | | LCH | minecraft | 1.12.2 | minecraft.jar | None | | LCH | mcp | 9.42 | minecraft.jar | None | | LCH | FML | 8.0.99.99 | forge-1.12.2-14.23.5.2838.jar | e3c3d50c7c986df74c645c0ac54639741c90a557 | | LCH | forge | 14.23.5.2838 | forge-1.12.2-14.23.5.2838.jar | e3c3d50c7c986df74c645c0ac54639741c90a557 | | LCH | opencomputers|core | 1.7.5.192 | minecraft.jar | None | | LCH | actuallyadditions | 1.12.2-r151 | ActuallyAdditions-1.12.2-r151.jar | None | | LCH | applecore | 3.2.0 | AppleCore-mc1.12.2-3.2.0.jar | None | | LCH | architecturecraft | @VERSION@ | architecturecraft-1.12-3.98.jar | None | | LCH | autowalk | 1.1 | autowalk-1.12.2-1.0.jar | None | | LCH | backpack | 3.0.2 | backpack-3.0.2-1.12.2.jar | None | | LCH | chameleon | 1.12-4.1.3 | Chameleon-1.12-4.1.3.jar | None | | LCH | codechickenlib | 3.2.3.358 | CodeChickenLib-1.12.2-3.2.3.358-universal.jar | f1850c39b2516232a2108a7bd84d1cb5df93b261 | | LCH | chickenchunks | 2.4.2.74 | ChickenChunks-1.12.2-2.4.2.74-universal.jar | f1850c39b2516232a2108a7bd84d1cb5df93b261 | | LCH | chiselsandbits | 14.22 | Chisels-and-Bits-Mod-1.12.2.jar | None | | LCH | clockhud | 1.4.0 | ClockHUD-1.12-1.4.0.jar | None | | LCH | redstoneflux | 2.1.0 | RedstoneFlux-1.12-2.1.0.6-universal.jar | 8a6abf2cb9e141b866580d369ba6548732eff25f | | LCH | cofhcore | 4.6.3 | CoFHCore-1.12.2-4.6.3.27-universal.jar | None | | LCH | cofhworld | 1.3.1 | CoFHWorld-1.12.2-1.3.1.7-universal.jar | 8a6abf2cb9e141b866580d369ba6548732eff25f | | LCH | comforts | 
1.4.1.2 | comforts-1.12.2-1.4.1.2.jar | 5d5b8aee896a4f5ea3f3114784742662a67ad32f | | LCH | corail_pillar_extension_chisel | 4.0.0 | corail_pillar_extension_chisel-4.0.0-1.12.jar | None | | LCH | jei | 4.15.0.292 | jei_1.12.2-4.15.0.292.jar | None | | LCH | forestry | 5.8.0.311 | Forestry-Mod-1.12.2.jar | None | | LCH | corail_pillar_extension_forestry | 4.0.0 | corail_pillar_extension_forestry-4.0.0-1.12.jar | None | | LCH | extendedrenderer | v1.0 | coroutil-1.12.1-1.2.32.jar | None | | LCH | coroutil | 1.12.1-1.2.32 | coroutil-1.12.1-1.2.32.jar | None | | LCH | configmod | v1.0 | coroutil-1.12.1-1.2.32.jar | None | | LCH | cyclicmagic | 1.19.18 | Cyclic-1.12.2-1.19.18.jar | None | | LCH | movingworld | 1.12-2.337 | movingworld-1.12-2.337-full.jar | None | | LCH | davincisvessels | @DVESSELSVER@ | Davincis-Vessels-Mod-1.12.2.jar | None | | LCH | ptrmodellib | 1.0.4 | PTRLib-1.0.4.jar | None | | LCH | props | 2.6.3 | Decocraft-2.6.3_1.12.2.jar | None | | LCH | llibrary | 1.7.19 | llibrary-1.7.19-1.12.2.jar | b9f30a813bee3b9dd5652c460310cfcd54f6b7ec | | LCH | dragonmounts | 1.12.2-1.4.4 | DragonMounts-1.12.2-1.4.4.jar | None | | LCH | orelib | 3.6.0.1 | OreLib-1.12.2-3.6.0.1.jar | 7a2128d395ad96ceb9d9030fbd41d035b435753a | | LCH | dsurround | 3.6.0.2 | DynamicSurroundings-1.12.2-3.6.0.2.jar | 7a2128d395ad96ceb9d9030fbd41d035b435753a | | LCH | easyretrogen | 5.0.1 | EasyRetrogen-1.12.2-5.0.1-universal.jar | None | | LCH | endercore | 1.12.2-0.5.37 | EnderCore-1.12.2.jar | None | | LCH | thermalfoundation | 2.6.3 | ThermalFoundation-1.12.2-2.6.3.27-universal.jar | 8a6abf2cb9e141b866580d369ba6548732eff25f | | LCH | thermalexpansion | 5.5.4 | ThermalExpansion-1.12.2-5.5.4.43-universal.jar | 8a6abf2cb9e141b866580d369ba6548732eff25f | | LCH | enderio | 5.0.25 | Ender-IO-Mod-1.12.2.jar | None | | LCH | enderiointegrationtic | 5.0.25 | Ender-IO-Mod-1.12.2.jar | None | | LCH | enderiobase | 5.0.25 | Ender-IO-Mod-1.12.2.jar | None | | LCH | enderioconduits | 5.0.25 | 
Ender-IO-Mod-1.12.2.jar | None | | LCH | enderioconduitsappliedenergistics | 5.0.25 | Ender-IO-Mod-1.12.2.jar | None | | LCH | enderioconduitsopencomputers | 5.0.25 | Ender-IO-Mod-1.12.2.jar | None | | LCH | enderioconduitsrefinedstorage | 5.0.25 | Ender-IO-Mod-1.12.2.jar | None | | LCH | enderiointegrationforestry | 5.0.25 | Ender-IO-Mod-1.12.2.jar | None | | LCH | enderiointegrationticlate | 5.0.25 | Ender-IO-Mod-1.12.2.jar | None | | LCH | enderiomachines | 5.0.25 | Ender-IO-Mod-1.12.2.jar | None | | LCH | enderiopowertools | 5.0.25 | Ender-IO-Mod-1.12.2.jar | None | | LCH | opencomputers | 1.7.5.192 | OpenComputers-MC1.12.2-1.7.5.192.jar | None | | LCH | zerocore | 1.12.2-0.1.2.8 | zerocore-1.12.2-0.1.2.8.jar | None | | LCH | bigreactors | 1.12.2-0.4.5.67 | ExtremeReactors-1.12.2-0.4.5.67.jar | None | | LCH | cfm | 6.3.1 | furniture-6.3.1-1.12.2.jar | None | | LCH | healthhungertweaks | 3.0.1 | Health-and-Hunger-Tweaks-Mod-1.12.2.jar | None | | LCH | mts | 15.5.0 | Immersive+Vehicles-1.12.2-15.5.0.jar | None | | LCE | railcraft | 12.0.0-beta-5 | railcraft-12.0.0-beta-5.jar | a0c255ac501b2749537d5824bb0f0588bf0320fa | | LC | immersiveengineering | 0.12-89 | Immersive-Engineering-Mod-1.12.2.jar | 4cb49fcde3b43048c9889e0a3d083225da926334 | | LC | immersivepetroleum | 1.1.9 | Immersive-Petroleum-Mod-1.12.2.jar | None | | LC | trackapi | 1.1 | TrackAPI-1.1_1.12.jar | None | | LC | immersiverailroading | 1.4.0 | Immersive-Railroading-Mod-1.12.2.jar | None | | LC | journeymap | 1.12.2-5.5.3 | JourneyMap-1.12.2.jar | None || LC | landlust | 1.12.2-0.3.6 | Landlust-Furniture-Mod-1.12.2.jar | None | | LC | longerdays | 1.0.4 | longerdays-1.0.4.jar | None | | LC | radixcore | 1.12.x-2.2.1 | RadixCore-1.12.x-2.2.1-universal.jar | None | | LC | mca | 1.12.2-5.3.1 | Minecraft-Comes-Alive-Mod-1.12.2.jar | None | | LC | modernlamps | 1.0.3 | ModernLights-1.0.3_1.12.jar | None | | LC | mtsofficialpack | 12.0.0 | MTS_Official_Pack_V12.jar | None | | LC | mw | 0.4.1 | 
mw_2.0-0.4.1_mc1.12.2.jar | None | | LC | stacksize | 1.1 | stacksize-1.12.2-1.0.jar | None | | LC | storagedrawers | 1.12.2-5.4.0 | StorageDrawers-1.12.2-5.4.0.jar | None | | LC | storagedrawersextra | @VERSION@ | StorageDrawersExtras-1.12-3.1.0.jar | None | | LC | thermaldynamics | 2.5.5 | ThermalDynamics-1.12.2-2.5.5.21-universal.jar | 8a6abf2cb9e141b866580d369ba6548732eff25f | | LC | treechopper | 1.2.4 | TreeChopper-1.12.2-1.2.4.jar | None | | LC | weather2 | 1.12.1-2.6.12 | weather2-1.12.1-2.6.12.jar | None | | LC | corail_pillar | 4.2.1 | corail_pillar-4.2.1-1.12.jar | None | Loaded coremods (and transformers): llibrary (llibrary-core-1.0.11-1.12.2.jar) net.ilexiconn.llibrary.server.core.plugin.LLibraryTransformer net.ilexiconn.llibrary.server.core.patcher.LLibraryRuntimePatcher IELoadingPlugin (ImmersiveEngineering-core-0.12-89.jar) blusunrize.immersiveengineering.common.asm.IEClassTransformer weaponlib (mw_2.0-0.4.1_mc1.12.2.jar) com.vicmatskiv.weaponlib.core.WeaponlibClassTransformer TransformerLoader (OpenComputers-MC1.12.2-1.7.5.192.jar) li.cil.oc.common.asm.ClassTransformer AppleCore (AppleCore-mc1.12.2-3.2.0.jar) squeek.applecore.asm.TransformerModuleHandler EnderCorePlugin (EnderCore-1.12.2.jar) com.enderio.core.common.transform.EnderCoreTransformer com.enderio.core.common.transform.SimpleMixinPatcher TransformLoader (DynamicSurroundings-1.12.2-3.6.0.2.jar) GL info: ' Vendor: 'ATI Technologies Inc.' Version: '4.6.13571 Compatibility Profile Context 19.9.2 26.20.13003.1007' Renderer: 'AMD Radeon (TM) R9 390 Series' OptiFine Version: OptiFine_1.12.2_HD_U_F4 OptiFine Build: 20191023-234904 Render Distance Chunks: 8 Mipmaps: 4 Anisotropic Filtering: 1 Antialiasing: 0 Multitexture: false Shaders: SEUS-v11.0.zip OpenGlVersion: 4.6.13571 Compatibility Profile Context 19.9.2 26.20.13003.1007 OpenGlRenderer: AMD Radeon (TM) R9 390 Series OpenGlVendor: ATI Technologies Inc. 
CpuCount: 4 I understand through some on-line trouble shooting that there may be a resource conflict between minecraft and another mod. But I am not seeing that in the error report. It seems to be an issue with java. Any help would be appreciated, since I can not seem to run minecraft at all with railcraft.
non_process
java lang nullpointerexception when loading railcraft with forge description of the bug minecraft using forge crashes an reports the following error while loading net minecraftforge fml common loaderexceptionmodcrash caught exception from railcraft railcraft caused by java lang nullpointerexception please find the crash report below minecraft crash report warning coremods are present llibrary llibrary core jar ieloadingplugin immersiveengineering core jar weaponlib mw jar transformerloader opencomputers jar applecore applecore jar endercoreplugin endercore jar transformloader dynamicsurroundings jar contact their authors before contacting forge surprise haha well this is awkward time pm description there was a severe problem during mod loading that has caused the game to fail net minecraftforge fml common loaderexceptionmodcrash caught exception from railcraft railcraft caused by java lang nullpointerexception at mods railcraft common fluids railcraftfluids def initblock railcraftfluids java at mods railcraft common fluids railcraftfluids def access railcraftfluids java at mods railcraft common fluids railcraftfluids init railcraftfluids java at mods railcraft common fluids railcraftfluids preinitfluids railcraftfluids java at mods railcraft common modules modulecore construction modulecore java at mods railcraft common modules railcraftmodulepayload basemoduleeventhandler construction railcraftmodulepayload java at mods railcraft common modules railcraftmodulemanager stage passtomodule railcraftmodulemanager java at mods railcraft common modules railcraftmodulemanager processstage railcraftmodulemanager java at mods railcraft common modules railcraftmodulemanager preinit railcraftmodulemanager java at mods railcraft common core railcraft preinit railcraft java at sun reflect nativemethodaccessorimpl native method at sun reflect nativemethodaccessorimpl invoke nativemethodaccessorimpl java at sun reflect delegatingmethodaccessorimpl invoke 
delegatingmethodaccessorimpl java at java lang reflect method invoke method java at net minecraftforge fml common fmlmodcontainer handlemodstateevent fmlmodcontainer java at sun reflect invoke unknown source at sun reflect delegatingmethodaccessorimpl invoke delegatingmethodaccessorimpl java at java lang reflect method invoke method java at com google common eventbus subscriber invokesubscribermethod subscriber java at com google common eventbus subscriber synchronizedsubscriber invokesubscribermethod subscriber java at com google common eventbus subscriber run subscriber java at com google common util concurrent moreexecutors directexecutor execute moreexecutors java at com google common eventbus subscriber dispatchevent subscriber java at com google common eventbus dispatcher perthreadqueueddispatcher dispatch dispatcher java at com google common eventbus eventbus post eventbus java at net minecraftforge fml common loadcontroller sendeventtomodcontainer loadcontroller java at net minecraftforge fml common loadcontroller propogatestatemessage loadcontroller java at sun reflect nativemethodaccessorimpl native method at sun reflect nativemethodaccessorimpl invoke nativemethodaccessorimpl java at sun reflect delegatingmethodaccessorimpl invoke delegatingmethodaccessorimpl java at java lang reflect method invoke method java at com google common eventbus subscriber invokesubscribermethod subscriber java at com google common eventbus subscriber synchronizedsubscriber invokesubscribermethod subscriber java at com google common eventbus subscriber run subscriber java at com google common util concurrent moreexecutors directexecutor execute moreexecutors java at com google common eventbus subscriber dispatchevent subscriber java at com google common eventbus dispatcher perthreadqueueddispatcher dispatch dispatcher java at com google common eventbus eventbus post eventbus java at net minecraftforge fml common loadcontroller distributestatemessage loadcontroller java at net 
minecraftforge fml common loader preinitializemods loader java at net minecraftforge fml client fmlclienthandler beginminecraftloading fmlclienthandler java at net minecraft client minecraft func a minecraft java at net minecraft client minecraft func d minecraft java at net minecraft client main main main sourcefile at sun reflect nativemethodaccessorimpl native method at sun reflect nativemethodaccessorimpl invoke nativemethodaccessorimpl java at sun reflect delegatingmethodaccessorimpl invoke delegatingmethodaccessorimpl java at java lang reflect method invoke method java at net minecraft launchwrapper launch launch launch java at net minecraft launchwrapper launch main launch java a detailed walkthrough of the error its code path and all known details is as follows system details details minecraft version operating system windows version java version oracle corporation java vm version java hotspot tm bit server vm mixed mode oracle corporation memory bytes mb bytes mb up to bytes mb jvm flags total xx heapdumppath mojangtricksinteldriversforperformance javaw exe minecraft exe heapdump xx unlockexperimentalvmoptions xx xx xx xx maxgcpausemillis xx intcache cache tcache allocated tallocated fml mcp powered by forge optifine optifine hd u mods loaded mods active states u unloaded l loaded c constructed h pre initialized i initialized j post initialized a available d disabled e errored state id version source signature lch minecraft minecraft jar none lch mcp minecraft jar none lch fml forge jar lch forge forge jar lch opencomputers core minecraft jar none lch actuallyadditions actuallyadditions jar none lch applecore applecore jar none lch architecturecraft version architecturecraft jar none lch autowalk autowalk jar none lch backpack backpack jar none lch chameleon chameleon jar none lch codechickenlib codechickenlib universal jar lch chickenchunks chickenchunks universal jar lch chiselsandbits chisels and bits mod jar none lch clockhud clockhud jar none lch 
redstoneflux redstoneflux universal jar lch cofhcore cofhcore universal jar none lch cofhworld cofhworld universal jar lch comforts comforts jar lch corail pillar extension chisel corail pillar extension chisel jar none lch jei jei jar none lch forestry forestry mod jar none lch corail pillar extension forestry corail pillar extension forestry jar none lch extendedrenderer coroutil jar none lch coroutil coroutil jar none lch configmod coroutil jar none lch cyclicmagic cyclic jar none lch movingworld movingworld full jar none lch davincisvessels dvesselsver davincis vessels mod jar none lch ptrmodellib ptrlib jar none lch props decocraft jar none lch llibrary llibrary jar lch dragonmounts dragonmounts jar none lch orelib orelib jar lch dsurround dynamicsurroundings jar lch easyretrogen easyretrogen universal jar none lch endercore endercore jar none lch thermalfoundation thermalfoundation universal jar lch thermalexpansion thermalexpansion universal jar lch enderio ender io mod jar none lch enderiointegrationtic ender io mod jar none lch enderiobase ender io mod jar none lch enderioconduits ender io mod jar none lch enderioconduitsappliedenergistics ender io mod jar none lch enderioconduitsopencomputers ender io mod jar none lch enderioconduitsrefinedstorage ender io mod jar none lch enderiointegrationforestry ender io mod jar none lch enderiointegrationticlate ender io mod jar none lch enderiomachines ender io mod jar none lch enderiopowertools ender io mod jar none lch opencomputers opencomputers jar none lch zerocore zerocore jar none lch bigreactors extremereactors jar none lch cfm furniture jar none lch healthhungertweaks health and hunger tweaks mod jar none lch mts immersive vehicles jar none lce railcraft beta railcraft beta jar lc immersiveengineering immersive engineering mod jar lc immersivepetroleum immersive petroleum mod jar none lc trackapi trackapi jar none lc immersiverailroading immersive railroading mod jar none lc journeymap journeymap jar none 
lc landlust landlust furniture mod jar none lc longerdays longerdays jar none lc radixcore x radixcore x universal jar none lc mca minecraft comes alive mod jar none lc modernlamps modernlights jar none lc mtsofficialpack mts official pack jar none lc mw mw jar none lc stacksize stacksize jar none lc storagedrawers storagedrawers jar none lc storagedrawersextra version storagedrawersextras jar none lc thermaldynamics thermaldynamics universal jar lc treechopper treechopper jar none lc jar none lc corail pillar corail pillar jar none loaded coremods and transformers llibrary llibrary core jar net ilexiconn llibrary server core plugin llibrarytransformer net ilexiconn llibrary server core patcher llibraryruntimepatcher ieloadingplugin immersiveengineering core jar blusunrize immersiveengineering common asm ieclasstransformer weaponlib mw jar com vicmatskiv weaponlib core weaponlibclasstransformer transformerloader opencomputers jar li cil oc common asm classtransformer applecore applecore jar squeek applecore asm transformermodulehandler endercoreplugin endercore jar com enderio core common transform endercoretransformer com enderio core common transform simplemixinpatcher transformloader dynamicsurroundings jar gl info vendor ati technologies inc version compatibility profile context renderer amd radeon tm series optifine version optifine hd u optifine build render distance chunks mipmaps anisotropic filtering antialiasing multitexture false shaders seus zip openglversion compatibility profile context openglrenderer amd radeon tm series openglvendor ati technologies inc cpucount i understand through some on line trouble shooting that there may be a resource conflict between minecraft and another mod but i am not seeing that in the error report it seems to be an issue with java any help would be appreciated since i can not seem to run minecraft at all with railcraft
0
15,891
20,075,037,828
IssuesEvent
2022-02-04 11:43:39
climatepolicyradar/navigator
https://api.github.com/repos/climatepolicyradar/navigator
opened
Identify main document language
Document processing
Following detection of passage level language, Navigator should identify the main language for the document and store this in the database. The main language would be considered to be the language detected having the most passages in the document.
1.0
Identify main document language - Following detection of passage level language, Navigator should identify the main language for the document and store this in the database. The main language would be considered to be the language detected having the most passages in the document.
process
identify main document language following detection of passage level language navigator should identify the main language for the document and store this in the database the main language would be considered to be the language detected having the most passages in the document
1
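The Navigator record above specifies majority-vote selection of a document's main language from passage-level detections. A minimal sketch of that rule (the helper name is hypothetical, not Navigator's actual code):

```python
from collections import Counter

def main_language(passage_langs):
    """Return the language detected for the most passages, or None
    if no passage-level detections are available."""
    if not passage_langs:
        return None
    # most_common(1) gives [(language, count)] for the top language
    return Counter(passage_langs).most_common(1)[0][0]
```

Ties are broken by first-seen order (Counter preserves insertion order for equal counts), so a real implementation would want an explicit tie-breaking policy before writing the result to the database.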
52,491
12,974,002,395
IssuesEvent
2020-07-21 14:49:39
infor-design/enterprise
https://api.github.com/repos/infor-design/enterprise
closed
Date Format with no separator not supported in masking
[3] team: appbuilder type: bug :bug:
**Describe the bug** I want to use format yyyyMMdd in input-masking but I can not write anything in the textbox with this format. **To Reproduce** Steps to reproduce the behavior: 1. Have a textbox 2. Add a mask such as yyyyMMdd 3. Try to write anything in the textbox **Expected behavior** The input should follow the mask **Version** 4.27.4 **Platform** - Infor App Builder
1.0
Date Format with no separator not supported in masking - **Describe the bug** I want to use format yyyyMMdd in input-masking but I can not write anything in the textbox with this format. **To Reproduce** Steps to reproduce the behavior: 1. Have a textbox 2. Add a mask such as yyyyMMdd 3. Try to write anything in the textbox **Expected behavior** The input should follow the mask **Version** 4.27.4 **Platform** - Infor App Builder
non_process
date format with no separator not supported in masking describe the bug i want to use format yyyymmdd in input masking but i can not write anything in the textbox with this format to reproduce steps to reproduce the behavior have a textbox add a mask such as yyyymmdd try to write anything in the textbox expected behavior the input should follow the mask version platform infor app builder
0
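The record above reports that a separator-less yyyyMMdd mask rejects all input. As an illustration of what such a mask must accept, here is a small validity check for the complete pattern (a hypothetical sketch, not the Enterprise library's actual mask engine):

```python
import re

# yyyyMMdd with no separators: 4-digit year, month 01-12, day 01-31.
YYYYMMDD = re.compile(r"\d{4}(0[1-9]|1[0-2])(0[1-9]|[12]\d|3[01])")

def is_yyyymmdd(s):
    """True if `s` is an 8-digit date string in yyyyMMdd form."""
    return YYYYMMDD.fullmatch(s) is not None
```

Per-month day validity (e.g. 20200230) is beyond what the regex checks; a real mask would delegate that to a date parser after the shape check passes.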
15,105
18,844,149,697
IssuesEvent
2021-11-11 13:09:06
DSE511-Project3-Team/DSE511-Project-3-Code-Repo
https://api.github.com/repos/DSE511-Project3-Team/DSE511-Project-3-Code-Repo
opened
Exploratory Data Analysis (EDA)
Preprocess
Analyze accident data sets to summarize its main characteristics.
1.0
Exploratory Data Analysis (EDA) - Analyze accident data sets to summarize its main characteristics.
process
exploratory data analysis eda analyze accident data sets to summarize its main characteristics
1
8,058
11,222,198,018
IssuesEvent
2020-01-07 19:36:53
googleapis/google-cloud-go
https://api.github.com/repos/googleapis/google-cloud-go
opened
Unflake Spanner integration tests
api: spanner type: process
Flakes for multiple spanner integration tests over the past day: TestIntegration_DML: https://sponge.corp.google.com/target?id=88c22d5e-bb5c-4965-acb8-021c8346d4cc&target=cloud-devrel/client-libraries/go/google-cloud-go/continuous/go113&searchFor=&show=ALL&sortBy=STATUS TestBatchDML_TwoStatements: https://sponge.corp.google.com/target?id=d7c51dd9-ba08-47f6-9305-2ad6438e0fc0&target=cloud-devrel/client-libraries/go/google-cloud-go/continuous/go112&searchFor=&show=ALL&sortBy=STATUS TestIntegration_QueryExpressions: https://sponge.corp.google.com/target?id=cf295096-310f-4d5d-b806-a07a5bab09dd&target=cloud-devrel/client-libraries/go/google-cloud-go/continuous/go113&searchFor=&show=ALL&sortBy=STATUS multiple tests failed: https://sponge.corp.google.com/target?id=6106295f-6a4b-403f-b637-091ea191529b&target=cloud-devrel/client-libraries/go/google-cloud-go/continuous/go113&searchFor=&show=ALL&sortBy=STATUS Please fix these (and/or disable flaky tests for the time being if there's not a quick fix).
1.0
Unflake Spanner integration tests - Flakes for multiple spanner integration tests over the past day: TestIntegration_DML: https://sponge.corp.google.com/target?id=88c22d5e-bb5c-4965-acb8-021c8346d4cc&target=cloud-devrel/client-libraries/go/google-cloud-go/continuous/go113&searchFor=&show=ALL&sortBy=STATUS TestBatchDML_TwoStatements: https://sponge.corp.google.com/target?id=d7c51dd9-ba08-47f6-9305-2ad6438e0fc0&target=cloud-devrel/client-libraries/go/google-cloud-go/continuous/go112&searchFor=&show=ALL&sortBy=STATUS TestIntegration_QueryExpressions: https://sponge.corp.google.com/target?id=cf295096-310f-4d5d-b806-a07a5bab09dd&target=cloud-devrel/client-libraries/go/google-cloud-go/continuous/go113&searchFor=&show=ALL&sortBy=STATUS multiple tests failed: https://sponge.corp.google.com/target?id=6106295f-6a4b-403f-b637-091ea191529b&target=cloud-devrel/client-libraries/go/google-cloud-go/continuous/go113&searchFor=&show=ALL&sortBy=STATUS Please fix these (and/or disable flaky tests for the time being if there's not a quick fix).
process
unflake spanner integration tests flakes for multiple spanner integration tests over the past day testintegration dml testbatchdml twostatements testintegration queryexpressions multiple tests failed please fix these and or disable flaky tests for the time being if there s not a quick fix
1
5,891
8,709,157,532
IssuesEvent
2018-12-06 13:10:36
aiidateam/aiida_core
https://api.github.com/repos/aiidateam/aiida_core
closed
Remove entry point group from the `process_type` column
priority/important requires discussion topic/DatabaseSchemaAndOptimization topic/JobCalculationAndProcess
Currently, if the process has an associated entry point, the entry point string will be stored in the `process_type` column of the `Node` instance. The format of this string is `{entry_point_group}:{entry_point_name}`. However, since the group to which the entry point name should belong is already contained in the `type` of the node, this is redundant information. For example, if the entry point name is `quantumespresso.pw` and the type is `node.process.calculation.calcjob.CalcJobNode.`, we know the corresponding entry point group is `aiida.calculations`.
1.0
Remove entry point group from the `process_type` column - Currently, if the process has an associated entry point, the entry point string will be stored in the `process_type` column of the `Node` instance. The format of this string is `{entry_point_group}:{entry_point_name}`. However, since the group to which the entry point name should belong is already contained in the `type` of the node, this is redundant information. For example, if the entry point name is `quantumespresso.pw` and the type is `node.process.calculation.calcjob.CalcJobNode.`, we know the corresponding entry point group is `aiida.calculations`.
process
remove entry point group from the process type column currently if the process has an associated entry point the entry point string will be stored in the process type column of the node instance the format of this string is entry point group entry point name however since the group to which the entry point name should belong is already contained in the type of the node this is redundant information for example if the entry point name is quantumespresso pw and the type is node process calculation calcjob calcjobnode we know the corresponding entry point group is aiida calculations
1
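The AiiDA record above argues that the `{entry_point_group}:{entry_point_name}` format duplicates information already carried by the node's `type` column. A sketch of that redundancy (the mapping and helper are illustrative only, not AiiDA's implementation):

```python
# Illustrative: the entry point group is implied by the node type string,
# so only the name portion of `process_type` actually needs storing.
TYPE_TO_GROUP = {
    "node.process.calculation.calcjob.CalcJobNode.": "aiida.calculations",
}

def entry_point_name(process_type, node_type):
    """Drop the redundant group prefix, checking it against `node_type`."""
    group, sep, name = process_type.partition(":")
    if sep and TYPE_TO_GROUP.get(node_type) not in (None, group):
        raise ValueError(f"group {group!r} inconsistent with type {node_type!r}")
    return name if sep else process_type
```

With the example from the record, `"aiida.calculations:quantumespresso.pw"` on a `CalcJobNode` reduces to just `"quantumespresso.pw"` with no information lost.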
1,518
4,111,068,405
IssuesEvent
2016-06-07 03:21:21
nodejs/node
https://api.github.com/repos/nodejs/node
closed
child-process: data loss with piped stdout
child_process duplicate stream
* **Version**: v6.2.1 * **Platform**: Linux rachmaninoff 4.5.3-1-ARCH #1 SMP PREEMPT Sat May 7 20:43:57 CEST 2016 x86_64 GNU/Linux * **Subsystem**: child-process When the stdout of a spawned process is piped into a writable stream that doesn't read fast enough, and the spawned process exits, node will put it into flowing mode and data will be lost. I have the following minimal testcase that pipes a simple child process to a PassThrough stream, and only after the process has exited the through stream is piped to node's stdout. * **Expected output**: All numbers from 1 to 36000 * **Actual output**: On my machine it stops at around 32000 ```javascript const spawn = require('child_process').spawn; const stream = require('stream'); const through = new stream.PassThrough(); const p = spawn('seq', [ '36000' ]); p.on('exit', function(code) { setImmediate(function() { through.pipe(process.stdout); }); }); p.stdout.pipe(through); ```
1.0
child-process: data loss with piped stdout - * **Version**: v6.2.1 * **Platform**: Linux rachmaninoff 4.5.3-1-ARCH #1 SMP PREEMPT Sat May 7 20:43:57 CEST 2016 x86_64 GNU/Linux * **Subsystem**: child-process When the stdout of a spawned process is piped into a writable stream that doesn't read fast enough, and the spawned process exits, node will put it into flowing mode and data will be lost. I have the following minimal testcase that pipes a simple child process to a PassThrough stream, and only after the process has exited the through stream is piped to node's stdout. * **Expected output**: All numbers from 1 to 36000 * **Actual output**: On my machine it stops at around 32000 ```javascript const spawn = require('child_process').spawn; const stream = require('stream'); const through = new stream.PassThrough(); const p = spawn('seq', [ '36000' ]); p.on('exit', function(code) { setImmediate(function() { through.pipe(process.stdout); }); }); p.stdout.pipe(through); ```
process
child process data loss with piped stdout version platform linux rachmaninoff arch smp preempt sat may cest gnu linux subsystem child process when the stdout of a spawned process is piped into a writable stream that doesn t read fast enough and the spawned process exits node will put it into flowing mode and data will be lost i have the following minimal testcase that pipes a simple child process to a passthrough stream and only after the process has exited the through stream is piped to node s stdout expected output all numbers from to actual output on my machine it stops at around javascript const spawn require child process spawn const stream require stream const through new stream passthrough const p spawn seq p on exit function code setimmediate function through pipe process stdout p stdout pipe through
1
18,580
24,562,626,440
IssuesEvent
2022-10-12 21:58:13
NEARWEEK/NEWS
https://api.github.com/repos/NEARWEEK/NEWS
closed
Launch new Github process
Process
## 🎉 Subtasks - [x] Onboard everyone to Github, make sure everyone has 2FA enabled - [x] Make sure all OKRs are reflected as milestones with sub-issues, labels & assignees - [x] Use Github issues & project board as the go-to-place for meetings & asynch coordination ## 🤼‍♂️ Reviewer @P3ter-NEARWEEK ## 🔗 Work doc(s) / inspirational links https://docs.google.com/spreadsheets/d/1W-3S00oOgVr3yj_scxtwJHr5OAAeLako1A4zXqG3pN8/edit#gid=111862931
1.0
Launch new Github process - ## 🎉 Subtasks - [x] Onboard everyone to Github, make sure everyone has 2FA enabled - [x] Make sure all OKRs are reflected as milestones with sub-issues, labels & assignees - [x] Use Github issues & project board as the go-to-place for meetings & asynch coordination ## 🤼‍♂️ Reviewer @P3ter-NEARWEEK ## 🔗 Work doc(s) / inspirational links https://docs.google.com/spreadsheets/d/1W-3S00oOgVr3yj_scxtwJHr5OAAeLako1A4zXqG3pN8/edit#gid=111862931
process
launch new github process 🎉 subtasks onboard everyone to github make sure everyone has enabled make sure all okrs are reflected as milestones with sub issues labels assignees use github issues project board as the go to place for meetings asynch coordination 🤼‍♂️ reviewer nearweek 🔗 work doc s inspirational links
1
9,370
12,374,180,682
IssuesEvent
2020-05-19 00:45:49
allinurl/goaccess
https://api.github.com/repos/allinurl/goaccess
closed
Max t.s. seems not calculated correctly
log-processing
Just noticed during some observation that there's a mismatch between max T.S. in a header row and ones presented in separate lines: ![wrong_ts](https://user-images.githubusercontent.com/1748792/78793488-c5545780-79b2-11ea-8a86-4e276d94fa58.jpg) There is 1.24m in head and subsequent lines hold much higher values i.e. 7.82m. I have verified the access log manually and there should be 72.874sec. May I ask you to point me to the section in code where this is calculated and I may take a look? Or if there is any other hint on this topic? Thank you in advance and keep up the good work!
1.0
Max t.s. seems not calculated correctly - Just noticed during some observation that there's a mismatch between max T.S. in a header row and ones presented in separate lines: ![wrong_ts](https://user-images.githubusercontent.com/1748792/78793488-c5545780-79b2-11ea-8a86-4e276d94fa58.jpg) There is 1.24m in head and subsequent lines hold much higher values i.e. 7.82m. I have verified the access log manually and there should be 72.874sec. May I ask you to point me to the section in code where this is calculated and I may take a look? Or if there is any other hint on this topic? Thank you in advance and keep up the good work!
process
max t s seems not calculated correctly just noticed during some observation that there s a mismatch between max t s in a header row and ones presented in separate lines there is in head and subsequent lines hold much higher values i e i have verified the access log manually and there should be may i ask you to point me to the section in code where this is calculated and i may take a look or if there is any other hint on this topic thank you in advance and keep up the good work
1
9,097
12,168,617,985
IssuesEvent
2020-04-27 12:58:50
Ghost-chu/QuickShop-Reremake
https://api.github.com/repos/Ghost-chu/QuickShop-Reremake
closed
[BUG] Most players cannot create shops
Bug In Process Priority:Major
**Describe the bug** The new version of quickshop seems to have a problem detecting Residence permissions: sometimes a player who holds the required residence permission cannot create a shop and gets the message: No permission: third-party plugin [{0}]. When I set integration.residence.create to true, shop creation still reports no permission; the floating display item above the chest does appear, one more appearing with every creation attempt and stacking up, yet the shop is still not created and nothing happens when other players click it. **Screenshots** ![2020-04-25_09 58 30](https://user-images.githubusercontent.com/59788198/80268763-7c9fdc80-86dc-11ea-85e4-bfcd86ffabe4.png) ![2020-04-25_10 03 25](https://user-images.githubusercontent.com/59788198/80268765-7d387300-86dc-11ea-9aa3-847fe399a549.png)
1.0
[BUG] Most players cannot create shops - **Describe the bug** The new version of quickshop seems to have a problem detecting Residence permissions: sometimes a player who holds the required residence permission cannot create a shop and gets the message: No permission: third-party plugin [{0}]. When I set integration.residence.create to true, shop creation still reports no permission; the floating display item above the chest does appear, one more appearing with every creation attempt and stacking up, yet the shop is still not created and nothing happens when other players click it. **Screenshots** ![2020-04-25_09 58 30](https://user-images.githubusercontent.com/59788198/80268763-7c9fdc80-86dc-11ea-85e4-bfcd86ffabe4.png) ![2020-04-25_10 03 25](https://user-images.githubusercontent.com/59788198/80268765-7d387300-86dc-11ea-9aa3-847fe399a549.png)
process
most players cannot create shops describe the bug the new version of quickshop seems to have a problem detecting residence permissions sometimes a player who holds the required residence permission cannot create a shop and gets the message no permission third party plugin when i set integration residence create to true shop creation still reports no permission the floating display item above the chest does appear one more appearing with every creation attempt and stacking up yet the shop is still not created and nothing happens when other players click it screenshots
1
480,762
13,866,433,989
IssuesEvent
2020-10-16 06:47:58
webcompat/web-bugs
https://api.github.com/repos/webcompat/web-bugs
closed
play.google.com - Unable to install Hangouts app from web page
browser-firefox-tablet engine-gecko ml-needsdiagnosis-false priority-critical severity-critical
<!-- @browser: Firefox Mobile (Tablet) 65.0 --> <!-- @ua_header: QwantMobile/3.0 (Android 9; Tablet; rv:67.0) Gecko/67.0 Firefox/65.0 QwantBrowser/67.0.4 --> <!-- @reported_with: mobile-reporter --> <!-- @public_url: https://github.com/webcompat/web-bugs/issues/59639 --> **URL**: https://play.google.com/store/apps/details?id=com.google.android.talk&referrer=https%3A%2F%2Fhangouts.google.com%2F%3Faction%3Dchat%26pn%3D06%2B66%2B65%2B86%2B4%26hl%3Dfr **Browser / Version**: Firefox Mobile (Tablet) 65.0 **Operating System**: Android **Tested Another Browser**: Yes Chrome **Problem type**: Video or audio doesn't play **Description**: The video or audio does not play **Steps to Reproduce**: <details> <summary>View the screenshot</summary> <img alt="Screenshot" src="https://webcompat.com/uploads/2020/10/3a2630e7-d055-4ff1-80de-aa4dd4adbcbf.jpeg"> </details> <details> <summary>Browser Configuration</summary> <ul> <li>gfx.webrender.all: false</li><li>gfx.webrender.blob-images: true</li><li>gfx.webrender.enabled: false</li><li>image.mem.shared: true</li><li>buildID: 20190619220335</li><li>channel: default</li><li>hasTouchScreen: true</li><li>mixed active content blocked: false</li><li>mixed passive content blocked: false</li><li>tracking content blocked: false</li> </ul> </details> [View console log messages](https://webcompat.com/console_logs/2020/10/20d8adf6-7f78-4bc6-8b59-aebfbb02107f) _From [webcompat.com](https://webcompat.com/) with ❤️_
1.0
play.google.com - Unable to install Hangouts app from web page - <!-- @browser: Firefox Mobile (Tablet) 65.0 --> <!-- @ua_header: QwantMobile/3.0 (Android 9; Tablet; rv:67.0) Gecko/67.0 Firefox/65.0 QwantBrowser/67.0.4 --> <!-- @reported_with: mobile-reporter --> <!-- @public_url: https://github.com/webcompat/web-bugs/issues/59639 --> **URL**: https://play.google.com/store/apps/details?id=com.google.android.talk&referrer=https%3A%2F%2Fhangouts.google.com%2F%3Faction%3Dchat%26pn%3D06%2B66%2B65%2B86%2B4%26hl%3Dfr **Browser / Version**: Firefox Mobile (Tablet) 65.0 **Operating System**: Android **Tested Another Browser**: Yes Chrome **Problem type**: Video or audio doesn't play **Description**: The video or audio does not play **Steps to Reproduce**: <details> <summary>View the screenshot</summary> <img alt="Screenshot" src="https://webcompat.com/uploads/2020/10/3a2630e7-d055-4ff1-80de-aa4dd4adbcbf.jpeg"> </details> <details> <summary>Browser Configuration</summary> <ul> <li>gfx.webrender.all: false</li><li>gfx.webrender.blob-images: true</li><li>gfx.webrender.enabled: false</li><li>image.mem.shared: true</li><li>buildID: 20190619220335</li><li>channel: default</li><li>hasTouchScreen: true</li><li>mixed active content blocked: false</li><li>mixed passive content blocked: false</li><li>tracking content blocked: false</li> </ul> </details> [View console log messages](https://webcompat.com/console_logs/2020/10/20d8adf6-7f78-4bc6-8b59-aebfbb02107f) _From [webcompat.com](https://webcompat.com/) with ❤️_
non_process
play google com unable to install hangouts app from web page url browser version firefox mobile tablet operating system android tested another browser yes chrome problem type video or audio doesn t play description the video or audio does not play steps to reproduce view the screenshot img alt screenshot src browser configuration gfx webrender all false gfx webrender blob images true gfx webrender enabled false image mem shared true buildid channel default hastouchscreen true mixed active content blocked false mixed passive content blocked false tracking content blocked false from with ❤️
0
176,706
21,435,761,374
IssuesEvent
2022-04-24 01:04:39
rsoreq/grafana
https://api.github.com/repos/rsoreq/grafana
closed
CVE-2021-33587 (High) detected in css-what-2.1.3.tgz - autoclosed
security vulnerability
## CVE-2021-33587 - High Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>css-what-2.1.3.tgz</b></p></summary> <p>a CSS selector parser</p> <p>Library home page: <a href="https://registry.npmjs.org/css-what/-/css-what-2.1.3.tgz">https://registry.npmjs.org/css-what/-/css-what-2.1.3.tgz</a></p> <p>Path to dependency file: /yarn.lock</p> <p>Path to vulnerable library: /node_modules/css-what/package.json</p> <p> Dependency Hierarchy: - html-webpack-plugin-3.2.0.tgz (Root Library) - pretty-error-2.1.1.tgz - renderkid-2.0.3.tgz - css-select-1.2.0.tgz - :x: **css-what-2.1.3.tgz** (Vulnerable Library) <p>Found in base branch: <b>master</b></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary> <p> The css-what package 4.0.0 through 5.0.0 for Node.js does not ensure that attribute parsing has Linear Time Complexity relative to the size of the input. <p>Publish Date: 2021-05-28 <p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2021-33587>CVE-2021-33587</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>7.5</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: Low - Privileges Required: None - User Interaction: None - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: None - Integrity Impact: None - Availability Impact: High </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. 
</p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2021-33587">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2021-33587</a></p> <p>Release Date: 2021-05-28</p> <p>Fix Resolution (css-what): 5.0.1</p> <p>Direct dependency fix Resolution (html-webpack-plugin): 4.0.0-alpha</p> </p> </details> <p></p> *** :rescue_worker_helmet: Automatic Remediation is available for this issue <!-- <REMEDIATE>{"isOpenPROnVulnerability":true,"isPackageBased":true,"isDefaultBranch":true,"packages":[{"packageType":"javascript/Node.js","packageName":"html-webpack-plugin","packageVersion":"3.2.0","packageFilePaths":["/yarn.lock"],"isTransitiveDependency":false,"dependencyTree":"html-webpack-plugin:3.2.0","isMinimumFixVersionAvailable":true,"minimumFixVersion":"4.0.0-alpha","isBinary":false}],"baseBranches":["master"],"vulnerabilityIdentifier":"CVE-2021-33587","vulnerabilityDetails":"The css-what package 4.0.0 through 5.0.0 for Node.js does not ensure that attribute parsing has Linear Time Complexity relative to the size of the input.","vulnerabilityUrl":"https://vuln.whitesourcesoftware.com/vulnerability/CVE-2021-33587","cvss3Severity":"high","cvss3Score":"7.5","cvss3Metrics":{"A":"High","AC":"Low","PR":"None","S":"Unchanged","C":"None","UI":"None","AV":"Network","I":"None"},"extraData":{}}</REMEDIATE> -->
True
CVE-2021-33587 (High) detected in css-what-2.1.3.tgz - autoclosed - ## CVE-2021-33587 - High Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>css-what-2.1.3.tgz</b></p></summary> <p>a CSS selector parser</p> <p>Library home page: <a href="https://registry.npmjs.org/css-what/-/css-what-2.1.3.tgz">https://registry.npmjs.org/css-what/-/css-what-2.1.3.tgz</a></p> <p>Path to dependency file: /yarn.lock</p> <p>Path to vulnerable library: /node_modules/css-what/package.json</p> <p> Dependency Hierarchy: - html-webpack-plugin-3.2.0.tgz (Root Library) - pretty-error-2.1.1.tgz - renderkid-2.0.3.tgz - css-select-1.2.0.tgz - :x: **css-what-2.1.3.tgz** (Vulnerable Library) <p>Found in base branch: <b>master</b></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary> <p> The css-what package 4.0.0 through 5.0.0 for Node.js does not ensure that attribute parsing has Linear Time Complexity relative to the size of the input. <p>Publish Date: 2021-05-28 <p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2021-33587>CVE-2021-33587</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>7.5</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: Low - Privileges Required: None - User Interaction: None - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: None - Integrity Impact: None - Availability Impact: High </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. 
</p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2021-33587">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2021-33587</a></p> <p>Release Date: 2021-05-28</p> <p>Fix Resolution (css-what): 5.0.1</p> <p>Direct dependency fix Resolution (html-webpack-plugin): 4.0.0-alpha</p> </p> </details> <p></p> *** :rescue_worker_helmet: Automatic Remediation is available for this issue <!-- <REMEDIATE>{"isOpenPROnVulnerability":true,"isPackageBased":true,"isDefaultBranch":true,"packages":[{"packageType":"javascript/Node.js","packageName":"html-webpack-plugin","packageVersion":"3.2.0","packageFilePaths":["/yarn.lock"],"isTransitiveDependency":false,"dependencyTree":"html-webpack-plugin:3.2.0","isMinimumFixVersionAvailable":true,"minimumFixVersion":"4.0.0-alpha","isBinary":false}],"baseBranches":["master"],"vulnerabilityIdentifier":"CVE-2021-33587","vulnerabilityDetails":"The css-what package 4.0.0 through 5.0.0 for Node.js does not ensure that attribute parsing has Linear Time Complexity relative to the size of the input.","vulnerabilityUrl":"https://vuln.whitesourcesoftware.com/vulnerability/CVE-2021-33587","cvss3Severity":"high","cvss3Score":"7.5","cvss3Metrics":{"A":"High","AC":"Low","PR":"None","S":"Unchanged","C":"None","UI":"None","AV":"Network","I":"None"},"extraData":{}}</REMEDIATE> -->
non_process
cve high detected in css what tgz autoclosed cve high severity vulnerability vulnerable library css what tgz a css selector parser library home page a href path to dependency file yarn lock path to vulnerable library node modules css what package json dependency hierarchy html webpack plugin tgz root library pretty error tgz renderkid tgz css select tgz x css what tgz vulnerable library found in base branch master vulnerability details the css what package through for node js does not ensure that attribute parsing has linear time complexity relative to the size of the input publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction none scope unchanged impact metrics confidentiality impact none integrity impact none availability impact high for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution css what direct dependency fix resolution html webpack plugin alpha rescue worker helmet automatic remediation is available for this issue isopenpronvulnerability true ispackagebased true isdefaultbranch true packages istransitivedependency false dependencytree html webpack plugin isminimumfixversionavailable true minimumfixversion alpha isbinary false basebranches vulnerabilityidentifier cve vulnerabilitydetails the css what package through for node js does not ensure that attribute parsing has linear time complexity relative to the size of the input vulnerabilityurl
0
106,309
9,126,122,165
IssuesEvent
2019-02-24 19:08:58
svigerske/Bt
https://api.github.com/repos/svigerske/Bt
closed
make sure third party projects can be disabled
bug configuration tests minor
Issue created by migration from Trac. Original creator: andreasw Original creation time: 2009-07-07 03:23:21 Assignee: andreasw Version: 0.5 For some of the third party projects, it does not work to specify --without-blabla to make sure that they are not compiled even when their source code is there. * metis: --without-metis crashes configure script * lapack: --without-lapack still compiles LAPACK code if it is there, but it is not linked into executable * mumps: --without-mumpslib ? * ASL: --without-asldir still compiles ASL, but does not link it into the code (e.g., Ipopt)
1.0
make sure third party projects can be disabled - Issue created by migration from Trac. Original creator: andreasw Original creation time: 2009-07-07 03:23:21 Assignee: andreasw Version: 0.5 For some of the third party projects, it does not work to specify --without-blabla to make sure that they are not compiled even when their source code is there. * metis: --without-metis crashes configure script * lapack: --without-lapack still compiles LAPACK code if it is there, but it is not linked into executable * mumps: --without-mumpslib ? * ASL: --without-asldir still compiles ASL, but does not link it into the code (e.g., Ipopt)
non_process
make sure third party projects can be disabled issue created by migration from trac original creator andreasw original creation time assignee andreasw version for some of the third party projects it does not work to specify without blabla to make sure that they are not compiled even when their source code is there metis without metis crashes configure script lapack without lapack still compiles lapack code if it is there but it is not linked into executable mumps without mumpslib asl without asldir still compiles asl but does not link it into the code e g ipopt
0
2,656
5,433,986,597
IssuesEvent
2017-03-05 01:20:11
dotnet/corefx
https://api.github.com/repos/dotnet/corefx
closed
System.Diagnostics.Process.Performance.Tests failed to start on CentOS in CI -- Linux kernel bug
area-System.Diagnostics.Process bug os-linux tracking-external-issue
From http://dotnet-ci.cloudapp.net/job/dotnet_corefx/job/master/job/centos7.1_release_prtest/397/ ``` /mnt/resource/j/workspace/dotnet_corefx/master/centos7.1_release_prtest/Tools/Microsoft.CSharp.Core.targets(67,5): error MSB6006: "/mnt/resource/j/workspace/dotnet_corefx/master/centos7.1_release_prtest/Tools/dotnetcli/dotnet" exited with code 139. [/mnt/resource/j/workspace/dotnet_corefx/master/centos7.1_release_prtest/src/System.Diagnostics.Process/tests/Performance/System.Diagnostics.Process.Performance.Tests.csproj] ``` Oddly, there's nothing else in the log about this test. It doesn't seem to have even started.
1.0
System.Diagnostics.Process.Performance.Tests failed to start on CentOS in CI -- Linux kernel bug - From http://dotnet-ci.cloudapp.net/job/dotnet_corefx/job/master/job/centos7.1_release_prtest/397/ ``` /mnt/resource/j/workspace/dotnet_corefx/master/centos7.1_release_prtest/Tools/Microsoft.CSharp.Core.targets(67,5): error MSB6006: "/mnt/resource/j/workspace/dotnet_corefx/master/centos7.1_release_prtest/Tools/dotnetcli/dotnet" exited with code 139. [/mnt/resource/j/workspace/dotnet_corefx/master/centos7.1_release_prtest/src/System.Diagnostics.Process/tests/Performance/System.Diagnostics.Process.Performance.Tests.csproj] ``` Oddly, there's nothing else in the log about this test. It doesn't seem to have even started.
process
system diagnostics process performance tests failed to start on centos in ci linux kernel bug from mnt resource j workspace dotnet corefx master release prtest tools microsoft csharp core targets error mnt resource j workspace dotnet corefx master release prtest tools dotnetcli dotnet exited with code oddly there s nothing else in the log about this test it doesn t seem to have even started
1
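Each record in this dump repeats the same column layout: row index, event id, event type, timestamp, repo, repo URL, action, title, labels, body, a flag, `text_combine`, `label`, a cleaned `text`, and `binary_label`. From the rows shown so far, `binary_label` appears to be a direct encoding of `label` (`process` → 1, `non_process` → 0). A minimal sketch of that mapping, assuming pandas and using toy rows mirroring the values visible above:

```python
import pandas as pd

# Toy rows echoing label values seen in the records above.
df = pd.DataFrame({
    "title": [
        "System.Diagnostics.Process.Performance.Tests failed to start",
        "make sure third party projects can be disabled",
    ],
    "label": ["process", "non_process"],
})

# binary_label appears to be a direct 0/1 encoding of the label column.
df["binary_label"] = (df["label"] == "process").astype(int)

print(df[["label", "binary_label"]].to_dict("records"))
```

This assumes the encoding holds across the whole dataset; every record in this excerpt is consistent with it.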
14,684
17,798,392,479
IssuesEvent
2021-09-01 02:58:06
jim-king-2000/IndustryCamera
https://api.github.com/repos/jim-king-2000/IndustryCamera
closed
[bug]: web端的“客户端下载”的背景图未设置缓存
bug processing C
### 问题描述 <!-- 在这里描述您的问题 --> ### 您预期的行为 <!-- 系统应该表现出什么行为 --> ### 系统表现的行为 <!-- 系统实际表现出什么行为 --> ### 复现路径 <!-- 如何重现bug --> ### 辅助信息 - 浏览器版本:Edge/Chrome 92 - 固件版本:v1.0
1.0
[bug]: web端的“客户端下载”的背景图未设置缓存 - ### 问题描述 <!-- 在这里描述您的问题 --> ### 您预期的行为 <!-- 系统应该表现出什么行为 --> ### 系统表现的行为 <!-- 系统实际表现出什么行为 --> ### 复现路径 <!-- 如何重现bug --> ### 辅助信息 - 浏览器版本:Edge/Chrome 92 - 固件版本:v1.0
process
web端的“客户端下载”的背景图未设置缓存 问题描述 您预期的行为 系统表现的行为 复现路径 辅助信息 浏览器版本:edge chrome 固件版本:
1
20,027
26,510,252,810
IssuesEvent
2023-01-18 16:34:29
geneontology/go-ontology
https://api.github.com/repos/geneontology/go-ontology
closed
NTR: [proposed new term label] negative regulation of metaphase/anaphase transition of meiosis II
New term request cell cycle and DNA processes ready
Please provide as much information as you can: * **Suggested term label:** negative regulation of metaphase/anaphase transition of meiosis II * **Definition (free text)** standard of parents : regulation of metaphase/anaphase transition of meiosis II (GO:1905189) : negative regulation of metaphase/anaphase transition of meiotic cell cycle (GO:1902103) * **Reference, in format PMID:#######** https://www.pombase.org/gene/SPAC5D6.08c (will replace all existing annotations with a single specific term.) note to self, also connect MF-BP * **Gene product name and ID to be annotated to this term** for mes1
1.0
NTR: [proposed new term label] negative regulation of metaphase/anaphase transition of meiosis II - Please provide as much information as you can: * **Suggested term label:** negative regulation of metaphase/anaphase transition of meiosis II * **Definition (free text)** standard of parents : regulation of metaphase/anaphase transition of meiosis II (GO:1905189) : negative regulation of metaphase/anaphase transition of meiotic cell cycle (GO:1902103) * **Reference, in format PMID:#######** https://www.pombase.org/gene/SPAC5D6.08c (will replace all existing annotations with a single specific term.) note to self, also connect MF-BP * **Gene product name and ID to be annotated to this term** for mes1
process
ntr negative regulation of metaphase anaphase transition of meiosis ii please provide as much information as you can suggested term label negative regulation of metaphase anaphase transition of meiosis ii definition free text standard of parents regulation of metaphase anaphase transition of meiosis ii go negative regulation of metaphase anaphase transition of meiotic cell cycle go reference in format pmid will replace all existing annotations with a single specific term note to self also connect mf bp gene product name and id to be annotated to this term for
1
23,264
3,784,952,808
IssuesEvent
2016-03-20 05:50:12
recoilphp/recoil
https://api.github.com/repos/recoilphp/recoil
closed
Revisit behaviour of awaiting a strand.
defect status: in progress
As it stands, an assertion fails when attempting to await an already-exited strand. This makes it possible to use `yield $strand` as a "thread join" type operation, which is its main purpose. Unlike, a regular thread join, awaiting a strand forwards the return value / exception to the waiting strand, essentially making the calling strand its parent. There are several problems here: - You can't get the return value / exception if the strand has already exited - Multiple strands can not wait for the same strand This operation is a little different to `any()` and its kin, because the calling strand does not necessarily "own" the strand its waiting for. The following changers need to be made: - `yield $strand` should not receive the return value / exception at all. There's not much point to this anyway when you can simply call whatever coroutine the strand is executing directly. - Allow multiple strands to await the same strand. This is a little more difficult now that there can only be a single observer. We may need to reintroduce multiple observers, but keep a `primary` observer, or otherwise add direct support for 'joining' to `Strand`.
1.0
Revisit behaviour of awaiting a strand. - As it stands, an assertion fails when attempting to await an already-exited strand. This makes it possible to use `yield $strand` as a "thread join" type operation, which is its main purpose. Unlike, a regular thread join, awaiting a strand forwards the return value / exception to the waiting strand, essentially making the calling strand its parent. There are several problems here: - You can't get the return value / exception if the strand has already exited - Multiple strands can not wait for the same strand This operation is a little different to `any()` and its kin, because the calling strand does not necessarily "own" the strand its waiting for. The following changers need to be made: - `yield $strand` should not receive the return value / exception at all. There's not much point to this anyway when you can simply call whatever coroutine the strand is executing directly. - Allow multiple strands to await the same strand. This is a little more difficult now that there can only be a single observer. We may need to reintroduce multiple observers, but keep a `primary` observer, or otherwise add direct support for 'joining' to `Strand`.
non_process
revisit behaviour of awaiting a strand as it stands an assertion fails when attempting to await an already exited strand this makes it possible to use yield strand as a thread join type operation which is its main purpose unlike a regular thread join awaiting a strand forwards the return value exception to the waiting strand essentially making the calling strand its parent there are several problems here you can t get the return value exception if the strand has already exited multiple strands can not wait for the same strand this operation is a little different to any and its kin because the calling strand does not necessarily own the strand its waiting for the following changers need to be made yield strand should not receive the return value exception at all there s not much point to this anyway when you can simply call whatever coroutine the strand is executing directly allow multiple strands to await the same strand this is a little more difficult now that there can only be a single observer we may need to reintroduce multiple observers but keep a primary observer or otherwise add direct support for joining to strand
0
52,211
13,728,285,598
IssuesEvent
2020-10-04 10:54:47
tanmayc07/vue-calendar
https://api.github.com/repos/tanmayc07/vue-calendar
closed
CVE-2018-20190 (Medium) detected in opennmsopennms-source-25.1.0-1
security vulnerability
## CVE-2018-20190 - Medium Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>opennmsopennms-source-25.1.0-1</b></p></summary> <p> <p>A Java based fault and performance management system</p> <p>Library home page: <a href=https://sourceforge.net/projects/opennms/>https://sourceforge.net/projects/opennms/</a></p> <p>Found in HEAD commit: <a href="https://github.com/tanmayc07/vue-calendar/commit/09113fc2bcae4709407a251979464c2c24355395">09113fc2bcae4709407a251979464c2c24355395</a></p> <p>Found in base branch: <b>master</b></p></p> </details> </p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Source Files (1)</summary> <p></p> <p> <img src='https://s3.amazonaws.com/wss-public/bitbucketImages/xRedImage.png' width=19 height=20> <b>vue-calendar/node_modules/node-sass/src/libsass/src/parser.cpp</b> </p> </details> <p></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary> <p> In LibSass 3.5.5, a NULL Pointer Dereference in the function Sass::Eval::operator()(Sass::Supports_Operator*) in eval.cpp may cause a Denial of Service (application crash) via a crafted sass input file. 
<p>Publish Date: 2018-12-17 <p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2018-20190>CVE-2018-20190</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>6.5</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: Low - Privileges Required: None - User Interaction: Required - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: None - Integrity Impact: None - Availability Impact: High </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2018-20190">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2018-20190</a></p> <p>Release Date: 2018-12-17</p> <p>Fix Resolution: LibSass - 3.6.0</p> </p> </details> <p></p> *** Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
True
CVE-2018-20190 (Medium) detected in opennmsopennms-source-25.1.0-1 - ## CVE-2018-20190 - Medium Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>opennmsopennms-source-25.1.0-1</b></p></summary> <p> <p>A Java based fault and performance management system</p> <p>Library home page: <a href=https://sourceforge.net/projects/opennms/>https://sourceforge.net/projects/opennms/</a></p> <p>Found in HEAD commit: <a href="https://github.com/tanmayc07/vue-calendar/commit/09113fc2bcae4709407a251979464c2c24355395">09113fc2bcae4709407a251979464c2c24355395</a></p> <p>Found in base branch: <b>master</b></p></p> </details> </p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Source Files (1)</summary> <p></p> <p> <img src='https://s3.amazonaws.com/wss-public/bitbucketImages/xRedImage.png' width=19 height=20> <b>vue-calendar/node_modules/node-sass/src/libsass/src/parser.cpp</b> </p> </details> <p></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary> <p> In LibSass 3.5.5, a NULL Pointer Dereference in the function Sass::Eval::operator()(Sass::Supports_Operator*) in eval.cpp may cause a Denial of Service (application crash) via a crafted sass input file. 
<p>Publish Date: 2018-12-17 <p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2018-20190>CVE-2018-20190</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>6.5</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: Low - Privileges Required: None - User Interaction: Required - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: None - Integrity Impact: None - Availability Impact: High </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2018-20190">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2018-20190</a></p> <p>Release Date: 2018-12-17</p> <p>Fix Resolution: LibSass - 3.6.0</p> </p> </details> <p></p> *** Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
non_process
cve medium detected in opennmsopennms source cve medium severity vulnerability vulnerable library opennmsopennms source a java based fault and performance management system library home page a href found in head commit a href found in base branch master vulnerable source files vue calendar node modules node sass src libsass src parser cpp vulnerability details in libsass a null pointer dereference in the function sass eval operator sass supports operator in eval cpp may cause a denial of service application crash via a crafted sass input file publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction required scope unchanged impact metrics confidentiality impact none integrity impact none availability impact high for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution libsass step up your open source security game with whitesource
0
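Comparing `text_combine` with the cleaned `text` column in the record above ("CVE-2018-20190 (Medium) detected in opennmsopennms-source-25.1.0-1" becomes "cve medium detected in opennmsopennms source") suggests the cleaning lowercases, strips digits and punctuation, and collapses whitespace. The exact pipeline is not shown in this dump, so the following is only a rough approximation; note it would not reproduce rows that retain non-Latin text, such as the Chinese record above.

```python
import re

def normalize(text: str) -> str:
    """Approximate the dataset's cleaned `text` column:
    lowercase, drop digits and punctuation, collapse whitespace.
    The actual cleaning pipeline is an assumption here."""
    text = text.lower()
    text = re.sub(r"[0-9]+", " ", text)    # digits are absent from the cleaned column
    text = re.sub(r"[^a-z\s]", " ", text)  # drop punctuation and symbols
    return re.sub(r"\s+", " ", text).strip()

print(normalize("CVE-2018-20190 (Medium) detected in opennmsopennms-source-25.1.0-1"))
```

On the sample title this reproduces the dataset's own cleaned value exactly.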
20,343
26,999,590,911
IssuesEvent
2023-02-10 06:13:36
bobocode-blyznytsia/bring-framework
https://api.github.com/repos/bobocode-blyznytsia/bring-framework
closed
Implement AutowiredBeanPostProcessor
bean-post-processor
### Description The `BeanPostProcessor` abstraction is responsible for the construction and initialisation logic of `Bean`. Basic implementation includes 2 implementations: `RawBeanPostProcessor` and `AutowiredBeanPostProcessor`. ### Solution In context of this story the `AutowiredBeanPostProcessor` should be implemented. `AutowiredBeanPostProcessor` injects the dependencies, which bean requires. We suppose that fields of the bean are not final. The dependency fields will be `null` after the `RawBeanPostProcessor`. Please, check the details in the diagram _(provided in Notes section)_ ### DoD - [ ] `AutowiredBeanPostProcessor` injects the dependencies of the Bean - [ ] Unit tests are green - [ ] JavaDoc is provided ### Resources - Please see the drawing with related interface: [bring-framework.drawio](https://viewer.diagrams.net/?tags=%7B%7D&highlight=0000ff&edit=_blank&layers=1&nav=1#G1DO0TqjCtae64B741QGthiLZ3S2vjKLSy) - _If you want to correct the diagram, use the link:_ https://app.diagrams.net/#G1DO0TqjCtae64B741QGthiLZ3S2vjKLSy
1.0
Implement AutowiredBeanPostProcessor - ### Description The `BeanPostProcessor` abstraction is responsible for the construction and initialisation logic of `Bean`. Basic implementation includes 2 implementations: `RawBeanPostProcessor` and `AutowiredBeanPostProcessor`. ### Solution In context of this story the `AutowiredBeanPostProcessor` should be implemented. `AutowiredBeanPostProcessor` injects the dependencies, which bean requires. We suppose that fields of the bean are not final. The dependency fields will be `null` after the `RawBeanPostProcessor`. Please, check the details in the diagram _(provided in Notes section)_ ### DoD - [ ] `AutowiredBeanPostProcessor` injects the dependencies of the Bean - [ ] Unit tests are green - [ ] JavaDoc is provided ### Resources - Please see the drawing with related interface: [bring-framework.drawio](https://viewer.diagrams.net/?tags=%7B%7D&highlight=0000ff&edit=_blank&layers=1&nav=1#G1DO0TqjCtae64B741QGthiLZ3S2vjKLSy) - _If you want to correct the diagram, use the link:_ https://app.diagrams.net/#G1DO0TqjCtae64B741QGthiLZ3S2vjKLSy
process
implement autowiredbeanpostprocessor description the beanpostprocessor abstraction is responsible for the construction and initialisation logic of bean basic implementation includes implementations rawbeanpostprocessor and autowiredbeanpostprocessor solution in context of this story the autowiredbeanpostprocessor should be implemented autowiredbeanpostprocessor injects the dependencies which bean requires we suppose that fields of the bean are not final the dependency fields will be null after the rawbeanpostprocessor please check the details in the diagram provided in notes section dod autowiredbeanpostprocessor injects the dependencies of the bean unit tests are green javadoc is provided resources please see the drawing with related interface if you want to correct the diagram use the link
1
352,501
25,070,286,964
IssuesEvent
2022-11-07 11:34:34
PyFstat/PyFstat
https://api.github.com/repos/PyFstat/PyFstat
closed
Type annotations seem to be a bit incompatible with the return field
documentation
See sphinx docs for `logging` and `injection_parameters`.
1.0
Type annotations seem to be a bit incompatible with the return field - See sphinx docs for `logging` and `injection_parameters`.
non_process
type annotations seem to be a bit incompatible with the return field see sphinx docs for logging and injection parameters
0
33,741
16,095,353,239
IssuesEvent
2021-04-26 22:26:13
tensorflow/tensorflow
https://api.github.com/repos/tensorflow/tensorflow
closed
why my 'workers=8,use_multiprocessing=True' do not work while training?
comp:dist-strat type:performance
**Here is my code:** train_model_input=generate_arrays_from_dataframe( data.sample(frac=1) ) history = model.fit(train_model_input, epochs=1, verbose=1, validation_split=0.0,steps_per_epoch= math.ceil( train_row_len/256), workers=8,use_multiprocessing=True) I find it only 100% CPU-Util not the expect 800%. ![图片](https://user-images.githubusercontent.com/49393828/110285880-e4be8f00-801e-11eb-8e83-b26b4297729e.png) And my GPU-Util is 0% although GPU memory is fully used ![图片](https://user-images.githubusercontent.com/49393828/110285828-ce183800-801e-11eb-9605-1e4289ae39b1.png) How could I Increase the utilization of my CPU/GPU? thank you!
True
why my 'workers=8,use_multiprocessing=True' do not work while training? - **Here is my code:** train_model_input=generate_arrays_from_dataframe( data.sample(frac=1) ) history = model.fit(train_model_input, epochs=1, verbose=1, validation_split=0.0,steps_per_epoch= math.ceil( train_row_len/256), workers=8,use_multiprocessing=True) I find it only 100% CPU-Util not the expect 800%. ![图片](https://user-images.githubusercontent.com/49393828/110285880-e4be8f00-801e-11eb-8e83-b26b4297729e.png) And my GPU-Util is 0% although GPU memory is fully used ![图片](https://user-images.githubusercontent.com/49393828/110285828-ce183800-801e-11eb-9605-1e4289ae39b1.png) How could I Increase the utilization of my CPU/GPU? thank you!
non_process
why my workers use multiprocessing true do not work while training here is my code: train model input generate arrays from dataframe data sample frac history model fit train model input epochs verbose validation split steps per epoch math ceil train row len workers use multiprocessing true i find it only cpu util not the expect and my gpu util is although gpu memory is fully used how could i increase the utilization of my cpu gpu thank you
0
129,615
5,099,871,543
IssuesEvent
2017-01-04 10:00:21
hpi-swt2/workshop-portal
https://api.github.com/repos/hpi-swt2/workshop-portal
closed
US_2.1: Change my application
Medium Priority team-hendrik
**As** pupil **I want to** be able to process my application up to the deadline. After this, no change is possible. **in order to** change my application anytime **dependency:** #18 **estimate:** 3 **acceptance criteria:** - [x] edit form displays the saved data - [x] changes are saved in the database - [x] error-handling if mandatory fields are empty (like in #20 and #40) - [x] changes after deadline are not possible
1.0
US_2.1: Change my application - **As** pupil **I want to** be able to process my application up to the deadline. After this, no change is possible. **in order to** change my application anytime **dependency:** #18 **estimate:** 3 **acceptance criteria:** - [x] edit form displays the saved data - [x] changes are saved in the database - [x] error-handling if mandatory fields are empty (like in #20 and #40) - [x] changes after deadline are not possible
non_process
us change my application as pupil i want to be able to process my application up to the deadline after this no change is possible in order to change my application anytime dependency estimate acceptance criteria edit form displays the saved data changes are saved in the database error handling if mandatory fields are empty like in and changes after deadline are not possible
0
10,178
13,044,162,791
IssuesEvent
2020-07-29 03:47:36
tikv/tikv
https://api.github.com/repos/tikv/tikv
closed
UCP: Migrate scalar function `TruncateDecimal` from TiDB
challenge-program-2 component/coprocessor difficulty/easy sig/coprocessor
## Description Port the scalar function `TruncateDecimal` from TiDB to coprocessor. ## Score * 50 ## Mentor(s) * @sticnarf ## Recommended Skills * Rust programming ## Learning Materials Already implemented expressions ported from TiDB - https://github.com/tikv/tikv/tree/master/components/tidb_query/src/rpn_expr) - https://github.com/tikv/tikv/tree/master/components/tidb_query/src/expr)
2.0
UCP: Migrate scalar function `TruncateDecimal` from TiDB - ## Description Port the scalar function `TruncateDecimal` from TiDB to coprocessor. ## Score * 50 ## Mentor(s) * @sticnarf ## Recommended Skills * Rust programming ## Learning Materials Already implemented expressions ported from TiDB - https://github.com/tikv/tikv/tree/master/components/tidb_query/src/rpn_expr) - https://github.com/tikv/tikv/tree/master/components/tidb_query/src/expr)
process
ucp migrate scalar function truncatedecimal from tidb description port the scalar function truncatedecimal from tidb to coprocessor score mentor s sticnarf recommended skills rust programming learning materials already implemented expressions ported from tidb
1
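The `labels` column stores all of an issue's labels as a single space-separated string, as in the record above. Splitting on whitespace recovers single-token labels, but note that multi-word labels elsewhere in this dump (e.g. "New term request") are flattened, so the original label boundaries cannot always be reconstructed from this column alone. A sketch:

```python
# Labels field taken verbatim from the TiKV record above.
raw = "challenge-program-2 component/coprocessor difficulty/easy sig/coprocessor"

# Works for single-token labels; ambiguous for multi-word labels.
labels = raw.split()
print(labels)
```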
10,215
13,080,489,353
IssuesEvent
2020-08-01 07:37:59
qgis/QGIS
https://api.github.com/repos/qgis/QGIS
closed
[Processing] GDAL Assign Projection does not update QgsRasterLayer CRS
Bug Processing
**Describe the bug** Even if the projection is assign, the `QgsRasterLayer` and the `QgsRasterDataProvider` was not updated. In the case of a layer loaded in the project a fix has been provided https://github.com/qgis/QGIS/pull/37919 In the case of a layer provided by a model, the CRS is not updated even if the source has been updated. **How to Reproduce** First case 1. Load a raster layer without projection 2. Use the GDAL Assign Projection Algorithm to update the raster Layer 3. Go to layer properties 4. See that the CRS has not changed Second case 1. Load a raster layer layer with a USER CRS, for example +proj=merc +lon_0=0 +lat_ts=-46 +x_0=0 +y_0=0 +ellps=WGS84 +units=m +no_defs 2. Use a model to clip the raster by extent and assign the input layer project to the output layer [ifremer_clip_by_extent.model3](https://github.com/qgis/QGIS/files/4960459/ifremer_clip_by_extent.txt) 3. Go to OUTPUT layer properties 4. See that the CRS is not a USER one **QGIS and OS versions** QGIS 3.10 **Additional context** I have found a way to fix this issue if the layer is loaded in the project. I did not know how to update the QgsRasterLayer in a parent algorithm QgsProcessingContext. 
Herre is the model algorithm saved in Python: ```py def processAlgorithm(self, parameters, context, model_feedback): # Use a multi-step feedback, so that individual child algorithm progress reports are adjusted for the # overall progress through the model feedback = QgsProcessingMultiStepFeedback(2, model_feedback) results = {} outputs = {} # Clip raster by extent alg_params = { 'DATA_TYPE': 0, 'INPUT': parameters['input'], 'NODATA': None, 'OPTIONS': '', 'PROJWIN': parameters['extent'], 'OUTPUT': parameters['Output'] } outputs['ClipRasterByExtent'] = processing.run('gdal:cliprasterbyextent', alg_params, context=context, feedback=feedback, is_child_algorithm=True) results['Output'] = outputs['ClipRasterByExtent']['OUTPUT'] feedback.setCurrentStep(1) if feedback.isCanceled(): return {} # Assign projection alg_params = { 'CRS': parameters['input'], 'INPUT': outputs['ClipRasterByExtent']['OUTPUT'] } outputs['AssignProjection'] = processing.run('gdal:assignprojection', alg_params, context=context, feedback=feedback, is_child_algorithm=True) return results ``` The `outputs['ClipRasterByExtent']['OUTPUT']` is not updatable evenif it is the input of `gdal:assignprojection`.
1.0
[Processing] GDAL Assign Projection does not update QgsRasterLayer CRS - **Describe the bug** Even if the projection is assign, the `QgsRasterLayer` and the `QgsRasterDataProvider` was not updated. In the case of a layer loaded in the project a fix has been provided https://github.com/qgis/QGIS/pull/37919 In the case of a layer provided by a model, the CRS is not updated even if the source has been updated. **How to Reproduce** First case 1. Load a raster layer without projection 2. Use the GDAL Assign Projection Algorithm to update the raster Layer 3. Go to layer properties 4. See that the CRS has not changed Second case 1. Load a raster layer layer with a USER CRS, for example +proj=merc +lon_0=0 +lat_ts=-46 +x_0=0 +y_0=0 +ellps=WGS84 +units=m +no_defs 2. Use a model to clip the raster by extent and assign the input layer project to the output layer [ifremer_clip_by_extent.model3](https://github.com/qgis/QGIS/files/4960459/ifremer_clip_by_extent.txt) 3. Go to OUTPUT layer properties 4. See that the CRS is not a USER one **QGIS and OS versions** QGIS 3.10 **Additional context** I have found a way to fix this issue if the layer is loaded in the project. I did not know how to update the QgsRasterLayer in a parent algorithm QgsProcessingContext. 
Herre is the model algorithm saved in Python: ```py def processAlgorithm(self, parameters, context, model_feedback): # Use a multi-step feedback, so that individual child algorithm progress reports are adjusted for the # overall progress through the model feedback = QgsProcessingMultiStepFeedback(2, model_feedback) results = {} outputs = {} # Clip raster by extent alg_params = { 'DATA_TYPE': 0, 'INPUT': parameters['input'], 'NODATA': None, 'OPTIONS': '', 'PROJWIN': parameters['extent'], 'OUTPUT': parameters['Output'] } outputs['ClipRasterByExtent'] = processing.run('gdal:cliprasterbyextent', alg_params, context=context, feedback=feedback, is_child_algorithm=True) results['Output'] = outputs['ClipRasterByExtent']['OUTPUT'] feedback.setCurrentStep(1) if feedback.isCanceled(): return {} # Assign projection alg_params = { 'CRS': parameters['input'], 'INPUT': outputs['ClipRasterByExtent']['OUTPUT'] } outputs['AssignProjection'] = processing.run('gdal:assignprojection', alg_params, context=context, feedback=feedback, is_child_algorithm=True) return results ``` The `outputs['ClipRasterByExtent']['OUTPUT']` is not updatable evenif it is the input of `gdal:assignprojection`.
process
gdal assign projection does not update qgsrasterlayer crs describe the bug even if the projection is assign the qgsrasterlayer and the qgsrasterdataprovider was not updated in the case of a layer loaded in the project a fix has been provided in the case of a layer provided by a model the crs is not updated even if the source has been updated how to reproduce first case load a raster layer without projection use the gdal assign projection algorithm to update the raster layer go to layer properties see that the crs has not changed second case load a raster layer layer with a user crs for example proj merc lon lat ts x y ellps units m no defs use a model to clip the raster by extent and assign the input layer project to the output layer go to output layer properties see that the crs is not a user one qgis and os versions qgis additional context i have found a way to fix this issue if the layer is loaded in the project i did not know how to update the qgsrasterlayer in a parent algorithm qgsprocessingcontext herre is the model algorithm saved in python py def processalgorithm self parameters context model feedback use a multi step feedback so that individual child algorithm progress reports are adjusted for the overall progress through the model feedback qgsprocessingmultistepfeedback model feedback results outputs clip raster by extent alg params data type input parameters nodata none options projwin parameters output parameters outputs processing run gdal cliprasterbyextent alg params context context feedback feedback is child algorithm true results outputs feedback setcurrentstep if feedback iscanceled return assign projection alg params crs parameters input outputs outputs processing run gdal assignprojection alg params context context feedback feedback is child algorithm true return results the outputs is not updatable evenif it is the input of gdal assignprojection
1
2,354
5,164,510,503
IssuesEvent
2017-01-17 10:45:17
coala/teams
https://api.github.com/repos/coala/teams
closed
core Team Member Application: Lasse
process/approved
# Bio I'm Lasse. I'm the BLD of coala. I'm special. Also I'm western, white, male and in my 20ies so I fit the profile of an open source developer pretty well. # coala Contributions so far Let me see... I created it like thrice. The first version sucked so I had to make a second. It sucked as well so I made a third. It still sucks :/ # Road to the Future Helping @Makman2 and @Udayan12167 doing their stuff for the next gen bears and coala.io/cep5 of which I'm the author. I'll help reviewing, hopefully not too much organizing, and a tiny bit of coding where possible but no big core things in the near future.
1.0
core Team Member Application: Lasse - # Bio I'm Lasse. I'm the BLD of coala. I'm special. Also I'm western, white, male and in my 20ies so I fit the profile of an open source developer pretty well. # coala Contributions so far Let me see... I created it like thrice. The first version sucked so I had to make a second. It sucked as well so I made a third. It still sucks :/ # Road to the Future Helping @Makman2 and @Udayan12167 doing their stuff for the next gen bears and coala.io/cep5 of which I'm the author. I'll help reviewing, hopefully not too much organizing, and a tiny bit of coding where possible but no big core things in the near future.
process
core team member application lasse bio i m lasse i m the bld of coala i m special also i m western white male and in my so i fit the profile of an open source developer pretty well coala contributions so far let me see i created it like thrice the first version sucked so i had to make a second it sucked as well so i made a third it still sucks road to the future helping and doing their stuff for the next gen bears and coala io of which i m the author i ll help reviewing hopefully not too much organizing and a tiny bit of coding where possible but no big core things in the near future
1
13,785
16,543,114,552
IssuesEvent
2021-05-27 19:35:17
scieloorg/search-journals
https://api.github.com/repos/scieloorg/search-journals
closed
Renderizar HTML nos resumos
Processamento
Muitos resumos possuem códigos HTML que precisam ser renderizados. ![image](https://cloud.githubusercontent.com/assets/4059679/19572356/bdbc1886-96e0-11e6-8d3a-a3b8cdbd8609.png) Esse símbolo na verdade deveria ser <= (menor ou igual). http://homolog.search.scielo.org/?q=A+import%C3%A2ncia+da+an%C3%A1lise+de+especia%C3%A7%C3%A3o+do+chumbo+em+plasma+para+a+avalia%C3%A7%C3%A3o+dos+riscos+%C3%A0+sa%C3%BAde&lang=pt&page=1 Link do artigo original: http://www.scielo.br/scielo.php?script=sci_arttext&pid=S0100-40422004000200015&lang=pt
1.0
Renderizar HTML nos resumos - Muitos resumos possuem códigos HTML que precisam ser renderizados. ![image](https://cloud.githubusercontent.com/assets/4059679/19572356/bdbc1886-96e0-11e6-8d3a-a3b8cdbd8609.png) Esse símbolo na verdade deveria ser <= (menor ou igual). http://homolog.search.scielo.org/?q=A+import%C3%A2ncia+da+an%C3%A1lise+de+especia%C3%A7%C3%A3o+do+chumbo+em+plasma+para+a+avalia%C3%A7%C3%A3o+dos+riscos+%C3%A0+sa%C3%BAde&lang=pt&page=1 Link do artigo original: http://www.scielo.br/scielo.php?script=sci_arttext&pid=S0100-40422004000200015&lang=pt
process
renderizar html nos resumos muitos resumos possuem códigos html que precisam ser renderizados esse símbolo na verdade deveria ser menor ou igual link do artigo original
1
128,364
27,247,686,846
IssuesEvent
2023-02-22 04:24:27
wso2/ballerina-plugin-vscode
https://api.github.com/repos/wso2/ballerina-plugin-vscode
closed
[Record Editor] Failed to create separate record definitions from a give JSON
Type/Bug Priority/Highest Resolution/Fixed lowcode/component/record-editor
**Description:** $Subject ![record-editor-seperate-records](https://user-images.githubusercontent.com/26219651/216887008-8c116715-84a0-4f86-8d6d-cc260cecff32.gif) sample JSON: ```json { "id": "0001", "name": "Cake", "score": 0.55, "status": { "kind": "D" } } ``` **Affected Versions:** Ballerina 2201.3.2 Plugin 3.3.8
1.0
[Record Editor] Failed to create separate record definitions from a give JSON - **Description:** $Subject ![record-editor-seperate-records](https://user-images.githubusercontent.com/26219651/216887008-8c116715-84a0-4f86-8d6d-cc260cecff32.gif) sample JSON: ```json { "id": "0001", "name": "Cake", "score": 0.55, "status": { "kind": "D" } } ``` **Affected Versions:** Ballerina 2201.3.2 Plugin 3.3.8
non_process
failed to create separate record definitions from a give json description subject sample json json id name cake score status kind d affected versions ballerina plugin
0
157,706
19,981,583,564
IssuesEvent
2022-01-30 01:01:20
LancelotLiu/CAP4
https://api.github.com/repos/LancelotLiu/CAP4
opened
CVE-2019-3774 (High) detected in spring-batch-infrastructure-3.0.7.RELEASE.jar
security vulnerability
## CVE-2019-3774 - High Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>spring-batch-infrastructure-3.0.7.RELEASE.jar</b></p></summary> <p>Spring Batch Infrastructure</p> <p>Library home page: <a href="http://projects.spring.io/spring-batch/">http://projects.spring.io/spring-batch/</a></p> <p>Path to dependency file: /cap-batch/pom.xml</p> <p>Path to vulnerable library: /129160510_CTWWRC/downloadResource_LBGVMV/20220129160521/spring-batch-infrastructure-3.0.7.RELEASE.jar</p> <p> Dependency Hierarchy: - :x: **spring-batch-infrastructure-3.0.7.RELEASE.jar** (Vulnerable Library) <p>Found in base branch: <b>develop</b></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary> <p> Spring Batch versions 3.0.9, 4.0.1, 4.1.0, and older unsupported versions, were susceptible to XML External Entity Injection (XXE) when receiving XML data from untrusted sources. <p>Publish Date: 2019-01-18 <p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2019-3774>CVE-2019-3774</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>9.8</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: Low - Privileges Required: None - User Interaction: None - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: High - Integrity Impact: High - Availability Impact: High </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. 
</p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://pivotal.io/security/cve-2019-3774">https://pivotal.io/security/cve-2019-3774</a></p> <p>Release Date: 2020-06-29</p> <p>Fix Resolution: 3.0.10.RELEASE;4.0.2.RELEASE;4.1.1.RELEASE</p> </p> </details> <p></p> *** Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
True
CVE-2019-3774 (High) detected in spring-batch-infrastructure-3.0.7.RELEASE.jar - ## CVE-2019-3774 - High Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>spring-batch-infrastructure-3.0.7.RELEASE.jar</b></p></summary> <p>Spring Batch Infrastructure</p> <p>Library home page: <a href="http://projects.spring.io/spring-batch/">http://projects.spring.io/spring-batch/</a></p> <p>Path to dependency file: /cap-batch/pom.xml</p> <p>Path to vulnerable library: /129160510_CTWWRC/downloadResource_LBGVMV/20220129160521/spring-batch-infrastructure-3.0.7.RELEASE.jar</p> <p> Dependency Hierarchy: - :x: **spring-batch-infrastructure-3.0.7.RELEASE.jar** (Vulnerable Library) <p>Found in base branch: <b>develop</b></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary> <p> Spring Batch versions 3.0.9, 4.0.1, 4.1.0, and older unsupported versions, were susceptible to XML External Entity Injection (XXE) when receiving XML data from untrusted sources. <p>Publish Date: 2019-01-18 <p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2019-3774>CVE-2019-3774</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>9.8</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: Low - Privileges Required: None - User Interaction: None - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: High - Integrity Impact: High - Availability Impact: High </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. 
</p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://pivotal.io/security/cve-2019-3774">https://pivotal.io/security/cve-2019-3774</a></p> <p>Release Date: 2020-06-29</p> <p>Fix Resolution: 3.0.10.RELEASE;4.0.2.RELEASE;4.1.1.RELEASE</p> </p> </details> <p></p> *** Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
non_process
cve high detected in spring batch infrastructure release jar cve high severity vulnerability vulnerable library spring batch infrastructure release jar spring batch infrastructure library home page a href path to dependency file cap batch pom xml path to vulnerable library ctwwrc downloadresource lbgvmv spring batch infrastructure release jar dependency hierarchy x spring batch infrastructure release jar vulnerable library found in base branch develop vulnerability details spring batch versions and older unsupported versions were susceptible to xml external entity injection xxe when receiving xml data from untrusted sources publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction none scope unchanged impact metrics confidentiality impact high integrity impact high availability impact high for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution release release release step up your open source security game with whitesource
0
7,438
10,551,248,317
IssuesEvent
2019-10-03 12:58:17
zephyrproject-rtos/zephyr
https://api.github.com/repos/zephyrproject-rtos/zephyr
closed
2.0 Release Checklist
Meta area: Process
Add items as required -------- - [x] Major enhancements: - [x] BLE Split Link Layer: #12681 - [x] Support for PPP protocol: #14034 - [x] Initial support for ARM Cortex-R: #9316 - [x] Initial support for TEE for ARC: #9341 - [x] A new DTS/binding parser: #17660 - [x] External modules into standalone repositories: #16454 - [x] ~~Make newlib libc the default c library: #3102 (Pushed to 2.1.0)~~ - [x] Pre RC1 Steps - [x] Verify that all external components and external dependencies are up to date, for example - mbedTLS - tinycrypt - FatFS - [x] Check known vulnerabilities, and fix exploitable vulnerabilities or verify them as un-exploitable. See https://github.com/zephyrproject-rtos/zephyr/wiki/Release-procedure for tagging details -------- - [x] Release Notes WIP - [x] Create draft doc/release-notes-2.0.rst with feature summary from https://www.zephyrproject.org/developers/#releases-overview - [x] Update doc/release-notes-2.0.rst detailed sections (relative to 1.14.0 release) as important issues are addressed (subsystem owners) -------- - [x] Release data - [x] Update version in VERSION - [x] Run sanitycheck with --all --release and checking resulting file scripts/sanity_chk/sanity_last_release.csv -------- - [x] Finalize Release Notes - [x] Add list of GitHub issues (spell check issue list and fix in GitHub as needed) - [x] Review doc/release-notes/release-notes-2.0.rst overview summary and details sections (see Sign-off below) - [x] Update doc/release-notes/index.rst (add reference to 2.0 notes) - [x] Update doc/index.rst (add reference to doc/2.0.0) - [x] Update doc/conf.py (add version to pick list) - [x] Update doc/LICENSING.rst for new components that do not have a license - [x] Create release and add notes on https://github.com/zephyrproject-rtos/zephyr/releases - [x] Final tagged release as zephyr-v2.0.0 (with release notes) - [x] Update https://www.zephyrproject.org/developers/#releases-overview - [x] Update https://www.zephyrproject.org/developers/#downloads 
(and git checkout version example) - [x] Update doc generation (in future should be automatic based on tag) ping @nashif - [x] Email devel@lists.zephyrproject.org & announce@lists.zephyrproject.org lists about release -------- - [x] Final Release Notes Sign-off - Subsystems - [x] Networking - @jukkar - [x] Bluetooth - @jhedberg , @carlescufi - Architectures - [x] ARC - @ruuddw - [x] ARM - @galak, @MaureenHelm - [x] X86 - @nashif, @andrewboie - [x] Xtensa, RISCV32, NIOS-II - @nashif, @andrewboie - [x] Security - @d3zd3z -------- - [x] After Release - [x] Update patchlevel in Makefile (PATCHLEVEL = 99) - [x] Marketing Blog for announcing release (https://www.zephyrproject.org/blog) - [x] Create branch for 2.0.x releases, update github protections for branch - [x] Create checklist issue for next release - [x] Add a comment to this issue linking to the next release checklist issue - [x] Update https://www.zephyrproject.org with latest stats from Github [@dbkinder] - [x] Move all PRs/Issues in the 2.0 milestone to the next milestone - [x] Close v2.0.0 milestone - [x] Create v2.0.1 milestone
1.0
2.0 Release Checklist - Add items as required -------- - [x] Major enhancements: - [x] BLE Split Link Layer: #12681 - [x] Support for PPP protocol: #14034 - [x] Initial support for ARM Cortex-R: #9316 - [x] Initial support for TEE for ARC: #9341 - [x] A new DTS/binding parser: #17660 - [x] External modules into standalone repositories: #16454 - [x] ~~Make newlib libc the default c library: #3102 (Pushed to 2.1.0)~~ - [x] Pre RC1 Steps - [x] Verify that all external components and external dependencies are up to date, for example - mbedTLS - tinycrypt - FatFS - [x] Check known vulnerabilities, and fix exploitable vulnerabilities or verify them as un-exploitable. See https://github.com/zephyrproject-rtos/zephyr/wiki/Release-procedure for tagging details -------- - [x] Release Notes WIP - [x] Create draft doc/release-notes-2.0.rst with feature summary from https://www.zephyrproject.org/developers/#releases-overview - [x] Update doc/release-notes-2.0.rst detailed sections (relative to 1.14.0 release) as important issues are addressed (subsystem owners) -------- - [x] Release data - [x] Update version in VERSION - [x] Run sanitycheck with --all --release and checking resulting file scripts/sanity_chk/sanity_last_release.csv -------- - [x] Finalize Release Notes - [x] Add list of GitHub issues (spell check issue list and fix in GitHub as needed) - [x] Review doc/release-notes/release-notes-2.0.rst overview summary and details sections (see Sign-off below) - [x] Update doc/release-notes/index.rst (add reference to 2.0 notes) - [x] Update doc/index.rst (add reference to doc/2.0.0) - [x] Update doc/conf.py (add version to pick list) - [x] Update doc/LICENSING.rst for new components that do not have a license - [x] Create release and add notes on https://github.com/zephyrproject-rtos/zephyr/releases - [x] Final tagged release as zephyr-v2.0.0 (with release notes) - [x] Update https://www.zephyrproject.org/developers/#releases-overview - [x] Update 
https://www.zephyrproject.org/developers/#downloads (and git checkout version example) - [x] Update doc generation (in future should be automatic based on tag) ping @nashif - [x] Email devel@lists.zephyrproject.org & announce@lists.zephyrproject.org lists about release -------- - [x] Final Release Notes Sign-off - Subsystems - [x] Networking - @jukkar - [x] Bluetooth - @jhedberg , @carlescufi - Architectures - [x] ARC - @ruuddw - [x] ARM - @galak, @MaureenHelm - [x] X86 - @nashif, @andrewboie - [x] Xtensa, RISCV32, NIOS-II - @nashif, @andrewboie - [x] Security - @d3zd3z -------- - [x] After Release - [x] Update patchlevel in Makefile (PATCHLEVEL = 99) - [x] Marketing Blog for announcing release (https://www.zephyrproject.org/blog) - [x] Create branch for 2.0.x releases, update github protections for branch - [x] Create checklist issue for next release - [x] Add a comment to this issue linking to the next release checklist issue - [x] Update https://www.zephyrproject.org with latest stats from Github [@dbkinder] - [x] Move all PRs/Issues in the 2.0 milestone to the next milestone - [x] Close v2.0.0 milestone - [x] Create v2.0.1 milestone
process
release checklist add items as required major enhancements ble split link layer support for ppp protocol initial support for arm cortex r initial support for tee for arc a new dts binding parser external modules into standalone repositories make newlib libc the default c library pushed to pre steps verify that all external components and external dependencies are up to date for example mbedtls tinycrypt fatfs check known vulnerabilities and fix exploitable vulnerabilities or verify them as un exploitable see for tagging details release notes wip create draft doc release notes rst with feature summary from update doc release notes rst detailed sections relative to release as important issues are addressed subsystem owners release data update version in version run sanitycheck with all release and checking resulting file scripts sanity chk sanity last release csv finalize release notes add list of github issues spell check issue list and fix in github as needed review doc release notes release notes rst overview summary and details sections see sign off below update doc release notes index rst add reference to notes update doc index rst add reference to doc update doc conf py add version to pick list update doc licensing rst for new components that do not have a license create release and add notes on final tagged release as zephyr with release notes update update and git checkout version example update doc generation in future should be automatic based on tag ping nashif email devel lists zephyrproject org announce lists zephyrproject org lists about release final release notes sign off subsystems networking jukkar bluetooth jhedberg carlescufi architectures arc ruuddw arm galak maureenhelm nashif andrewboie xtensa nios ii nashif andrewboie security after release update patchlevel in makefile patchlevel marketing blog for announcing release create branch for x releases update github protections for branch create checklist issue for next release add a comment to this 
issue linking to the next release checklist issue update with latest stats from github move all prs issues in the milestone to the next milestone close milestone create milestone
1
22,258
30,809,404,307
IssuesEvent
2023-08-01 09:23:01
apache/arrow-rs
https://api.github.com/repos/apache/arrow-rs
closed
`ListArray::try_new` disallows null elements if `field.is_nullable() == false`
bug development-process
**Describe the bug** Even if a field is non-nullable in its schema definition, it's only semantic and I think this should not [prevent](https://github.com/apache/arrow-rs/blob/414235e7630d05cccf0b9f5032ebfc0858b8ae5b/arrow-array/src/array/list_array.rs#L126-L132) the creation of arrays containing null values. Null values in semantically non-nullable fields may arise from, for example, being nested inside another nullable field (as in <https://github.com/apache/arrow-rs/issues/3900>) or a sparse union field. [Arrow Columnar Format](https://arrow.apache.org/docs/format/Columnar.html#schema-message): > The `Field` Flatbuffers type contains the metadata for a single array. This includes: \[...\] Whether the field is semantically nullable. While this has no bearing on the array’s physical layout, many systems distinguish nullable and non-nullable fields and we want to allow them to preserve this metadata to enable faithful schema round trips. **To Reproduce** ```rust let field = Arc::new(Field::new("element", DataType::Int32, false)); let offsets = OffsetBuffer::new(vec![0, 1, 4, 5].into()); let values = new_null_array(&DataType::Int32, 5); ListArray::new(field, offsets, values, None); ``` The above code panics with message ``called `Result::unwrap()` on an `Err` value: InvalidArgumentError("Non-nullable field of ListArray \"element\" cannot contain nulls")`` **Expected behavior** Not panicking **Additional context** <!-- Add any other context about the problem here. -->
1.0
`ListArray::try_new` disallows null elements if `field.is_nullable() == false` - **Describe the bug** Even if a field is non-nullable in its schema definition, it's only semantic and I think this should not [prevent](https://github.com/apache/arrow-rs/blob/414235e7630d05cccf0b9f5032ebfc0858b8ae5b/arrow-array/src/array/list_array.rs#L126-L132) the creation of arrays containing null values. Null values in semantically non-nullable fields may arise from, for example, being nested inside another nullable field (as in <https://github.com/apache/arrow-rs/issues/3900>) or a sparse union field. [Arrow Columnar Format](https://arrow.apache.org/docs/format/Columnar.html#schema-message): > The `Field` Flatbuffers type contains the metadata for a single array. This includes: \[...\] Whether the field is semantically nullable. While this has no bearing on the array’s physical layout, many systems distinguish nullable and non-nullable fields and we want to allow them to preserve this metadata to enable faithful schema round trips. **To Reproduce** ```rust let field = Arc::new(Field::new("element", DataType::Int32, false)); let offsets = OffsetBuffer::new(vec![0, 1, 4, 5].into()); let values = new_null_array(&DataType::Int32, 5); ListArray::new(field, offsets, values, None); ``` The above code panics with message ``called `Result::unwrap()` on an `Err` value: InvalidArgumentError("Non-nullable field of ListArray \"element\" cannot contain nulls")`` **Expected behavior** Not panicking **Additional context** <!-- Add any other context about the problem here. -->
process
listarray try new disallows null elements if field is nullable false describe the bug even if a field is non nullable in its schema definition it s only semantic and i think this should not the creation of arrays containing null values null values in semantically non nullable fields may arise from for example being nested inside another nullable field as in or a sparse union field the field flatbuffers type contains the metadata for a single array this includes whether the field is semantically nullable while this has no bearing on the array’s physical layout many systems distinguish nullable and non nullable fields and we want to allow them to preserve this metadata to enable faithful schema round trips to reproduce rust let field arc new field new element datatype false let offsets offsetbuffer new vec into let values new null array datatype listarray new field offsets values none the above code panics with message called result unwrap on an err value invalidargumenterror non nullable field of listarray element cannot contain nulls expected behavior not panicking additional context add any other context about the problem here
1
41,824
5,389,799,921
IssuesEvent
2017-02-25 06:57:48
brave/browser-laptop
https://api.github.com/repos/brave/browser-laptop
closed
add windows specific font CSS for URL text and tab text
design polish QA/steps-specified release-notes/include
I found some changes needed to improve legibility and sync the styles between mac and windows. With the changes below, we will have a match between OSs in the URL and Tab text. ![image](https://cloud.githubusercontent.com/assets/13509546/20289876/4a82ec22-aa91-11e6-989e-e224d6d5110d.png) (note: these are for WINDOWS CSS only) ## URL - increase font-weight to 500 - move up 1 px by reducing top margin ![image](https://cloud.githubusercontent.com/assets/13509546/20289686/0a6fe8fc-aa90-11e6-89ff-81588f08a790.png) **_update_**: this was fixed with https://github.com/brave/browser-laptop/pull/6848 ## tab font - change font weight to 500 - change font size to 12px - change height to 18px - change color to match URL color ![image](https://cloud.githubusercontent.com/assets/13509546/20289749/4da9be72-aa90-11e6-8948-7df86e9acf5c.png) ![image](https://cloud.githubusercontent.com/assets/13509546/20289763/6b373e6a-aa90-11e6-89e3-e756fcecc1cd.png) ## Test plan: ### URL bar 1. Open the browser on Windows 2. Make sure the URL bar looks like the mockup provided by @bradleyrichter 3. Open https://www.apple.com 4. Make sure the margin exists between the lock icon and the URL 5. Open http://apple.com (not HTTPS) 6. Make sure the same margin exists between the unlock icon and the URL 7. Input ":g" on the URL bar 8. Make sure the same margin exists between the Google icon and the search queries
1.0
add windows specific font CSS for URL text and tab text - I found some changes needed to improve legibility and sync the styles between mac and windows. With the changes below, we will have a match between OSs in the URL and Tab text. ![image](https://cloud.githubusercontent.com/assets/13509546/20289876/4a82ec22-aa91-11e6-989e-e224d6d5110d.png) (note: these are for WINDOWS CSS only) ## URL - increase font-weight to 500 - move up 1 px by reducing top margin ![image](https://cloud.githubusercontent.com/assets/13509546/20289686/0a6fe8fc-aa90-11e6-89ff-81588f08a790.png) **_update_**: this was fixed with https://github.com/brave/browser-laptop/pull/6848 ## tab font - change font weight to 500 - change font size to 12px - change height to 18px - change color to match URL color ![image](https://cloud.githubusercontent.com/assets/13509546/20289749/4da9be72-aa90-11e6-8948-7df86e9acf5c.png) ![image](https://cloud.githubusercontent.com/assets/13509546/20289763/6b373e6a-aa90-11e6-89e3-e756fcecc1cd.png) ## Test plan: ### URL bar 1. Open the browser on Windows 2. Make sure the URL bar looks like the mockup provided by @bradleyrichter 3. Open https://www.apple.com 4. Make sure the margin exists between the lock icon and the URL 5. Open http://apple.com (not HTTPS) 6. Make sure the same margin exists between the unlock icon and the URL 7. Input ":g" on the URL bar 8. Make sure the same margin exists between the Google icon and the search queries
non_process
add windows specific font css for url text and tab text i found some changes needed to improve legibility and sync the styles between mac and windows with the changes below we will have a match between oss in the url and tab text note these are for windows css only url increase font weight to move up px by reducing top margin update this was fixed with tab font change font weight to change font size to change height to change color to match url color test plan url bar open the browser on windows make sure the url bar looks like the mockup provided by bradleyrichter open make sure the margin exists between the lock icon and the url open not https make sure the same margin exists between the unlock icon and the url input g on the url bar make sure the same margin exists between the google icon and the search queries
0
10,825
13,609,547,535
IssuesEvent
2020-09-23 05:33:33
GoogleCloudPlatform/java-docs-samples
https://api.github.com/repos/GoogleCloudPlatform/java-docs-samples
closed
Dependency Dashboard
samples type: process
This issue contains a list of Renovate updates and their statuses. ## Repository problems These problems occurred while renovating this repository. - WARN: Gradle extraction failed ## Open These updates have all been created already. Click a checkbox below to force a retry/rebase of any. - [ ] <!-- rebase-branch=renovate/com.google.apis-google-api-services-healthcare-1.x -->chore(deps): update dependency com.google.apis:google-api-services-healthcare to v1-rev20200909-1.30.10 - [ ] <!-- rebase-branch=renovate/com.google.apis-google-api-services-cloudresourcemanager-2.x -->chore(deps): update dependency com.google.apis:google-api-services-cloudresourcemanager to v2 - [ ] <!-- rebase-branch=renovate/com.googlecode.objectify-objectify-6.x -->chore(deps): update dependency com.googlecode.objectify:objectify to v6 - [ ] <!-- rebase-branch=renovate/javax.servlet-javax.servlet-api-4.x -->chore(deps): update dependency javax.servlet:javax.servlet-api to v4 - [ ] <!-- rebase-all-open-prs -->**Check this option to rebase all the above open PRs at once** --- - [ ] <!-- manual job -->Check this box to trigger a request for Renovate to run again on this repository
1.0
Dependency Dashboard - This issue contains a list of Renovate updates and their statuses. ## Repository problems These problems occurred while renovating this repository. - WARN: Gradle extraction failed ## Open These updates have all been created already. Click a checkbox below to force a retry/rebase of any. - [ ] <!-- rebase-branch=renovate/com.google.apis-google-api-services-healthcare-1.x -->chore(deps): update dependency com.google.apis:google-api-services-healthcare to v1-rev20200909-1.30.10 - [ ] <!-- rebase-branch=renovate/com.google.apis-google-api-services-cloudresourcemanager-2.x -->chore(deps): update dependency com.google.apis:google-api-services-cloudresourcemanager to v2 - [ ] <!-- rebase-branch=renovate/com.googlecode.objectify-objectify-6.x -->chore(deps): update dependency com.googlecode.objectify:objectify to v6 - [ ] <!-- rebase-branch=renovate/javax.servlet-javax.servlet-api-4.x -->chore(deps): update dependency javax.servlet:javax.servlet-api to v4 - [ ] <!-- rebase-all-open-prs -->**Check this option to rebase all the above open PRs at once** --- - [ ] <!-- manual job -->Check this box to trigger a request for Renovate to run again on this repository
process
dependency dashboard this issue contains a list of renovate updates and their statuses repository problems these problems occurred while renovating this repository warn gradle extraction failed open these updates have all been created already click a checkbox below to force a retry rebase of any chore deps update dependency com google apis google api services healthcare to chore deps update dependency com google apis google api services cloudresourcemanager to chore deps update dependency com googlecode objectify objectify to chore deps update dependency javax servlet javax servlet api to check this option to rebase all the above open prs at once check this box to trigger a request for renovate to run again on this repository
1
22,133
30,679,264,109
IssuesEvent
2023-07-26 08:03:37
MicrosoftDocs/azure-docs
https://api.github.com/repos/MicrosoftDocs/azure-docs
closed
Not able to import python 2 package in tar.gz format
automation/svc triaged awaiting-product-team-response assigned-to-author product-issue process-automation/subsvc Pri2 escalated-product-team
When I want to upload python 2 package in tar.gz format, and I browse to a file location on my computer, the file is grayed out and I am not able to select that file. The file is less than 100MB. The documentation clearly says that it should be possible to upload a tar.gz file format. --- #### Document Details ⚠ *Do not edit this section. It is required for docs.microsoft.com ➟ GitHub issue linking.* * ID: ba1e6e7b-0891-1ec9-8811-ee6d036104a0 * Version Independent ID: bfff0c73-1853-f501-49d0-7a2b2a56070b * Content: [Manage Python 2 packages in Azure Automation](https://docs.microsoft.com/en-us/azure/automation/python-packages) * Content Source: [articles/automation/python-packages.md](https://github.com/Microsoft/azure-docs/blob/master/articles/automation/python-packages.md) * Service: **automation** * Sub-service: **process-automation** * GitHub Login: @MGoedtel * Microsoft Alias: **magoedte**
1.0
Not able to import python 2 package in tar.gz format - When I want to upload python 2 package in tar.gz format, and I browse to a file location on my computer, the file is grayed out and I am not able to select that file. The file is less than 100MB. The documentation clearly says that it should be possible to upload a tar.gz file format. --- #### Document Details ⚠ *Do not edit this section. It is required for docs.microsoft.com ➟ GitHub issue linking.* * ID: ba1e6e7b-0891-1ec9-8811-ee6d036104a0 * Version Independent ID: bfff0c73-1853-f501-49d0-7a2b2a56070b * Content: [Manage Python 2 packages in Azure Automation](https://docs.microsoft.com/en-us/azure/automation/python-packages) * Content Source: [articles/automation/python-packages.md](https://github.com/Microsoft/azure-docs/blob/master/articles/automation/python-packages.md) * Service: **automation** * Sub-service: **process-automation** * GitHub Login: @MGoedtel * Microsoft Alias: **magoedte**
process
not able to import python package in tar gz format when i want to upload python package in tar gz format and i browse to a file location on my computer the file is grayed out and i am not able to select that file the file is less than the documentation clearly says that it should be possible to upload a tar gz file format document details ⚠ do not edit this section it is required for docs microsoft com ➟ github issue linking id version independent id content content source service automation sub service process automation github login mgoedtel microsoft alias magoedte
1
2,272
5,103,581,222
IssuesEvent
2017-01-04 21:52:22
Jarvvski/CavTools
https://api.github.com/repos/Jarvvski/CavTools
closed
Enlistment Form Update
Process Flow Request
<h2>Problem</h2><br />Please Add [PC] next to battlefield 1<br /><hr><h2>Reason</h2><br />We had a few recruits want to join who only play on PS4.<br><br>-First Lieutenant Lombardi.M
1.0
Enlistment Form Update - <h2>Problem</h2><br />Please Add [PC] next to battlefield 1<br /><hr><h2>Reason</h2><br />We had a few recruits want to join who only play on PS4.<br><br>-First Lieutenant Lombardi.M
process
enlistment form update problem please add next to battlefield reason we had a few recruits want to join who only play on first lieutenant lombardi m
1
3,835
6,802,432,561
IssuesEvent
2017-11-02 20:11:19
gratipay/inside.gratipay.com
https://api.github.com/repos/gratipay/inside.gratipay.com
closed
Review and document backup policies
Governance & Process
Reticketed from #468. My current practice is to store the backups taken during payday on an external hard drive. What else do we have going on? What should we do?
1.0
Review and document backup policies - Reticketed from #468. My current practice is to store the backups taken during payday on an external hard drive. What else do we have going on? What should we do?
process
review and document backup policies reticketed from my current practice is to store the backups taken during payday on an external hard drive what else do we have going on what should we do
1
14,197
17,099,223,160
IssuesEvent
2021-07-09 08:51:37
prisma/prisma
https://api.github.com/repos/prisma/prisma
opened
Error: Error in migration engine. Reason: [migration-engine/connectors/sql-migration-connector/src/sql_schema_calculator/sql_schema_calculator_flavour.rs:34:9] internal error: entered unreachable code: unreachable enum_column_type
bug/1-repro-available kind/bug process/candidate team/migrations
<!-- If required, please update the title to be clear and descriptive --> Command: `prisma migrate dev --name unique` Version: `2.26.0` Binary Version: `9b816b3aa13cc270074f172f30d6eda8a8ce867d` Report: https://prisma-errors.netlify.app/report/13408 OS: `x64 darwin 20.3.0` JS Stacktrace: ``` Error: Error in migration engine. Reason: [migration-engine/connectors/sql-migration-connector/src/sql_schema_calculator/sql_schema_calculator_flavour.rs:34:9] internal error: entered unreachable code: unreachable enum_column_type Please create an issue with your `schema.prisma` at https://github.com/prisma/prisma/issues/new at ChildProcess.<anonymous> (node_modules/prisma/build/index.js:55980:23) at ChildProcess.emit (events.js:315:20) at ChildProcess.EventEmitter.emit (domain.js:486:12) at Process.ChildProcess._handle.onexit (internal/child_process.js:277:12) ``` Rust Stacktrace: ``` [migration-engine/connectors/sql-migration-connector/src/sql_schema_calculator/sql_schema_calculator_flavour.rs:34:9] internal error: entered unreachable code: unreachable enum_column_type ```
1.0
Error: Error in migration engine. Reason: [migration-engine/connectors/sql-migration-connector/src/sql_schema_calculator/sql_schema_calculator_flavour.rs:34:9] internal error: entered unreachable code: unreachable enum_column_type - <!-- If required, please update the title to be clear and descriptive --> Command: `prisma migrate dev --name unique` Version: `2.26.0` Binary Version: `9b816b3aa13cc270074f172f30d6eda8a8ce867d` Report: https://prisma-errors.netlify.app/report/13408 OS: `x64 darwin 20.3.0` JS Stacktrace: ``` Error: Error in migration engine. Reason: [migration-engine/connectors/sql-migration-connector/src/sql_schema_calculator/sql_schema_calculator_flavour.rs:34:9] internal error: entered unreachable code: unreachable enum_column_type Please create an issue with your `schema.prisma` at https://github.com/prisma/prisma/issues/new at ChildProcess.<anonymous> (node_modules/prisma/build/index.js:55980:23) at ChildProcess.emit (events.js:315:20) at ChildProcess.EventEmitter.emit (domain.js:486:12) at Process.ChildProcess._handle.onexit (internal/child_process.js:277:12) ``` Rust Stacktrace: ``` [migration-engine/connectors/sql-migration-connector/src/sql_schema_calculator/sql_schema_calculator_flavour.rs:34:9] internal error: entered unreachable code: unreachable enum_column_type ```
process
error error in migration engine reason internal error entered unreachable code unreachable enum column type command prisma migrate dev name unique version binary version report os darwin js stacktrace error error in migration engine reason internal error entered unreachable code unreachable enum column type please create an issue with your schema prisma at at childprocess node modules prisma build index js at childprocess emit events js at childprocess eventemitter emit domain js at process childprocess handle onexit internal child process js rust stacktrace internal error entered unreachable code unreachable enum column type
1
206,083
23,365,630,296
IssuesEvent
2022-08-10 15:08:18
pactflow/demo-consumer-node-docker
https://api.github.com/repos/pactflow/demo-consumer-node-docker
opened
CVE-2021-3807 (High) detected in multiple libraries
security vulnerability
## CVE-2021-3807 - High Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Libraries - <b>ansi-regex-4.1.0.tgz</b>, <b>ansi-regex-3.0.0.tgz</b>, <b>ansi-regex-5.0.0.tgz</b></p></summary> <p> <details><summary><b>ansi-regex-4.1.0.tgz</b></p></summary> <p>Regular expression for matching ANSI escape codes</p> <p>Library home page: <a href="https://registry.npmjs.org/ansi-regex/-/ansi-regex-4.1.0.tgz">https://registry.npmjs.org/ansi-regex/-/ansi-regex-4.1.0.tgz</a></p> <p>Path to dependency file: /package.json</p> <p>Path to vulnerable library: /node_modules/strip-ansi/node_modules/ansi-regex/package.json,/node_modules/react-dev-utils/node_modules/ansi-regex/package.json,/node_modules/pretty-format/node_modules/ansi-regex/package.json</p> <p> Dependency Hierarchy: - react-scripts-3.4.3.tgz (Root Library) - react-dev-utils-10.2.1.tgz - inquirer-7.0.4.tgz - strip-ansi-5.2.0.tgz - :x: **ansi-regex-4.1.0.tgz** (Vulnerable Library) </details> <details><summary><b>ansi-regex-3.0.0.tgz</b></p></summary> <p>Regular expression for matching ANSI escape codes</p> <p>Library home page: <a href="https://registry.npmjs.org/ansi-regex/-/ansi-regex-3.0.0.tgz">https://registry.npmjs.org/ansi-regex/-/ansi-regex-3.0.0.tgz</a></p> <p>Path to dependency file: /package.json</p> <p>Path to vulnerable library: /node_modules/string-length/node_modules/ansi-regex/package.json</p> <p> Dependency Hierarchy: - react-scripts-3.4.3.tgz (Root Library) - jest-watch-typeahead-0.4.2.tgz - jest-watcher-24.9.0.tgz - string-length-2.0.0.tgz - strip-ansi-4.0.0.tgz - :x: **ansi-regex-3.0.0.tgz** (Vulnerable Library) </details> <details><summary><b>ansi-regex-5.0.0.tgz</b></p></summary> <p>Regular expression for matching ANSI escape codes</p> <p>Library home page: <a 
href="https://registry.npmjs.org/ansi-regex/-/ansi-regex-5.0.0.tgz">https://registry.npmjs.org/ansi-regex/-/ansi-regex-5.0.0.tgz</a></p> <p>Path to dependency file: /package.json</p> <p>Path to vulnerable library: /node_modules/ansi-regex/package.json,/node_modules/react-dev-utils/node_modules/strip-ansi/node_modules/ansi-regex/package.json</p> <p> Dependency Hierarchy: - react-scripts-3.4.3.tgz (Root Library) - react-dev-utils-10.2.1.tgz - strip-ansi-6.0.0.tgz - :x: **ansi-regex-5.0.0.tgz** (Vulnerable Library) </details> <p>Found in HEAD commit: <a href="https://github.com/pactflow/demo-consumer-node-docker/commit/32a10c47288f401697886a0c3a66f27a194c5a60">32a10c47288f401697886a0c3a66f27a194c5a60</a></p> <p>Found in base branch: <b>master</b></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary> <p> ansi-regex is vulnerable to Inefficient Regular Expression Complexity <p>Publish Date: 2021-09-17 <p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2021-3807>CVE-2021-3807</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>7.5</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: Low - Privileges Required: None - User Interaction: None - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: None - Integrity Impact: None - Availability Impact: High </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. 
</p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://huntr.dev/bounties/5b3cf33b-ede0-4398-9974-800876dfd994/">https://huntr.dev/bounties/5b3cf33b-ede0-4398-9974-800876dfd994/</a></p> <p>Release Date: 2021-09-17</p> <p>Fix Resolution (ansi-regex): 4.1.1</p> <p>Direct dependency fix Resolution (react-scripts): 5.0.0</p><p>Fix Resolution (ansi-regex): 3.0.1</p> <p>Direct dependency fix Resolution (react-scripts): 3.4.4</p><p>Fix Resolution (ansi-regex): 5.0.1</p> <p>Direct dependency fix Resolution (react-scripts): 3.4.4</p> </p> </details> <p></p> *** <!-- REMEDIATE-OPEN-PR-START --> - [ ] Check this box to open an automated fix PR <!-- REMEDIATE-OPEN-PR-END -->
True
CVE-2021-3807 (High) detected in multiple libraries - ## CVE-2021-3807 - High Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Libraries - <b>ansi-regex-4.1.0.tgz</b>, <b>ansi-regex-3.0.0.tgz</b>, <b>ansi-regex-5.0.0.tgz</b></p></summary> <p> <details><summary><b>ansi-regex-4.1.0.tgz</b></p></summary> <p>Regular expression for matching ANSI escape codes</p> <p>Library home page: <a href="https://registry.npmjs.org/ansi-regex/-/ansi-regex-4.1.0.tgz">https://registry.npmjs.org/ansi-regex/-/ansi-regex-4.1.0.tgz</a></p> <p>Path to dependency file: /package.json</p> <p>Path to vulnerable library: /node_modules/strip-ansi/node_modules/ansi-regex/package.json,/node_modules/react-dev-utils/node_modules/ansi-regex/package.json,/node_modules/pretty-format/node_modules/ansi-regex/package.json</p> <p> Dependency Hierarchy: - react-scripts-3.4.3.tgz (Root Library) - react-dev-utils-10.2.1.tgz - inquirer-7.0.4.tgz - strip-ansi-5.2.0.tgz - :x: **ansi-regex-4.1.0.tgz** (Vulnerable Library) </details> <details><summary><b>ansi-regex-3.0.0.tgz</b></p></summary> <p>Regular expression for matching ANSI escape codes</p> <p>Library home page: <a href="https://registry.npmjs.org/ansi-regex/-/ansi-regex-3.0.0.tgz">https://registry.npmjs.org/ansi-regex/-/ansi-regex-3.0.0.tgz</a></p> <p>Path to dependency file: /package.json</p> <p>Path to vulnerable library: /node_modules/string-length/node_modules/ansi-regex/package.json</p> <p> Dependency Hierarchy: - react-scripts-3.4.3.tgz (Root Library) - jest-watch-typeahead-0.4.2.tgz - jest-watcher-24.9.0.tgz - string-length-2.0.0.tgz - strip-ansi-4.0.0.tgz - :x: **ansi-regex-3.0.0.tgz** (Vulnerable Library) </details> <details><summary><b>ansi-regex-5.0.0.tgz</b></p></summary> <p>Regular expression for matching ANSI escape codes</p> <p>Library home page: <a 
href="https://registry.npmjs.org/ansi-regex/-/ansi-regex-5.0.0.tgz">https://registry.npmjs.org/ansi-regex/-/ansi-regex-5.0.0.tgz</a></p> <p>Path to dependency file: /package.json</p> <p>Path to vulnerable library: /node_modules/ansi-regex/package.json,/node_modules/react-dev-utils/node_modules/strip-ansi/node_modules/ansi-regex/package.json</p> <p> Dependency Hierarchy: - react-scripts-3.4.3.tgz (Root Library) - react-dev-utils-10.2.1.tgz - strip-ansi-6.0.0.tgz - :x: **ansi-regex-5.0.0.tgz** (Vulnerable Library) </details> <p>Found in HEAD commit: <a href="https://github.com/pactflow/demo-consumer-node-docker/commit/32a10c47288f401697886a0c3a66f27a194c5a60">32a10c47288f401697886a0c3a66f27a194c5a60</a></p> <p>Found in base branch: <b>master</b></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary> <p> ansi-regex is vulnerable to Inefficient Regular Expression Complexity <p>Publish Date: 2021-09-17 <p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2021-3807>CVE-2021-3807</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>7.5</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: Low - Privileges Required: None - User Interaction: None - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: None - Integrity Impact: None - Availability Impact: High </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. 
</p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://huntr.dev/bounties/5b3cf33b-ede0-4398-9974-800876dfd994/">https://huntr.dev/bounties/5b3cf33b-ede0-4398-9974-800876dfd994/</a></p> <p>Release Date: 2021-09-17</p> <p>Fix Resolution (ansi-regex): 4.1.1</p> <p>Direct dependency fix Resolution (react-scripts): 5.0.0</p><p>Fix Resolution (ansi-regex): 3.0.1</p> <p>Direct dependency fix Resolution (react-scripts): 3.4.4</p><p>Fix Resolution (ansi-regex): 5.0.1</p> <p>Direct dependency fix Resolution (react-scripts): 3.4.4</p> </p> </details> <p></p> *** <!-- REMEDIATE-OPEN-PR-START --> - [ ] Check this box to open an automated fix PR <!-- REMEDIATE-OPEN-PR-END -->
non_process
cve high detected in multiple libraries cve high severity vulnerability vulnerable libraries ansi regex tgz ansi regex tgz ansi regex tgz ansi regex tgz regular expression for matching ansi escape codes library home page a href path to dependency file package json path to vulnerable library node modules strip ansi node modules ansi regex package json node modules react dev utils node modules ansi regex package json node modules pretty format node modules ansi regex package json dependency hierarchy react scripts tgz root library react dev utils tgz inquirer tgz strip ansi tgz x ansi regex tgz vulnerable library ansi regex tgz regular expression for matching ansi escape codes library home page a href path to dependency file package json path to vulnerable library node modules string length node modules ansi regex package json dependency hierarchy react scripts tgz root library jest watch typeahead tgz jest watcher tgz string length tgz strip ansi tgz x ansi regex tgz vulnerable library ansi regex tgz regular expression for matching ansi escape codes library home page a href path to dependency file package json path to vulnerable library node modules ansi regex package json node modules react dev utils node modules strip ansi node modules ansi regex package json dependency hierarchy react scripts tgz root library react dev utils tgz strip ansi tgz x ansi regex tgz vulnerable library found in head commit a href found in base branch master vulnerability details ansi regex is vulnerable to inefficient regular expression complexity publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction none scope unchanged impact metrics confidentiality impact none integrity impact none availability impact high for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution ansi regex direct dependency fix resolution 
react scripts fix resolution ansi regex direct dependency fix resolution react scripts fix resolution ansi regex direct dependency fix resolution react scripts check this box to open an automated fix pr
0
186,177
6,734,150,023
IssuesEvent
2017-10-18 17:01:18
geosolutions-it/unesco-ihp
https://api.github.com/repos/geosolutions-it/unesco-ihp
closed
Final check on published Layers
blocked enhancement Priority: Medium question unesco-ihp
"UNESCO-GeoNode-Refactor-2017-Timeline" 'PHASE I'!N11 I changed my email address to my private one (we thought that maybe UNESCO system is filtering notifications email as spamming) and it works! However, for some reasons I received an email saying that "a layer has been updated" when it was actually uploaded, and I received this email 4 times (I received the 4 exact same notifications at the exact same time). This leads me to the following question: now that I am advised when a layer is pending approval, how do I do to make it public on WINS once I have approved it? Other questions: Are all super users able to receive notifications regarding the upload of a new layer/doc that is pending approval? Or is it linked to an other type of status (somewhere in the system, it says that Alessio and I are the ones allowed to be notified when new layers/doc need approval?) All users have access to a notification box where they can choose which item they want to be notified of. Are they only notified of things related to their own layers/maps/docs? For example, when the notification field "a rating was given to a layer" is ticked, is the user only notified when one of his layers/docs/maps is rated? We tested the notifications with a simple user account (gabin.archambault) and a non-unesco email address. He uploaded a layer, that I (chloe.meyer) rated while youssef.filali-meeknassi added a comment. Although all of Gabin's notifications are ticked in the box, he did not recceive any emails advising him that his layer had been rated/commented. Is it normal? On my side, I receieved all the notifications related to these tests we runed, stating that a layer had been rated/updloaded/commented on/etc.
1.0
Final check on published Layers - "UNESCO-GeoNode-Refactor-2017-Timeline" 'PHASE I'!N11 I changed my email address to my private one (we thought that maybe UNESCO system is filtering notifications email as spamming) and it works! However, for some reasons I received an email saying that "a layer has been updated" when it was actually uploaded, and I received this email 4 times (I received the 4 exact same notifications at the exact same time). This leads me to the following question: now that I am advised when a layer is pending approval, how do I do to make it public on WINS once I have approved it? Other questions: Are all super users able to receive notifications regarding the upload of a new layer/doc that is pending approval? Or is it linked to an other type of status (somewhere in the system, it says that Alessio and I are the ones allowed to be notified when new layers/doc need approval?) All users have access to a notification box where they can choose which item they want to be notified of. Are they only notified of things related to their own layers/maps/docs? For example, when the notification field "a rating was given to a layer" is ticked, is the user only notified when one of his layers/docs/maps is rated? We tested the notifications with a simple user account (gabin.archambault) and a non-unesco email address. He uploaded a layer, that I (chloe.meyer) rated while youssef.filali-meeknassi added a comment. Although all of Gabin's notifications are ticked in the box, he did not recceive any emails advising him that his layer had been rated/commented. Is it normal? On my side, I receieved all the notifications related to these tests we runed, stating that a layer had been rated/updloaded/commented on/etc.
non_process
final check on published layers unesco geonode refactor timeline phase i i changed my email address to my private one we thought that maybe unesco system is filtering notifications email as spamming and it works however for some reasons i received an email saying that a layer has been updated when it was actually uploaded and i received this email times i received the exact same notifications at the exact same time this leads me to the following question now that i am advised when a layer is pending approval how do i do to make it public on wins once i have approved it other questions are all super users able to receive notifications regarding the upload of a new layer doc that is pending approval or is it linked to an other type of status somewhere in the system it says that alessio and i are the ones allowed to be notified when new layers doc need approval all users have access to a notification box where they can choose which item they want to be notified of are they only notified of things related to their own layers maps docs for example when the notification field a rating was given to a layer is ticked is the user only notified when one of his layers docs maps is rated we tested the notifications with a simple user account gabin archambault and a non unesco email address he uploaded a layer that i chloe meyer rated while youssef filali meeknassi added a comment although all of gabin s notifications are ticked in the box he did not recceive any emails advising him that his layer had been rated commented is it normal on my side i receieved all the notifications related to these tests we runed stating that a layer had been rated updloaded commented on etc
0
8,368
11,519,598,790
IssuesEvent
2020-02-14 13:10:08
symfony/symfony
https://api.github.com/repos/symfony/symfony
closed
get starttime of process
Feature Process
It would be nice to have a getter for this property https://github.com/symfony/process/blob/master/Process.php#L58 this can help to calculate an ETA
1.0
get starttime of process - It would be nice to have a getter for this property https://github.com/symfony/process/blob/master/Process.php#L58 this can help to calculate an ETA
process
get starttime of process it would be nice to have a getter for this property this can help to calculate an eta
1
3,706
6,731,424,596
IssuesEvent
2017-10-18 07:33:27
nlbdev/pipeline
https://api.github.com/repos/nlbdev/pipeline
closed
braille CSS: "NLB", "Norge 2016", placement / plassering
enhancement pre-processing Priority:2 - Medium
(norwegian) *from Trello-board* NLB-CSS: title-page: De to linjene «NLB» og «Norge 2016» - skal stå uten blank linje mellom (hvis det er vanskelig, kan de stå på samme linje: «NLB, Norge 2016»)
1.0
braille CSS: "NLB", "Norge 2016", placement / plassering - (norwegian) *from Trello-board* NLB-CSS: title-page: De to linjene «NLB» og «Norge 2016» - skal stå uten blank linje mellom (hvis det er vanskelig, kan de stå på samme linje: «NLB, Norge 2016»)
process
braille css nlb norge placement plassering norwegian from trello board nlb css title page de to linjene «nlb» og «norge » skal stå uten blank linje mellom hvis det er vanskelig kan de stå på samme linje «nlb norge »
1
112,905
4,540,335,280
IssuesEvent
2016-09-09 14:21:50
Signbank/NGT-signbank
https://api.github.com/repos/Signbank/NGT-signbank
closed
Create an easy way to hide the search fields
enhancement top priority
A simple button or small triangle at the top left that would hide all of the search page above the search results would be very useful, in order to focus on the list of hits, while still having access to the menu bar and the search boxes there.
1.0
Create an easy way to hide the search fields - A simple button or small triangle at the top left that would hide all of the search page above the search results would be very useful, in order to focus on the list of hits, while still having access to the menu bar and the search boxes there.
non_process
create an easy way to hide the search fields a simple button or small triangle at the top left that would hide all of the search page above the search results would be very useful in order to focus on the list of hits while still having access to the menu bar and the search boxes there
0
19,234
25,387,016,200
IssuesEvent
2022-11-21 22:55:44
open-telemetry/opentelemetry-collector-contrib
https://api.github.com/repos/open-telemetry/opentelemetry-collector-contrib
closed
[processor/transform] Add `drop` action support
enhancement priority:p2 processor/transform
`drop` action is pretty important functionality to replace filter processors. It's already being mentioned in the docs but not implemented yet. Subtasks per data type: - [ ] #13579 - [ ] #13580 - [ ] #13581
1.0
[processor/transform] Add `drop` action support - `drop` action is pretty important functionality to replace filter processors. It's already being mentioned in the docs but not implemented yet. Subtasks per data type: - [ ] #13579 - [ ] #13580 - [ ] #13581
process
add drop action support drop action is pretty important functionality to replace filter processors it s already being mentioned in the docs but not implemented yet subtasks per data type
1
54,963
6,885,643,076
IssuesEvent
2017-11-21 16:42:05
syndesisio/syndesis
https://api.github.com/repos/syndesisio/syndesis
opened
Use Patternfly for form elements
cat/design module/ui module/uxd
Reference: See Forms and Controls section: http://www.patternfly.org/pattern-library/ @seanforyou23 Added this issue to document and track your work, update as needed. Thanks.
1.0
Use Patternfly for form elements - Reference: See Forms and Controls section: http://www.patternfly.org/pattern-library/ @seanforyou23 Added this issue to document and track your work, update as needed. Thanks.
non_process
use patternfly for form elements reference see forms and controls section added this issue to document and track your work update as needed thanks
0
175,607
6,552,522,886
IssuesEvent
2017-09-05 18:41:40
SacredDuckwhale/TotalAP
https://api.github.com/repos/SacredDuckwhale/TotalAP
closed
Scanning of AP items is broken for ruRU clients (?)
info:help-needed module:core priority:high status:needs-confirmation type:bug
Can't confirm this because of the client-side restrictions when using the Russian client, but from a comment on Curse it appears that the scanner errors out. If someone can provide the correct pattern (format) that is in use currently the scanner could be updated accordingly and the validity can be checked via luaunit, wowhead doesn't seem to have the correct format, unfortunately, or maybe I made a mistake somewhere along the way in copying their format and comparing it to some test output displayed ingame from the item links they provided (which only worked for lower AP values, anyway).
1.0
Scanning of AP items is broken for ruRU clients (?) - Can't confirm this because of the client-side restrictions when using the Russian client, but from a comment on Curse it appears that the scanner errors out. If someone can provide the correct pattern (format) that is in use currently the scanner could be updated accordingly and the validity can be checked via luaunit, wowhead doesn't seem to have the correct format, unfortunately, or maybe I made a mistake somewhere along the way in copying their format and comparing it to some test output displayed ingame from the item links they provided (which only worked for lower AP values, anyway).
non_process
scanning of ap items is broken for ruru clients can t confirm this because of the client side restrictions when using the russian client but from a comment on curse it appears that the scanner errors out if someone can provide the correct pattern format that is in use currently the scanner could be updated accordingly and the validity can be checked via luaunit wowhead doesn t seem to have the correct format unfortunately or maybe i made a mistake somewhere along the way in copying their format and comparing it to some test output displayed ingame from the item links they provided which only worked for lower ap values anyway
0
272,059
20,731,556,250
IssuesEvent
2022-03-14 09:55:12
root-project/root
https://api.github.com/repos/root-project/root
closed
[doxy] upgrade to Mathjax3
improvement in:Documentation
### Explain what you would like to see improved Mathjax2 is used by documentation. ### Optional: share how it could be improved Mathjax3 brings some performance improvements. Maybe it can be even downloaded on the fly, rather than storing it as part of the ROOT source code? ### To Reproduce https://root.cern.ch/doc/master/ ### Setup ``` ------------------------------------------------------------------ | Welcome to ROOT 6.27/01 https://root.cern | | (c) 1995-2021, The ROOT Team; conception: R. Brun, F. Rademakers | | Built for linuxx8664gcc on Jan 12 2022, 10:17:19 | | From heads/master@v6-25-01-2870-gdac9b6398d | | With c++ (Ubuntu 8.4.0-1ubuntu1~18.04) 8.4.0 | | Try '.help', '.demo', '.license', '.credits', '.quit'/'.q' | ------------------------------------------------------------------ ``` ### Additional context https://github.com/doxygen/doxygen/issues/9185#issuecomment-1059971803
1.0
[doxy] upgrade to Mathjax3 - ### Explain what you would like to see improved Mathjax2 is used by documentation. ### Optional: share how it could be improved Mathjax3 brings some performance improvements. Maybe it can be even downloaded on the fly, rather than storing it as part of the ROOT source code? ### To Reproduce https://root.cern.ch/doc/master/ ### Setup ``` ------------------------------------------------------------------ | Welcome to ROOT 6.27/01 https://root.cern | | (c) 1995-2021, The ROOT Team; conception: R. Brun, F. Rademakers | | Built for linuxx8664gcc on Jan 12 2022, 10:17:19 | | From heads/master@v6-25-01-2870-gdac9b6398d | | With c++ (Ubuntu 8.4.0-1ubuntu1~18.04) 8.4.0 | | Try '.help', '.demo', '.license', '.credits', '.quit'/'.q' | ------------------------------------------------------------------ ``` ### Additional context https://github.com/doxygen/doxygen/issues/9185#issuecomment-1059971803
non_process
upgrade to explain what you would like to see improved is used by documentation optional share how it could be improved brings some performance improvements maybe it can be even downloaded on the fly rather than storing it as part of the root source code to reproduce setup welcome to root c the root team conception r brun f rademakers built for on jan from heads master with c ubuntu try help demo license credits quit q additional context
0
262,768
19,830,882,299
IssuesEvent
2022-01-20 11:51:42
ngneat/falso
https://api.github.com/repos/ngneat/falso
closed
Interactive Playground within the Docs
documentation enhancement
### Description https://dev.to/mrmuhammadali/live-code-editing-in-docusaurus-ux-at-its-best-2hj1 ### Proposed solution _No response_ ### Alternatives considered _No response_ ### Do you want to create a pull request? No
1.0
Interactive Playground within the Docs - ### Description https://dev.to/mrmuhammadali/live-code-editing-in-docusaurus-ux-at-its-best-2hj1 ### Proposed solution _No response_ ### Alternatives considered _No response_ ### Do you want to create a pull request? No
non_process
interactive playground within the docs description proposed solution no response alternatives considered no response do you want to create a pull request no
0
18,516
24,551,723,146
IssuesEvent
2022-10-12 13:06:42
GoogleCloudPlatform/fda-mystudies
https://api.github.com/repos/GoogleCloudPlatform/fda-mystudies
closed
Display offline indicator for mobile apps
P2 iOS Android Process: Fixed Process: Tested QA Process: Tested dev Process: Enhancement
The mobile app must show a standard offline indicator when the user is not connected. Actions on the screen that cannot be used in such situations must get disabled (e.g. Sign up, profile updates, enrollment). Sections that support offline capability must continue to be enabled for use. Message to be shown on screens that don't have offline capability - You are offline Message to be shown on screens that have offline capability - You are offline and may be missing out on content updates
4.0
Display offline indicator for mobile apps - The mobile app must show a standard offline indicator when the user is not connected. Actions on the screen that cannot be used in such situations must get disabled (e.g. Sign up, profile updates, enrollment). Sections that support offline capability must continue to be enabled for use. Message to be shown on screens that don't have offline capability - You are offline Message to be shown on screens that have offline capability - You are offline and may be missing out on content updates
process
display offline indicator for mobile apps the mobile app must show a standard offline indicator when the user is not connected actions on the screen that cannot be used in such situations must get disabled e g sign up profile updates enrollment sections that support offline capability must continue to be enabled for use message to be shown on screens that don t have offline capability you are offline message to be shown on screens that have offline capability you are offline and may be missing out on content updates
1
163,245
20,324,671,755
IssuesEvent
2022-02-18 03:51:37
TreyM-WSS/terra-clinical
https://api.github.com/repos/TreyM-WSS/terra-clinical
opened
CVE-2020-7789 (Medium) detected in node-notifier-6.0.0.tgz
security vulnerability
## CVE-2020-7789 - Medium Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>node-notifier-6.0.0.tgz</b></p></summary> <p>A Node.js module for sending notifications on native Mac, Windows (post and pre 8) and Linux (or Growl as fallback)</p> <p>Library home page: <a href="https://registry.npmjs.org/node-notifier/-/node-notifier-6.0.0.tgz">https://registry.npmjs.org/node-notifier/-/node-notifier-6.0.0.tgz</a></p> <p>Path to dependency file: /package.json</p> <p>Path to vulnerable library: /node_modules/terra-toolkit/node_modules/node-notifier/package.json</p> <p> Dependency Hierarchy: - terra-toolkit-6.16.0.tgz (Root Library) - reporters-25.5.1.tgz - :x: **node-notifier-6.0.0.tgz** (Vulnerable Library) <p>Found in HEAD commit: <a href="https://github.com/TreyM-WSS/terra-clinical/commit/696724f55dd1815ae248f4f5b6611cb6c5c68606">696724f55dd1815ae248f4f5b6611cb6c5c68606</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary> <p> This affects the package node-notifier before 9.0.0. It allows an attacker to run arbitrary commands on Linux machines due to the options params not being sanitised when being passed an array. 
<p>Publish Date: 2020-12-11 <p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-7789>CVE-2020-7789</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>5.6</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: High - Privileges Required: None - User Interaction: None - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: Low - Integrity Impact: Low - Availability Impact: Low </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2020-7789">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2020-7789</a></p> <p>Release Date: 2020-12-11</p> <p>Fix Resolution: 9.0.0</p> </p> </details> <p></p>
True
CVE-2020-7789 (Medium) detected in node-notifier-6.0.0.tgz - ## CVE-2020-7789 - Medium Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>node-notifier-6.0.0.tgz</b></p></summary> <p>A Node.js module for sending notifications on native Mac, Windows (post and pre 8) and Linux (or Growl as fallback)</p> <p>Library home page: <a href="https://registry.npmjs.org/node-notifier/-/node-notifier-6.0.0.tgz">https://registry.npmjs.org/node-notifier/-/node-notifier-6.0.0.tgz</a></p> <p>Path to dependency file: /package.json</p> <p>Path to vulnerable library: /node_modules/terra-toolkit/node_modules/node-notifier/package.json</p> <p> Dependency Hierarchy: - terra-toolkit-6.16.0.tgz (Root Library) - reporters-25.5.1.tgz - :x: **node-notifier-6.0.0.tgz** (Vulnerable Library) <p>Found in HEAD commit: <a href="https://github.com/TreyM-WSS/terra-clinical/commit/696724f55dd1815ae248f4f5b6611cb6c5c68606">696724f55dd1815ae248f4f5b6611cb6c5c68606</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary> <p> This affects the package node-notifier before 9.0.0. It allows an attacker to run arbitrary commands on Linux machines due to the options params not being sanitised when being passed an array. 
<p>Publish Date: 2020-12-11 <p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-7789>CVE-2020-7789</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>5.6</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: High - Privileges Required: None - User Interaction: None - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: Low - Integrity Impact: Low - Availability Impact: Low </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2020-7789">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2020-7789</a></p> <p>Release Date: 2020-12-11</p> <p>Fix Resolution: 9.0.0</p> </p> </details> <p></p>
non_process
cve medium detected in node notifier tgz cve medium severity vulnerability vulnerable library node notifier tgz a node js module for sending notifications on native mac windows post and pre and linux or growl as fallback library home page a href path to dependency file package json path to vulnerable library node modules terra toolkit node modules node notifier package json dependency hierarchy terra toolkit tgz root library reporters tgz x node notifier tgz vulnerable library found in head commit a href vulnerability details this affects the package node notifier before it allows an attacker to run arbitrary commands on linux machines due to the options params not being sanitised when being passed an array publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity high privileges required none user interaction none scope unchanged impact metrics confidentiality impact low integrity impact low availability impact low for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution
0
12,516
9,811,120,716
IssuesEvent
2019-06-12 22:26:02
dotnet/core-setup
https://api.github.com/repos/dotnet/core-setup
closed
ToolsError build failures in 'prodcon/cli/release/2.2.1xx/' - '20190214.01'
area-Host area-Infrastructure
@robertborr commented on [Tue Feb 19 2019](https://github.com/dotnet/cli/issues/10832) @dotnet-mc-bot commented on [Fri Feb 15 2019](https://github.com/dotnet/core-eng/issues/5249) There were a set of failures during this build. Here is a summary of these: * https://devdiv.visualstudio.com/DefaultCollection/DevDiv/_build?_a=summary&buildId=2420496 - **Agent:** DDVSOMACAGE009 - **Error log:** Total tests: 2. Passed: 0. Failed: 0. Skipped: 2.... @robertborr --- @robertborr commented on [Tue Feb 19 2019](https://github.com/dotnet/core-eng/issues/5249#issuecomment-465236089) 15>RunTest: Executing - /Users/buildagent/agent/_work/455/s/bin/2/osx-x64/dotnet/dotnet restore --disable-parallel - in pwd /Users/buildagent/agent/_work/455/s/bin/3/osx-x64/test/dotnet-list-package.Tests/ItDoesNotAcceptInvalidFramework/MSBuildAppWithMultipleFrameworks 15>RunTest: VSTest: Exit code: 0 15>RunTest: [xUnit.net 00:01:15.1643891] Microsoft.DotNet.Cli.List.Package.Tests.GivenDotnetListPackage.AssetsPathExistsButNotRestored [FAIL] Failed Microsoft.DotNet.Cli.List.Package.Tests.GivenDotnetListPackage.AssetsPathExistsButNotRestored 15>DOTNETTEST : error Message: [/Users/buildagent/agent/_work/455/s/build/test/RunTest.proj] Expected command to pass but it did not. 
File Name: /Users/buildagent/agent/_work/455/s/bin/2/osx-x64/dotnet/dotnet Arguments: list /Users/buildagent/agent/_work/455/s/bin/3/osx-x64/test/dotnet-list-package.Tests/AssetsPathExistsButNotRestored/NewtonSoftDependentProject package WorkingDir:: Exit Code: 139 StdOut: StdErr: Stack Trace: at FluentAssertions.Execution.XUnit2TestFramework.Throw(String message) at FluentAssertions.Execution.AssertionScope.FailWith(String message, Object[] args) at Microsoft.DotNet.Tools.Test.Utilities.CommandResultAssertions.Pass() in /Users/buildagent/agent/_work/455/s/test/Microsoft.DotNet.Tools.Tests.Utilities/Assertions/CommandResultAssertions.cs:line 32 at Microsoft.DotNet.Cli.List.Package.Tests.GivenDotnetListPackage.AssetsPathExistsButNotRestored() in /Users/buildagent/agent/_work/455/s/test/dotnet-list-package.Tests/GivenDotnetListPackage.cs:line 120 Testing on the build machine is advised against, this test had a segmentation fault and should be investigated by the test owner. --- @livarcocc commented on [Tue Feb 19 2019](https://github.com/dotnet/cli/issues/10832#issuecomment-465268656) This is a seg fault. The CLI is purely managed. So, this must be the host crashing. Please move the issue to core-setup.
1.0
ToolsError build failures in 'prodcon/cli/release/2.2.1xx/' - '20190214.01' - @robertborr commented on [Tue Feb 19 2019](https://github.com/dotnet/cli/issues/10832) @dotnet-mc-bot commented on [Fri Feb 15 2019](https://github.com/dotnet/core-eng/issues/5249) There were a set of failures during this build. Here is a summary of these: * https://devdiv.visualstudio.com/DefaultCollection/DevDiv/_build?_a=summary&buildId=2420496 - **Agent:** DDVSOMACAGE009 - **Error log:** Total tests: 2. Passed: 0. Failed: 0. Skipped: 2.... @robertborr --- @robertborr commented on [Tue Feb 19 2019](https://github.com/dotnet/core-eng/issues/5249#issuecomment-465236089) 15>RunTest: Executing - /Users/buildagent/agent/_work/455/s/bin/2/osx-x64/dotnet/dotnet restore --disable-parallel - in pwd /Users/buildagent/agent/_work/455/s/bin/3/osx-x64/test/dotnet-list-package.Tests/ItDoesNotAcceptInvalidFramework/MSBuildAppWithMultipleFrameworks 15>RunTest: VSTest: Exit code: 0 15>RunTest: [xUnit.net 00:01:15.1643891] Microsoft.DotNet.Cli.List.Package.Tests.GivenDotnetListPackage.AssetsPathExistsButNotRestored [FAIL] Failed Microsoft.DotNet.Cli.List.Package.Tests.GivenDotnetListPackage.AssetsPathExistsButNotRestored 15>DOTNETTEST : error Message: [/Users/buildagent/agent/_work/455/s/build/test/RunTest.proj] Expected command to pass but it did not. 
File Name: /Users/buildagent/agent/_work/455/s/bin/2/osx-x64/dotnet/dotnet Arguments: list /Users/buildagent/agent/_work/455/s/bin/3/osx-x64/test/dotnet-list-package.Tests/AssetsPathExistsButNotRestored/NewtonSoftDependentProject package WorkingDir:: Exit Code: 139 StdOut: StdErr: Stack Trace: at FluentAssertions.Execution.XUnit2TestFramework.Throw(String message) at FluentAssertions.Execution.AssertionScope.FailWith(String message, Object[] args) at Microsoft.DotNet.Tools.Test.Utilities.CommandResultAssertions.Pass() in /Users/buildagent/agent/_work/455/s/test/Microsoft.DotNet.Tools.Tests.Utilities/Assertions/CommandResultAssertions.cs:line 32 at Microsoft.DotNet.Cli.List.Package.Tests.GivenDotnetListPackage.AssetsPathExistsButNotRestored() in /Users/buildagent/agent/_work/455/s/test/dotnet-list-package.Tests/GivenDotnetListPackage.cs:line 120 Testing on the build machine is advised against, this test had a segmentation fault and should be investigated by the test owner. --- @livarcocc commented on [Tue Feb 19 2019](https://github.com/dotnet/cli/issues/10832#issuecomment-465268656) This is a seg fault. The CLI is purely managed. So, this must be the host crashing. Please move the issue to core-setup.
non_process
toolserror build failures in prodcon cli release robertborr commented on dotnet mc bot commented on there were a set of failures during this build here is a summary of these agent error log total tests passed failed skipped robertborr robertborr commented on runtest executing users buildagent agent work s bin osx dotnet dotnet restore disable parallel in pwd users buildagent agent work s bin osx test dotnet list package tests itdoesnotacceptinvalidframework msbuildappwithmultipleframeworks runtest vstest exit code runtest microsoft dotnet cli list package tests givendotnetlistpackage assetspathexistsbutnotrestored failed microsoft dotnet cli list package tests givendotnetlistpackage assetspathexistsbutnotrestored dotnettest error message expected command to pass but it did not file name users buildagent agent work s bin osx dotnet dotnet arguments list users buildagent agent work s bin osx test dotnet list package tests assetspathexistsbutnotrestored newtonsoftdependentproject package workingdir exit code stdout stderr stack trace at fluentassertions execution throw string message at fluentassertions execution assertionscope failwith string message object args at microsoft dotnet tools test utilities commandresultassertions pass in users buildagent agent work s test microsoft dotnet tools tests utilities assertions commandresultassertions cs line at microsoft dotnet cli list package tests givendotnetlistpackage assetspathexistsbutnotrestored in users buildagent agent work s test dotnet list package tests givendotnetlistpackage cs line testing on the build machine is advised against this test had a segmentation fault and should be investigated by the test owner livarcocc commented on this is a seg fault the cli is purely managed so this must be the host crashing please move the issue to core setup
0
240
2,664,624,975
IssuesEvent
2015-03-20 15:36:48
AnalyticalGraphicsInc/cesium
https://api.github.com/repos/AnalyticalGraphicsInc/cesium
opened
Pro-active dead link checking
dev process doc
We should look into adding a build target to use a tool that checks the documentation for bad links. We can run this on the generated documentation rather than the source. Here's an example of one such project: https://github.com/wummel/linkchecker but I'm sure there are others that might better suit our current toolsets.
1.0
Pro-active dead link checking - We should look into adding a build target to use a tool that checks the documentation for bad links. We can run this on the generated documentation rather than the source. Here's an example of one such project: https://github.com/wummel/linkchecker but I'm sure there are others that might better suit our current toolsets.
process
pro active dead link checking we should look into adding a build target to use a tool that checks the documentation for bad links we can run this on the generated documentation rather than the source here s an example of one such project but i m sure there are others that might better suit our current toolsets
1
3,011
6,016,610,437
IssuesEvent
2017-06-07 07:30:23
openvstorage/framework
https://api.github.com/repos/openvstorage/framework
closed
Allow a different nsm maxload for fragment/block namespaces
process_wontfix type_enhancement
Currently the limit of namespaces per NSM cluster is 75. For block cache and fragment cache namespaces this limit could be higher as they typically contain fewer fragments.
1.0
Allow a different nsm maxload for fragment/block namespaces - Currently the limit of namespaces per NSM cluster is 75. For block cache and fragment cache namespaces this limit could be higher as they typically contain fewer fragments.
process
allow a different nsm maxload for fragment block namespaces currently the limit of namespaces per nsm cluster is for block cache and fragment cache namespaces this limit could be higher as they typically contain fewer fragments
1
57,274
6,542,136,529
IssuesEvent
2017-09-02 01:06:19
freeCodeCamp/freeCodeCamp
https://api.github.com/repos/freeCodeCamp/freeCodeCamp
closed
Tests now include tails
discussing tests
I'm working on a PR https://github.com/FreeCodeCamp/FreeCodeCamp/pull/11987 but the third test I'm adding (using `code.match()` to test for the number of times `quotient` is used in the editor will not work because the tests now incorporate the tail as referenced in #10258 . If I write the test to include the tails (`length === 3` instead of `length === 1`) then `npm run test-challenges` won't work because the solution only uses `quotient` once. So it would appear the only option is to remove the solution provided in the JSON and account for the tail. @BerkeleyTrue is that the right approach given tails are now included in the tests?
1.0
Tests now include tails - I'm working on a PR https://github.com/FreeCodeCamp/FreeCodeCamp/pull/11987 but the third test I'm adding (using `code.match()` to test for the number of times `quotient` is used in the editor will not work because the tests now incorporate the tail as referenced in #10258 . If I write the test to include the tails (`length === 3` instead of `length === 1`) then `npm run test-challenges` won't work because the solution only uses `quotient` once. So it would appear the only option is to remove the solution provided in the JSON and account for the tail. @BerkeleyTrue is that the right approach given tails are now included in the tests?
non_process
tests now include tails i m working on a pr but the third test i m adding using code match to test for the number of times quotient is used in the editor will not work because the tests now incorporate the tail as referenced in if i write the test to include the tails length instead of length then npm run test challenges won t work because the solution only uses quotient once so it would appear the only option is to remove the solution provided in the json and account for the tail berkeleytrue is that the right approach given tails are now included in the tests
0
309,833
26,679,658,680
IssuesEvent
2023-01-26 16:41:43
cockroachdb/cockroach
https://api.github.com/repos/cockroachdb/cockroach
closed
pkg/ccl/workloadccl/workloadccl_test: TestImportFixture failed
C-test-failure O-robot branch-master T-testeng
pkg/ccl/workloadccl/workloadccl_test.TestImportFixture [failed](https://teamcity.cockroachdb.com/buildConfiguration/Cockroach_Nightlies_StressBazel/8456234?buildTab=log) with [artifacts](https://teamcity.cockroachdb.com/buildConfiguration/Cockroach_Nightlies_StressBazel/8456234?buildTab=artifacts#/) on master @ [2ad8df3df3272110705984efc32f1453631ce602](https://github.com/cockroachdb/cockroach/commits/2ad8df3df3272110705984efc32f1453631ce602): ``` github.com/cockroachdb/cockroach/pkg/sql/create_stats.go:698 +0x19a github.com/cockroachdb/cockroach/pkg/sql.(*createStatsNode).startJob() github.com/cockroachdb/cockroach/pkg/sql/create_stats.go:153 +0x30e github.com/cockroachdb/cockroach/pkg/sql.(*createStatsNode).startExec.func1() github.com/cockroachdb/cockroach/pkg/sql/create_stats.go:116 +0x86 Goroutine 289 (running) created at: github.com/cockroachdb/cockroach/pkg/util/stop.(*Stopper).RunAsyncTaskEx() github.com/cockroachdb/cockroach/pkg/util/stop/stopper.go:461 +0x619 github.com/cockroachdb/cockroach/pkg/util/stop.(*Stopper).RunAsyncTask() github.com/cockroachdb/cockroach/pkg/util/stop/stopper.go:332 +0x1cb github.com/cockroachdb/cockroach/pkg/kv/kvclient/kvcoord.GRPCTransportFactory() github.com/cockroachdb/cockroach/pkg/kv/kvclient/kvcoord/transport_race.go:98 +0x161 github.com/cockroachdb/cockroach/pkg/kv/kvclient/kvcoord.(*DistSender).sendToReplicas() github.com/cockroachdb/cockroach/pkg/kv/kvclient/kvcoord/dist_sender.go:2060 +0xd0d github.com/cockroachdb/cockroach/pkg/kv/kvclient/kvcoord.(*DistSender).sendPartialBatch() github.com/cockroachdb/cockroach/pkg/kv/kvclient/kvcoord/dist_sender.go:1668 +0xa44 github.com/cockroachdb/cockroach/pkg/kv/kvclient/kvcoord.(*DistSender).divideAndSendBatchToRanges() github.com/cockroachdb/cockroach/pkg/kv/kvclient/kvcoord/dist_sender.go:1240 +0x592 github.com/cockroachdb/cockroach/pkg/kv/kvclient/kvcoord.(*RangeIterator).Seek() github.com/cockroachdb/cockroach/pkg/kv/kvclient/kvcoord/range_iter.go:208 +0x73a 
github.com/cockroachdb/cockroach/pkg/kv/kvclient/kvcoord.(*DistSender).divideAndSendBatchToRanges() github.com/cockroachdb/cockroach/pkg/kv/kvclient/kvcoord/dist_sender.go:1234 +0x2b7 github.com/cockroachdb/cockroach/pkg/kv/kvclient/kvcoord.(*DistSender).Send() github.com/cockroachdb/cockroach/pkg/kv/kvclient/kvcoord/dist_sender.go:861 +0xa59 github.com/cockroachdb/cockroach/pkg/kv.lookupRangeFwdScan() github.com/cockroachdb/cockroach/pkg/kv/range_lookup.go:330 +0x832 github.com/cockroachdb/cockroach/pkg/kv.RangeLookup() github.com/cockroachdb/cockroach/pkg/kv/range_lookup.go:205 +0x315 github.com/cockroachdb/cockroach/pkg/kv/kvclient/kvcoord.(*DistSender).RangeLookup() github.com/cockroachdb/cockroach/pkg/kv/kvclient/kvcoord/dist_sender.go:570 +0x128 github.com/cockroachdb/cockroach/pkg/kv/kvclient/rangecache.(*RangeCache).performRangeLookup() github.com/cockroachdb/cockroach/pkg/kv/kvclient/rangecache/range_cache.go:1032 +0x3fe github.com/cockroachdb/cockroach/pkg/kv/kvclient/rangecache.tryLookupImpl.func1() github.com/cockroachdb/cockroach/pkg/kv/kvclient/rangecache/range_cache.go:920 +0xc5 github.com/cockroachdb/cockroach/pkg/util/contextutil.RunWithTimeout() github.com/cockroachdb/cockroach/pkg/util/contextutil/context.go:104 +0x1a9 github.com/cockroachdb/cockroach/pkg/kv/kvclient/rangecache.tryLookupImpl() github.com/cockroachdb/cockroach/pkg/kv/kvclient/rangecache/range_cache.go:917 +0x1a8 github.com/cockroachdb/cockroach/pkg/kv/kvclient/rangecache.(*RangeCache).tryLookup.func3() github.com/cockroachdb/cockroach/pkg/kv/kvclient/rangecache/range_cache.go:815 +0xd9 github.com/cockroachdb/cockroach/pkg/util/syncutil/singleflight.(*Group).doCall.func1() github.com/cockroachdb/cockroach/pkg/util/syncutil/singleflight/singleflight.go:387 +0x51 github.com/cockroachdb/cockroach/pkg/util/stop.(*Stopper).RunTask() github.com/cockroachdb/cockroach/pkg/util/stop/stopper.go:305 +0x147 github.com/cockroachdb/cockroach/pkg/util/syncutil/singleflight.(*Group).doCall() 
github.com/cockroachdb/cockroach/pkg/util/syncutil/singleflight/singleflight.go:386 +0x2a4 github.com/cockroachdb/cockroach/pkg/util/syncutil/singleflight.(*Group).DoChan.func1() github.com/cockroachdb/cockroach/pkg/util/syncutil/singleflight/singleflight.go:356 +0xd0 ================== ``` <p>Parameters: <code>TAGS=bazel,gss,race</code> </p> <details><summary>Help</summary> <p> See also: [How To Investigate a Go Test Failure \(internal\)](https://cockroachlabs.atlassian.net/l/c/HgfXfJgM) </p> </details> /cc @cockroachdb/test-eng @cockroachdb/sql-sessions <sub> [This test on roachdash](https://roachdash.crdb.dev/?filter=status:open%20t:.*TestImportFixture.*&sort=title+created&display=lastcommented+project) | [Improve this report!](https://github.com/cockroachdb/cockroach/tree/master/pkg/cmd/internal/issues) </sub> Jira issue: CRDB-23828
2.0
pkg/ccl/workloadccl/workloadccl_test: TestImportFixture failed - pkg/ccl/workloadccl/workloadccl_test.TestImportFixture [failed](https://teamcity.cockroachdb.com/buildConfiguration/Cockroach_Nightlies_StressBazel/8456234?buildTab=log) with [artifacts](https://teamcity.cockroachdb.com/buildConfiguration/Cockroach_Nightlies_StressBazel/8456234?buildTab=artifacts#/) on master @ [2ad8df3df3272110705984efc32f1453631ce602](https://github.com/cockroachdb/cockroach/commits/2ad8df3df3272110705984efc32f1453631ce602): ``` github.com/cockroachdb/cockroach/pkg/sql/create_stats.go:698 +0x19a github.com/cockroachdb/cockroach/pkg/sql.(*createStatsNode).startJob() github.com/cockroachdb/cockroach/pkg/sql/create_stats.go:153 +0x30e github.com/cockroachdb/cockroach/pkg/sql.(*createStatsNode).startExec.func1() github.com/cockroachdb/cockroach/pkg/sql/create_stats.go:116 +0x86 Goroutine 289 (running) created at: github.com/cockroachdb/cockroach/pkg/util/stop.(*Stopper).RunAsyncTaskEx() github.com/cockroachdb/cockroach/pkg/util/stop/stopper.go:461 +0x619 github.com/cockroachdb/cockroach/pkg/util/stop.(*Stopper).RunAsyncTask() github.com/cockroachdb/cockroach/pkg/util/stop/stopper.go:332 +0x1cb github.com/cockroachdb/cockroach/pkg/kv/kvclient/kvcoord.GRPCTransportFactory() github.com/cockroachdb/cockroach/pkg/kv/kvclient/kvcoord/transport_race.go:98 +0x161 github.com/cockroachdb/cockroach/pkg/kv/kvclient/kvcoord.(*DistSender).sendToReplicas() github.com/cockroachdb/cockroach/pkg/kv/kvclient/kvcoord/dist_sender.go:2060 +0xd0d github.com/cockroachdb/cockroach/pkg/kv/kvclient/kvcoord.(*DistSender).sendPartialBatch() github.com/cockroachdb/cockroach/pkg/kv/kvclient/kvcoord/dist_sender.go:1668 +0xa44 github.com/cockroachdb/cockroach/pkg/kv/kvclient/kvcoord.(*DistSender).divideAndSendBatchToRanges() github.com/cockroachdb/cockroach/pkg/kv/kvclient/kvcoord/dist_sender.go:1240 +0x592 github.com/cockroachdb/cockroach/pkg/kv/kvclient/kvcoord.(*RangeIterator).Seek() 
github.com/cockroachdb/cockroach/pkg/kv/kvclient/kvcoord/range_iter.go:208 +0x73a github.com/cockroachdb/cockroach/pkg/kv/kvclient/kvcoord.(*DistSender).divideAndSendBatchToRanges() github.com/cockroachdb/cockroach/pkg/kv/kvclient/kvcoord/dist_sender.go:1234 +0x2b7 github.com/cockroachdb/cockroach/pkg/kv/kvclient/kvcoord.(*DistSender).Send() github.com/cockroachdb/cockroach/pkg/kv/kvclient/kvcoord/dist_sender.go:861 +0xa59 github.com/cockroachdb/cockroach/pkg/kv.lookupRangeFwdScan() github.com/cockroachdb/cockroach/pkg/kv/range_lookup.go:330 +0x832 github.com/cockroachdb/cockroach/pkg/kv.RangeLookup() github.com/cockroachdb/cockroach/pkg/kv/range_lookup.go:205 +0x315 github.com/cockroachdb/cockroach/pkg/kv/kvclient/kvcoord.(*DistSender).RangeLookup() github.com/cockroachdb/cockroach/pkg/kv/kvclient/kvcoord/dist_sender.go:570 +0x128 github.com/cockroachdb/cockroach/pkg/kv/kvclient/rangecache.(*RangeCache).performRangeLookup() github.com/cockroachdb/cockroach/pkg/kv/kvclient/rangecache/range_cache.go:1032 +0x3fe github.com/cockroachdb/cockroach/pkg/kv/kvclient/rangecache.tryLookupImpl.func1() github.com/cockroachdb/cockroach/pkg/kv/kvclient/rangecache/range_cache.go:920 +0xc5 github.com/cockroachdb/cockroach/pkg/util/contextutil.RunWithTimeout() github.com/cockroachdb/cockroach/pkg/util/contextutil/context.go:104 +0x1a9 github.com/cockroachdb/cockroach/pkg/kv/kvclient/rangecache.tryLookupImpl() github.com/cockroachdb/cockroach/pkg/kv/kvclient/rangecache/range_cache.go:917 +0x1a8 github.com/cockroachdb/cockroach/pkg/kv/kvclient/rangecache.(*RangeCache).tryLookup.func3() github.com/cockroachdb/cockroach/pkg/kv/kvclient/rangecache/range_cache.go:815 +0xd9 github.com/cockroachdb/cockroach/pkg/util/syncutil/singleflight.(*Group).doCall.func1() github.com/cockroachdb/cockroach/pkg/util/syncutil/singleflight/singleflight.go:387 +0x51 github.com/cockroachdb/cockroach/pkg/util/stop.(*Stopper).RunTask() github.com/cockroachdb/cockroach/pkg/util/stop/stopper.go:305 +0x147 
github.com/cockroachdb/cockroach/pkg/util/syncutil/singleflight.(*Group).doCall() github.com/cockroachdb/cockroach/pkg/util/syncutil/singleflight/singleflight.go:386 +0x2a4 github.com/cockroachdb/cockroach/pkg/util/syncutil/singleflight.(*Group).DoChan.func1() github.com/cockroachdb/cockroach/pkg/util/syncutil/singleflight/singleflight.go:356 +0xd0 ================== ``` <p>Parameters: <code>TAGS=bazel,gss,race</code> </p> <details><summary>Help</summary> <p> See also: [How To Investigate a Go Test Failure \(internal\)](https://cockroachlabs.atlassian.net/l/c/HgfXfJgM) </p> </details> /cc @cockroachdb/test-eng @cockroachdb/sql-sessions <sub> [This test on roachdash](https://roachdash.crdb.dev/?filter=status:open%20t:.*TestImportFixture.*&sort=title+created&display=lastcommented+project) | [Improve this report!](https://github.com/cockroachdb/cockroach/tree/master/pkg/cmd/internal/issues) </sub> Jira issue: CRDB-23828
non_process
pkg ccl workloadccl workloadccl test testimportfixture failed pkg ccl workloadccl workloadccl test testimportfixture with on master github com cockroachdb cockroach pkg sql create stats go github com cockroachdb cockroach pkg sql createstatsnode startjob github com cockroachdb cockroach pkg sql create stats go github com cockroachdb cockroach pkg sql createstatsnode startexec github com cockroachdb cockroach pkg sql create stats go goroutine running created at github com cockroachdb cockroach pkg util stop stopper runasynctaskex github com cockroachdb cockroach pkg util stop stopper go github com cockroachdb cockroach pkg util stop stopper runasynctask github com cockroachdb cockroach pkg util stop stopper go github com cockroachdb cockroach pkg kv kvclient kvcoord grpctransportfactory github com cockroachdb cockroach pkg kv kvclient kvcoord transport race go github com cockroachdb cockroach pkg kv kvclient kvcoord distsender sendtoreplicas github com cockroachdb cockroach pkg kv kvclient kvcoord dist sender go github com cockroachdb cockroach pkg kv kvclient kvcoord distsender sendpartialbatch github com cockroachdb cockroach pkg kv kvclient kvcoord dist sender go github com cockroachdb cockroach pkg kv kvclient kvcoord distsender divideandsendbatchtoranges github com cockroachdb cockroach pkg kv kvclient kvcoord dist sender go github com cockroachdb cockroach pkg kv kvclient kvcoord rangeiterator seek github com cockroachdb cockroach pkg kv kvclient kvcoord range iter go github com cockroachdb cockroach pkg kv kvclient kvcoord distsender divideandsendbatchtoranges github com cockroachdb cockroach pkg kv kvclient kvcoord dist sender go github com cockroachdb cockroach pkg kv kvclient kvcoord distsender send github com cockroachdb cockroach pkg kv kvclient kvcoord dist sender go github com cockroachdb cockroach pkg kv lookuprangefwdscan github com cockroachdb cockroach pkg kv range lookup go github com cockroachdb cockroach pkg kv rangelookup github com cockroachdb 
cockroach pkg kv range lookup go github com cockroachdb cockroach pkg kv kvclient kvcoord distsender rangelookup github com cockroachdb cockroach pkg kv kvclient kvcoord dist sender go github com cockroachdb cockroach pkg kv kvclient rangecache rangecache performrangelookup github com cockroachdb cockroach pkg kv kvclient rangecache range cache go github com cockroachdb cockroach pkg kv kvclient rangecache trylookupimpl github com cockroachdb cockroach pkg kv kvclient rangecache range cache go github com cockroachdb cockroach pkg util contextutil runwithtimeout github com cockroachdb cockroach pkg util contextutil context go github com cockroachdb cockroach pkg kv kvclient rangecache trylookupimpl github com cockroachdb cockroach pkg kv kvclient rangecache range cache go github com cockroachdb cockroach pkg kv kvclient rangecache rangecache trylookup github com cockroachdb cockroach pkg kv kvclient rangecache range cache go github com cockroachdb cockroach pkg util syncutil singleflight group docall github com cockroachdb cockroach pkg util syncutil singleflight singleflight go github com cockroachdb cockroach pkg util stop stopper runtask github com cockroachdb cockroach pkg util stop stopper go github com cockroachdb cockroach pkg util syncutil singleflight group docall github com cockroachdb cockroach pkg util syncutil singleflight singleflight go github com cockroachdb cockroach pkg util syncutil singleflight group dochan github com cockroachdb cockroach pkg util syncutil singleflight singleflight go parameters tags bazel gss race help see also cc cockroachdb test eng cockroachdb sql sessions jira issue crdb
0
15,679
19,847,723,885
IssuesEvent
2022-01-21 08:48:15
ooi-data/RS03INT2-MJ03D-06-BOTPTA303-streamed-botpt_nano_sample_15s
https://api.github.com/repos/ooi-data/RS03INT2-MJ03D-06-BOTPTA303-streamed-botpt_nano_sample_15s
opened
🛑 Processing failed: InvalidIndexError
process
## Overview `InvalidIndexError` found in `processing_task` task during run ended on 2022-01-21T08:48:14.479559. ## Details Flow name: `RS03INT2-MJ03D-06-BOTPTA303-streamed-botpt_nano_sample_15s` Task name: `processing_task` Error type: `InvalidIndexError` Error message: Reindexing only valid with uniquely valued Index objects <details> <summary>Traceback</summary> ``` Traceback (most recent call last): File "/srv/conda/envs/notebook/lib/python3.9/site-packages/ooi_harvester/processor/pipeline.py", line 165, in processing final_path = finalize_data_stream( File "/srv/conda/envs/notebook/lib/python3.9/site-packages/ooi_harvester/processor/__init__.py", line 84, in finalize_data_stream append_to_zarr(mod_ds, final_store, enc, logger=logger) File "/srv/conda/envs/notebook/lib/python3.9/site-packages/ooi_harvester/processor/__init__.py", line 357, in append_to_zarr _append_zarr(store, mod_ds) File "/srv/conda/envs/notebook/lib/python3.9/site-packages/ooi_harvester/processor/utils.py", line 182, in _append_zarr ds_to_append = ds_to_append.drop_isel({append_dim: 0}) File "/srv/conda/envs/notebook/lib/python3.9/site-packages/xarray/core/dataset.py", line 4563, in drop_isel ds = ds.loc[dimension_index] File "/srv/conda/envs/notebook/lib/python3.9/site-packages/xarray/core/dataset.py", line 563, in __getitem__ return self.dataset.sel(key) File "/srv/conda/envs/notebook/lib/python3.9/site-packages/xarray/core/dataset.py", line 2504, in sel pos_indexers, new_indexes = remap_label_indexers( File "/srv/conda/envs/notebook/lib/python3.9/site-packages/xarray/core/coordinates.py", line 421, in remap_label_indexers pos_indexers, new_indexes = indexing.remap_label_indexers( File "/srv/conda/envs/notebook/lib/python3.9/site-packages/xarray/core/indexing.py", line 120, in remap_label_indexers idxr, new_idx = index.query(labels, method=method, tolerance=tolerance) File "/srv/conda/envs/notebook/lib/python3.9/site-packages/xarray/core/indexes.py", line 240, in query indexer = 
get_indexer_nd(self.index, label, method, tolerance) File "/srv/conda/envs/notebook/lib/python3.9/site-packages/xarray/core/indexes.py", line 142, in get_indexer_nd flat_indexer = index.get_indexer(flat_labels, method=method, tolerance=tolerance) File "/srv/conda/envs/notebook/lib/python3.9/site-packages/pandas/core/indexes/base.py", line 3442, in get_indexer raise InvalidIndexError(self._requires_unique_msg) pandas.errors.InvalidIndexError: Reindexing only valid with uniquely valued Index objects ``` </details>
1.0
🛑 Processing failed: InvalidIndexError - ## Overview `InvalidIndexError` found in `processing_task` task during run ended on 2022-01-21T08:48:14.479559. ## Details Flow name: `RS03INT2-MJ03D-06-BOTPTA303-streamed-botpt_nano_sample_15s` Task name: `processing_task` Error type: `InvalidIndexError` Error message: Reindexing only valid with uniquely valued Index objects <details> <summary>Traceback</summary> ``` Traceback (most recent call last): File "/srv/conda/envs/notebook/lib/python3.9/site-packages/ooi_harvester/processor/pipeline.py", line 165, in processing final_path = finalize_data_stream( File "/srv/conda/envs/notebook/lib/python3.9/site-packages/ooi_harvester/processor/__init__.py", line 84, in finalize_data_stream append_to_zarr(mod_ds, final_store, enc, logger=logger) File "/srv/conda/envs/notebook/lib/python3.9/site-packages/ooi_harvester/processor/__init__.py", line 357, in append_to_zarr _append_zarr(store, mod_ds) File "/srv/conda/envs/notebook/lib/python3.9/site-packages/ooi_harvester/processor/utils.py", line 182, in _append_zarr ds_to_append = ds_to_append.drop_isel({append_dim: 0}) File "/srv/conda/envs/notebook/lib/python3.9/site-packages/xarray/core/dataset.py", line 4563, in drop_isel ds = ds.loc[dimension_index] File "/srv/conda/envs/notebook/lib/python3.9/site-packages/xarray/core/dataset.py", line 563, in __getitem__ return self.dataset.sel(key) File "/srv/conda/envs/notebook/lib/python3.9/site-packages/xarray/core/dataset.py", line 2504, in sel pos_indexers, new_indexes = remap_label_indexers( File "/srv/conda/envs/notebook/lib/python3.9/site-packages/xarray/core/coordinates.py", line 421, in remap_label_indexers pos_indexers, new_indexes = indexing.remap_label_indexers( File "/srv/conda/envs/notebook/lib/python3.9/site-packages/xarray/core/indexing.py", line 120, in remap_label_indexers idxr, new_idx = index.query(labels, method=method, tolerance=tolerance) File "/srv/conda/envs/notebook/lib/python3.9/site-packages/xarray/core/indexes.py", 
line 240, in query indexer = get_indexer_nd(self.index, label, method, tolerance) File "/srv/conda/envs/notebook/lib/python3.9/site-packages/xarray/core/indexes.py", line 142, in get_indexer_nd flat_indexer = index.get_indexer(flat_labels, method=method, tolerance=tolerance) File "/srv/conda/envs/notebook/lib/python3.9/site-packages/pandas/core/indexes/base.py", line 3442, in get_indexer raise InvalidIndexError(self._requires_unique_msg) pandas.errors.InvalidIndexError: Reindexing only valid with uniquely valued Index objects ``` </details>
process
🛑 processing failed invalidindexerror overview invalidindexerror found in processing task task during run ended on details flow name streamed botpt nano sample task name processing task error type invalidindexerror error message reindexing only valid with uniquely valued index objects traceback traceback most recent call last file srv conda envs notebook lib site packages ooi harvester processor pipeline py line in processing final path finalize data stream file srv conda envs notebook lib site packages ooi harvester processor init py line in finalize data stream append to zarr mod ds final store enc logger logger file srv conda envs notebook lib site packages ooi harvester processor init py line in append to zarr append zarr store mod ds file srv conda envs notebook lib site packages ooi harvester processor utils py line in append zarr ds to append ds to append drop isel append dim file srv conda envs notebook lib site packages xarray core dataset py line in drop isel ds ds loc file srv conda envs notebook lib site packages xarray core dataset py line in getitem return self dataset sel key file srv conda envs notebook lib site packages xarray core dataset py line in sel pos indexers new indexes remap label indexers file srv conda envs notebook lib site packages xarray core coordinates py line in remap label indexers pos indexers new indexes indexing remap label indexers file srv conda envs notebook lib site packages xarray core indexing py line in remap label indexers idxr new idx index query labels method method tolerance tolerance file srv conda envs notebook lib site packages xarray core indexes py line in query indexer get indexer nd self index label method tolerance file srv conda envs notebook lib site packages xarray core indexes py line in get indexer nd flat indexer index get indexer flat labels method method tolerance tolerance file srv conda envs notebook lib site packages pandas core indexes base py line in get indexer raise invalidindexerror self 
requires unique msg pandas errors invalidindexerror reindexing only valid with uniquely valued index objects
1
237,063
18,151,677,402
IssuesEvent
2021-09-26 11:23:48
girlscript/winter-of-contributing
https://api.github.com/repos/girlscript/winter-of-contributing
closed
Modular Arithmetic in C++
documentation GWOC21 Assigned C/CPP
<hr> ## Description 📜 <!-- Please describe the issue in brief. --> I want to Create an Documentation explaining the concepts of Modular Arithmetic in C++ with some code examples too. <hr> ## Domain of Contribution 📊 <!----Please delete options that are not relevant.And in order to tick the check box just but x inside them for example [x] like this-----> - [x] C/CPP <hr>
1.0
Modular Arithmetic in C++ - <hr> ## Description 📜 <!-- Please describe the issue in brief. --> I want to Create an Documentation explaining the concepts of Modular Arithmetic in C++ with some code examples too. <hr> ## Domain of Contribution 📊 <!----Please delete options that are not relevant.And in order to tick the check box just but x inside them for example [x] like this-----> - [x] C/CPP <hr>
non_process
modular arithmetic in c description 📜 i want to create an documentation explaining the concepts of modular arithmetic in c with some code examples too domain of contribution 📊 c cpp
0
6,427
9,530,838,874
IssuesEvent
2019-04-29 14:43:56
codefordenver/org
https://api.github.com/repos/codefordenver/org
closed
Create waffle card bookmarklet
Process Technical
As a member of CfD, I would like a quick way to create a waffle card based on an template, so that my waffle cards are of a consistent format, and I am reminded of useful info to fill out for others.
1.0
Create waffle card bookmarklet - As a member of CfD, I would like a quick way to create a waffle card based on an template, so that my waffle cards are of a consistent format, and I am reminded of useful info to fill out for others.
process
create waffle card bookmarklet as a member of cfd i would like a quick way to create a waffle card based on an template so that my waffle cards are of a consistent format and i am reminded of useful info to fill out for others
1
424,982
29,187,380,129
IssuesEvent
2023-05-19 16:35:40
randombit/botan
https://api.github.com/repos/randombit/botan
closed
Creating RSA keypair and Encrypting/Decrypting
usage question documentation
I have been having trouble locating a definitive guide or code examples to both create a RSA keypair and Encrypt/ decrypt using said keys. My first method was using the given AEAD mode example given in the wiki but when I tried changing the code up to work with an RSA key. I was unable to find documentation or tutorials on how to do so. What I'm attempting to do is: 1. Create a private and public RSA key. 2. Encrypt with public key. 3. Decrypt with private key. One of the few reasonable examples I could find of creating a RSA keypair is below which I then attempt to convert to a string, to then be passed to the encryption and decryption functions: ``` Botan::AutoSeeded_RNG rng; // Random number generator: Botan::RSA_PrivateKey keyPair(rng, 1024); std::string privatekey = Botan::base64_encode(Botan::PKCS8::BER_encode(keyPair)); std::string publickey = Botan::base64_encode(Botan::X509::BER_encode(keyPair)); std::cout << "private key: " << privatekey << std::endl << "public key: " << publickey << std::endl; return {privatekey, publickey}; ``` The encryption algorithm I was originally using was AES-256/OCB. I've looked at the documentation, but it seems to always point me elsewhere. Help with this would be appreciated alongside, some where to find examples of botan-2 aswell.
1.0
Creating RSA keypair and Encrypting/Decrypting - I have been having trouble locating a definitive guide or code examples to both create a RSA keypair and Encrypt/ decrypt using said keys. My first method was using the given AEAD mode example given in the wiki but when I tried changing the code up to work with an RSA key. I was unable to find documentation or tutorials on how to do so. What I'm attempting to do is: 1. Create a private and public RSA key. 2. Encrypt with public key. 3. Decrypt with private key. One of the few reasonable examples I could find of creating a RSA keypair is below which I then attempt to convert to a string, to then be passed to the encryption and decryption functions: ``` Botan::AutoSeeded_RNG rng; // Random number generator: Botan::RSA_PrivateKey keyPair(rng, 1024); std::string privatekey = Botan::base64_encode(Botan::PKCS8::BER_encode(keyPair)); std::string publickey = Botan::base64_encode(Botan::X509::BER_encode(keyPair)); std::cout << "private key: " << privatekey << std::endl << "public key: " << publickey << std::endl; return {privatekey, publickey}; ``` The encryption algorithm I was originally using was AES-256/OCB. I've looked at the documentation, but it seems to always point me elsewhere. Help with this would be appreciated alongside, some where to find examples of botan-2 aswell.
non_process
creating rsa keypair and encrypting decrypting i have been having trouble locating a definitive guide or code examples to both create a rsa keypair and encrypt decrypt using said keys my first method was using the given aead mode example given in the wiki but when i tried changing the code up to work with an rsa key i was unable to find documentation or tutorials on how to do so what i m attempting to do is create a private and public rsa key encrypt with public key decrypt with private key one of the few reasonable examples i could find of creating a rsa keypair is below which i then attempt to convert to a string to then be passed to the encryption and decryption functions botan autoseeded rng rng random number generator botan rsa privatekey keypair rng std string privatekey botan encode botan ber encode keypair std string publickey botan encode botan ber encode keypair std cout private key privatekey std endl public key publickey std endl return privatekey publickey the encryption algorithm i was originally using was aes ocb i ve looked at the documentation but it seems to always point me elsewhere help with this would be appreciated alongside some where to find examples of botan aswell
0
265,258
28,262,384,750
IssuesEvent
2023-04-07 01:16:32
hshivhare67/platform_device_renesas_kernel_v4.19.72
https://api.github.com/repos/hshivhare67/platform_device_renesas_kernel_v4.19.72
closed
CVE-2019-19056 (Medium) detected in linuxlinux-4.19.279 - autoclosed
Mend: dependency security vulnerability
## CVE-2019-19056 - Medium Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>linuxlinux-4.19.279</b></p></summary> <p> <p>The Linux Kernel</p> <p>Library home page: <a href=https://mirrors.edge.kernel.org/pub/linux/kernel/v4.x/?wsslib=linux>https://mirrors.edge.kernel.org/pub/linux/kernel/v4.x/?wsslib=linux</a></p> <p>Found in HEAD commit: <a href="https://github.com/hshivhare67/platform_device_renesas_kernel_v4.19.72/commit/3f00c931cf51848ec37b8817097db058fcc2f3f7">3f00c931cf51848ec37b8817097db058fcc2f3f7</a></p> <p>Found in base branch: <b>main</b></p></p> </details> </p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Source Files (1)</summary> <p></p> <p> </p> </details> <p></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary> <p> A memory leak in the mwifiex_pcie_alloc_cmdrsp_buf() function in drivers/net/wireless/marvell/mwifiex/pcie.c in the Linux kernel through 5.3.11 allows attackers to cause a denial of service (memory consumption) by triggering mwifiex_map_pci_memory() failures, aka CID-db8fd2cde932. 
<p>Publish Date: 2019-11-18 <p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2019-19056>CVE-2019-19056</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>4.7</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Local - Attack Complexity: High - Privileges Required: Low - User Interaction: None - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: None - Integrity Impact: None - Availability Impact: High </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="http://web.nvd.nist.gov/view/vuln/detail?vulnId=CVE-2019-19056">http://web.nvd.nist.gov/view/vuln/detail?vulnId=CVE-2019-19056</a></p> <p>Release Date: 2020-08-24</p> <p>Fix Resolution: v5.5-rc1</p> </p> </details> <p></p> *** Step up your Open Source Security Game with Mend [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
True
CVE-2019-19056 (Medium) detected in linuxlinux-4.19.279 - autoclosed - ## CVE-2019-19056 - Medium Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>linuxlinux-4.19.279</b></p></summary> <p> <p>The Linux Kernel</p> <p>Library home page: <a href=https://mirrors.edge.kernel.org/pub/linux/kernel/v4.x/?wsslib=linux>https://mirrors.edge.kernel.org/pub/linux/kernel/v4.x/?wsslib=linux</a></p> <p>Found in HEAD commit: <a href="https://github.com/hshivhare67/platform_device_renesas_kernel_v4.19.72/commit/3f00c931cf51848ec37b8817097db058fcc2f3f7">3f00c931cf51848ec37b8817097db058fcc2f3f7</a></p> <p>Found in base branch: <b>main</b></p></p> </details> </p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Source Files (1)</summary> <p></p> <p> </p> </details> <p></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary> <p> A memory leak in the mwifiex_pcie_alloc_cmdrsp_buf() function in drivers/net/wireless/marvell/mwifiex/pcie.c in the Linux kernel through 5.3.11 allows attackers to cause a denial of service (memory consumption) by triggering mwifiex_map_pci_memory() failures, aka CID-db8fd2cde932. 
<p>Publish Date: 2019-11-18 <p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2019-19056>CVE-2019-19056</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>4.7</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Local - Attack Complexity: High - Privileges Required: Low - User Interaction: None - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: None - Integrity Impact: None - Availability Impact: High </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="http://web.nvd.nist.gov/view/vuln/detail?vulnId=CVE-2019-19056">http://web.nvd.nist.gov/view/vuln/detail?vulnId=CVE-2019-19056</a></p> <p>Release Date: 2020-08-24</p> <p>Fix Resolution: v5.5-rc1</p> </p> </details> <p></p> *** Step up your Open Source Security Game with Mend [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
non_process
cve medium detected in linuxlinux autoclosed cve medium severity vulnerability vulnerable library linuxlinux the linux kernel library home page a href found in head commit a href found in base branch main vulnerable source files vulnerability details a memory leak in the mwifiex pcie alloc cmdrsp buf function in drivers net wireless marvell mwifiex pcie c in the linux kernel through allows attackers to cause a denial of service memory consumption by triggering mwifiex map pci memory failures aka cid publish date url a href cvss score details base score metrics exploitability metrics attack vector local attack complexity high privileges required low user interaction none scope unchanged impact metrics confidentiality impact none integrity impact none availability impact high for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution step up your open source security game with mend
0
634,211
20,356,231,849
IssuesEvent
2022-02-20 01:14:29
therealbluepandabear/PyxlMoose
https://api.github.com/repos/therealbluepandabear/PyxlMoose
closed
[Improvement] Make it so when the user is previewing the rectangle, it only shows a border of it
mid priority improvement
#### Improvement description Make it so when the user is moving their fingers to create/preview the rectangle, only show a border of it (just four lines), and fill in the rectangle only when the user lets go of their finger. #### Why is this improvement important to add? Because this will make the app quicker, efficient, and faster.
1.0
[Improvement] Make it so when the user is previewing the rectangle, it only shows a border of it - #### Improvement description Make it so when the user is moving their fingers to create/preview the rectangle, only show a border of it (just four lines), and fill in the rectangle only when the user lets go of their finger. #### Why is this improvement important to add? Because this will make the app quicker, efficient, and faster.
non_process
make it so when the user is previewing the rectangle it only shows a border of it improvement description make it so when the user is moving their fingers to create preview the rectangle only show a border of it just four lines and fill in the rectangle only when the user lets go of their finger why is this improvement important to add because this will make the app quicker efficient and faster
0
9,359
12,369,296,758
IssuesEvent
2020-05-18 15:03:28
bazelbuild/bazel
https://api.github.com/repos/bazelbuild/bazel
closed
0.17.1rc3 fails the test 18.04 (no JDK)
P2 team-Rules-Java type: process
Test of Bazel 0.17.1rc3: https://buildkite.com/bazel/bazel-bazel/builds/4403#03954ad5-9c7e-4155-a644-839f8566853f cc @lberki ``` ** test_bootstrap ************************************************************** readlink: missing operand Try 'readlink --help' for more information. Building Bazel from scratch ERROR: JDK not found, please set $JAVA_HOME. -- Test log: ----------------------------------------------------------- ```
1.0
0.17.1rc3 fails the test 18.04 (no JDK) - Test of Bazel 0.17.1rc3: https://buildkite.com/bazel/bazel-bazel/builds/4403#03954ad5-9c7e-4155-a644-839f8566853f cc @lberki ``` ** test_bootstrap ************************************************************** readlink: missing operand Try 'readlink --help' for more information. Building Bazel from scratch ERROR: JDK not found, please set $JAVA_HOME. -- Test log: ----------------------------------------------------------- ```
process
fails the test no jdk test of bazel cc lberki test bootstrap readlink missing operand try readlink help for more information building bazel from scratch error jdk not found please set java home test log
1
6,866
15,682,331,111
IssuesEvent
2021-03-25 07:07:56
bithyve/hexa
https://api.github.com/repos/bithyve/hexa
closed
PoC - Use SQLite with persist to compare peformance
1.5.1 Architecture DevTask NFR
Use sqlite with redux persist instead of AsyncStorage and compare performance on a low end android and ios device
1.0
PoC - Use SQLite with persist to compare peformance - Use sqlite with redux persist instead of AsyncStorage and compare performance on a low end android and ios device
non_process
poc use sqlite with persist to compare peformance use sqlite with redux persist instead of asyncstorage and compare performance on a low end android and ios device
0
1,095
3,563,642,574
IssuesEvent
2016-01-25 05:44:28
kerubistan/kerub
https://api.github.com/repos/kerubistan/kerub
opened
find a way in infinispan to query by UUID
cleanup component:data processing priority: low
Infinispan does not take UUID field as primitive (which is right) but because of that it does not want to perform 'eq' comparison with it (which is wrong)
1.0
find a way in infinispan to query by UUID - Infinispan does not take UUID field as primitive (which is right) but because of that it does not want to perform 'eq' comparison with it (which is wrong)
process
find a way in infinispan to query by uuid infinispan does not take uuid field as primitive which is right but because of that it does not want to perform eq comparison with it which is wrong
1
36,849
8,166,693,549
IssuesEvent
2018-08-25 12:44:30
cython/cython
https://api.github.com/repos/cython/cython
closed
Python 3.7, Windows: FAIL: with_outer_raising (pure_doctest__generators_py)
Python3 Semantics defect
A single test error when trying to generate Cython 0.28.3 Windows wheels for Python 3.7: ``` ====================================================================== FAIL: with_outer_raising (pure_doctest__generators_py) Doctest: pure_doctest__generators_py.with_outer_raising ---------------------------------------------------------------------- Traceback (most recent call last): File "C:\Python37\lib\doctest.py", line 2198, in runTest raise self.failureException(self.format_failure(new.getvalue())) AssertionError: Failed doctest test for pure_doctest__generators_py.with_outer_raising File "C:\projects\cython-wheels\Cython\tests\run\generators_py.py", line 84, in with_outer_raising ---------------------------------------------------------------------- File "C:\projects\cython-wheels\Cython\tests\run\generators_py.py", line 87, in pure_doctest__generators_py.with_outer_raising Failed example: list(x()) Exception raised: Traceback (most recent call last): File "C:\projects\cython-wheels\Cython\tests\run\generators_py.py", line 93, in generator raise StopIteration StopIteration The above exception was the direct cause of the following exception: Traceback (most recent call last): File "C:\Python37\lib\doctest.py", line 1329, in __run compileflags, 1), test.globs) File "<doctest pure_doctest__generators_py.with_outer_raising[1]>", line 1, in <module> list(x()) RuntimeError: generator raised StopIteration ``` https://ci.appveyor.com/project/matthew-brett/cython-wheels/build/job/hw0umryi37mobb67#L18595
1.0
Python 3.7, Windows: FAIL: with_outer_raising (pure_doctest__generators_py) - A single test error when trying to generate Cython 0.28.3 Windows wheels for Python 3.7: ``` ====================================================================== FAIL: with_outer_raising (pure_doctest__generators_py) Doctest: pure_doctest__generators_py.with_outer_raising ---------------------------------------------------------------------- Traceback (most recent call last): File "C:\Python37\lib\doctest.py", line 2198, in runTest raise self.failureException(self.format_failure(new.getvalue())) AssertionError: Failed doctest test for pure_doctest__generators_py.with_outer_raising File "C:\projects\cython-wheels\Cython\tests\run\generators_py.py", line 84, in with_outer_raising ---------------------------------------------------------------------- File "C:\projects\cython-wheels\Cython\tests\run\generators_py.py", line 87, in pure_doctest__generators_py.with_outer_raising Failed example: list(x()) Exception raised: Traceback (most recent call last): File "C:\projects\cython-wheels\Cython\tests\run\generators_py.py", line 93, in generator raise StopIteration StopIteration The above exception was the direct cause of the following exception: Traceback (most recent call last): File "C:\Python37\lib\doctest.py", line 1329, in __run compileflags, 1), test.globs) File "<doctest pure_doctest__generators_py.with_outer_raising[1]>", line 1, in <module> list(x()) RuntimeError: generator raised StopIteration ``` https://ci.appveyor.com/project/matthew-brett/cython-wheels/build/job/hw0umryi37mobb67#L18595
non_process
python windows fail with outer raising pure doctest generators py a single test error when trying to generate cython windows wheels for python fail with outer raising pure doctest generators py doctest pure doctest generators py with outer raising traceback most recent call last file c lib doctest py line in runtest raise self failureexception self format failure new getvalue assertionerror failed doctest test for pure doctest generators py with outer raising file c projects cython wheels cython tests run generators py py line in with outer raising file c projects cython wheels cython tests run generators py py line in pure doctest generators py with outer raising failed example list x exception raised traceback most recent call last file c projects cython wheels cython tests run generators py py line in generator raise stopiteration stopiteration the above exception was the direct cause of the following exception traceback most recent call last file c lib doctest py line in run compileflags test globs file line in list x runtimeerror generator raised stopiteration
0
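The traceback in the record above is Python 3.7's PEP 479 behavior: a `StopIteration` raised inside a generator body no longer silently ends iteration, it is converted to `RuntimeError`. A minimal reproduction and the usual fix (ending the generator with `return` instead of raising) — this illustrates the language change only, it is not the actual Cython test:

```python
def broken():
    yield 1
    raise StopIteration  # PEP 479: re-raised as RuntimeError in Python 3.7+

def fixed():
    yield 1
    return  # the correct way to end a generator early

try:
    list(broken())
    outcome = "no error"
except RuntimeError:
    outcome = "RuntimeError"
```

On Python 3.7 and later `outcome` is `"RuntimeError"`, matching the `RuntimeError: generator raised StopIteration` line in the failing doctest.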
83,437
24,055,870,680
IssuesEvent
2022-09-16 16:48:51
gradle/gradle
https://api.github.com/repos/gradle/gradle
closed
Improve error message when more than one build in the composite has the same name
a:feature in:composite-builds stale
@adammurdoch commented on [Tue Jul 04 2017](https://github.com/gradle/composite-builds/issues/118) Currently the error message is something like "Included build '&lt;name>' is not unique in composite". The error should at least let the user know _which_ builds have the same name, and ideally what can be done about this.
1.0
Improve error message when more than one build in the composite has the same name - @adammurdoch commented on [Tue Jul 04 2017](https://github.com/gradle/composite-builds/issues/118) Currently the error message is something like "Included build '&lt;name>' is not unique in composite". The error should at least let the user know _which_ builds have the same name, and ideally what can be done about this.
non_process
improve error message when more than one build in the composite has the same name adammurdoch commented on currently the error message is something like included build lt name is not unique in composite the error should at least let the user know which builds have the same name and ideally what can be done about this
0
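The improvement requested in the record above amounts to collecting which included builds collide and naming them in the message. A language-agnostic sketch in Python (the build list and message wording are illustrative, not Gradle's actual implementation):

```python
from collections import Counter

def duplicate_build_message(build_names):
    """Return an error message naming every duplicated build name,
    or None when all names are unique."""
    dupes = sorted(name for name, n in Counter(build_names).items() if n > 1)
    if not dupes:
        return None
    return (
        "Included build name(s) not unique in composite: "
        + ", ".join(repr(d) for d in dupes)
        + ". Rename the colliding builds or assign them distinct names."
    )

msg = duplicate_build_message(["app", "lib", "lib", "tools"])
```

The key point is the actionable tail: the user learns *which* names collide and what to do about it, instead of only that some name was not unique.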
134,107
18,422,567,427
IssuesEvent
2021-10-13 17:48:38
turkdevops/desktop
https://api.github.com/repos/turkdevops/desktop
closed
CVE-2018-19838 (Medium) detected in node-sass-4.13.1.tgz - autoclosed
security vulnerability
## CVE-2018-19838 - Medium Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>node-sass-4.13.1.tgz</b></p></summary> <p>Wrapper around libsass</p> <p>Library home page: <a href="https://registry.npmjs.org/node-sass/-/node-sass-4.13.1.tgz">https://registry.npmjs.org/node-sass/-/node-sass-4.13.1.tgz</a></p> <p>Path to dependency file: desktop/package.json</p> <p>Path to vulnerable library: /node_modules/node-sass/package.json</p> <p> Dependency Hierarchy: - :x: **node-sass-4.13.1.tgz** (Vulnerable Library) <p>Found in HEAD commit: <a href="https://github.com/turkdevops/desktop/commit/f98805bf0d52a3184c0b8034143f7f0b5d10f1eb">f98805bf0d52a3184c0b8034143f7f0b5d10f1eb</a></p> <p>Found in base branch: <b>development</b></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary> <p> In LibSass prior to 3.5.5, functions inside ast.cpp for IMPLEMENT_AST_OPERATORS expansion allow attackers to cause a denial-of-service resulting from stack consumption via a crafted sass file, as demonstrated by recursive calls involving clone(), cloneChildren(), and copy(). 
<p>Publish Date: 2018-12-04 <p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2018-19838>CVE-2018-19838</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>6.5</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: Low - Privileges Required: None - User Interaction: Required - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: None - Integrity Impact: None - Availability Impact: High </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://github.com/sass/libsass/releases/tag/3.5.5">https://github.com/sass/libsass/releases/tag/3.5.5</a></p> <p>Release Date: 2018-12-04</p> <p>Fix Resolution: libsass - 3.5.5;node-sass - 4.14.0</p> </p> </details> <p></p> *** Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
True
CVE-2018-19838 (Medium) detected in node-sass-4.13.1.tgz - autoclosed - ## CVE-2018-19838 - Medium Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>node-sass-4.13.1.tgz</b></p></summary> <p>Wrapper around libsass</p> <p>Library home page: <a href="https://registry.npmjs.org/node-sass/-/node-sass-4.13.1.tgz">https://registry.npmjs.org/node-sass/-/node-sass-4.13.1.tgz</a></p> <p>Path to dependency file: desktop/package.json</p> <p>Path to vulnerable library: /node_modules/node-sass/package.json</p> <p> Dependency Hierarchy: - :x: **node-sass-4.13.1.tgz** (Vulnerable Library) <p>Found in HEAD commit: <a href="https://github.com/turkdevops/desktop/commit/f98805bf0d52a3184c0b8034143f7f0b5d10f1eb">f98805bf0d52a3184c0b8034143f7f0b5d10f1eb</a></p> <p>Found in base branch: <b>development</b></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary> <p> In LibSass prior to 3.5.5, functions inside ast.cpp for IMPLEMENT_AST_OPERATORS expansion allow attackers to cause a denial-of-service resulting from stack consumption via a crafted sass file, as demonstrated by recursive calls involving clone(), cloneChildren(), and copy(). 
<p>Publish Date: 2018-12-04 <p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2018-19838>CVE-2018-19838</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>6.5</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: Low - Privileges Required: None - User Interaction: Required - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: None - Integrity Impact: None - Availability Impact: High </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://github.com/sass/libsass/releases/tag/3.5.5">https://github.com/sass/libsass/releases/tag/3.5.5</a></p> <p>Release Date: 2018-12-04</p> <p>Fix Resolution: libsass - 3.5.5;node-sass - 4.14.0</p> </p> </details> <p></p> *** Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
non_process
cve medium detected in node sass tgz autoclosed cve medium severity vulnerability vulnerable library node sass tgz wrapper around libsass library home page a href path to dependency file desktop package json path to vulnerable library node modules node sass package json dependency hierarchy x node sass tgz vulnerable library found in head commit a href found in base branch development vulnerability details in libsass prior to functions inside ast cpp for implement ast operators expansion allow attackers to cause a denial of service resulting from stack consumption via a crafted sass file as demonstrated by recursive calls involving clone clonechildren and copy publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction required scope unchanged impact metrics confidentiality impact none integrity impact none availability impact high for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution libsass node sass step up your open source security game with whitesource
0
411,002
27,809,801,998
IssuesEvent
2023-03-18 01:46:45
amy-langley/tracking-trans-hate-bills
https://api.github.com/repos/amy-langley/tracking-trans-hate-bills
opened
Update readme
documentation
The readme doesn't talk about almost anything that I've been working on recently, so it probably needs to be updated to talk about those.
1.0
Update readme - The readme doesn't talk about almost anything that I've been working on recently, so it probably needs to be updated to talk about those.
non_process
update readme the readme doesn t talk about almost anything that i ve been working on recently so it probably needs to be updated to talk about those
0
2,162
5,008,954,214
IssuesEvent
2016-12-12 20:59:32
bongo227/StatsNotes
https://api.github.com/repos/bongo227/StatsNotes
opened
Construction and use of charts for mean, range, standard deviation and proportions.
15.3 Statistical Process Control
From the spec: Where a target value is given, this should be used as the centre-line of the chart for means. Advantages and disadvantages of using attributes or variables.
1.0
Construction and use of charts for mean, range, standard deviation and proportions. - From the spec: Where a target value is given, this should be used as the centre-line of the chart for means. Advantages and disadvantages of using attributes or variables.
process
construction and use of charts for mean range standard deviation and proportions from the spec where a target value is given this should be used as the centre line of the chart for means advantages and disadvantages of using attributes or variables
1
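The spec point in the record above — using a given target value as the centre line of the chart for means — translates directly into the standard x-bar control-limit formulas: centre = target, limits = target ± kσ/√n. A small sketch (the 3-sigma convention is the common default and is assumed here):

```python
import math

def xbar_chart_limits(target, sigma, sample_size, k=3.0):
    """Control limits for a means (x-bar) chart when a target value is
    given: the target is the centre line, and the limits sit k standard
    errors of the sample mean either side of it."""
    se = sigma / math.sqrt(sample_size)
    return {"lcl": target - k * se, "centre": target, "ucl": target + k * se}

# Target 50, process sigma 2, samples of size 4: se = 2/sqrt(4) = 1
limits = xbar_chart_limits(target=50.0, sigma=2.0, sample_size=4)
```

With those numbers the chart runs from a lower control limit of 47 to an upper control limit of 53 around the target centre line of 50.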
254,625
27,396,878,564
IssuesEvent
2023-02-28 20:28:05
therealjeffbeck/WebGoat
https://api.github.com/repos/therealjeffbeck/WebGoat
closed
jquery-3.4.1.min.js: 2 vulnerabilities (highest severity is: 6.1) - autoclosed
security vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>jquery-3.4.1.min.js</b></p></summary> <p>JavaScript library for DOM operations</p> <p>Library home page: <a href="https://cdnjs.cloudflare.com/ajax/libs/jquery/3.4.1/jquery.min.js">https://cdnjs.cloudflare.com/ajax/libs/jquery/3.4.1/jquery.min.js</a></p> <p>Path to vulnerable library: /src/main/resources/webgoat/static/js/libs/jquery.min.js</p> <p> <p>Found in HEAD commit: <a href="https://github.com/therealjeffbeck/WebGoat/commit/b3784b7dd97b4bce5eb110f3778736a195d5407f">b3784b7dd97b4bce5eb110f3778736a195d5407f</a></p></details> ## Vulnerabilities | CVE | Severity | <img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS | Dependency | Type | Fixed in (jquery version) | Remediation Available | | ------------- | ------------- | ----- | ----- | ----- | ------------- | --- | | [CVE-2020-11023](https://www.mend.io/vulnerability-database/CVE-2020-11023) | <img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Medium | 6.1 | jquery-3.4.1.min.js | Direct | jquery - 3.5.0;jquery-rails - 4.4.0 | &#10060; | | [CVE-2020-11022](https://www.mend.io/vulnerability-database/CVE-2020-11022) | <img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Medium | 6.1 | jquery-3.4.1.min.js | Direct | jQuery - 3.5.0 | &#10060; | ## Details <details> <summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> CVE-2020-11023</summary> ### Vulnerable Library - <b>jquery-3.4.1.min.js</b></p> <p>JavaScript library for DOM operations</p> <p>Library home page: <a href="https://cdnjs.cloudflare.com/ajax/libs/jquery/3.4.1/jquery.min.js">https://cdnjs.cloudflare.com/ajax/libs/jquery/3.4.1/jquery.min.js</a></p> <p>Path to vulnerable library: 
/src/main/resources/webgoat/static/js/libs/jquery.min.js</p> <p> Dependency Hierarchy: - :x: **jquery-3.4.1.min.js** (Vulnerable Library) <p>Found in HEAD commit: <a href="https://github.com/therealjeffbeck/WebGoat/commit/b3784b7dd97b4bce5eb110f3778736a195d5407f">b3784b7dd97b4bce5eb110f3778736a195d5407f</a></p> <p>Found in base branch: <b>develop</b></p> </p> <p></p> ### Vulnerability Details <p> In jQuery versions greater than or equal to 1.0.3 and before 3.5.0, passing HTML containing <option> elements from untrusted sources - even after sanitizing it - to one of jQuery's DOM manipulation methods (i.e. .html(), .append(), and others) may execute untrusted code. This problem is patched in jQuery 3.5.0. <p>Publish Date: 2020-04-29 <p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2020-11023>CVE-2020-11023</a></p> </p> <p></p> ### CVSS 3 Score Details (<b>6.1</b>) <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: Low - Privileges Required: None - User Interaction: Required - Scope: Changed - Impact Metrics: - Confidentiality Impact: Low - Integrity Impact: Low - Availability Impact: None </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. 
</p> <p></p> ### Suggested Fix <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://github.com/jquery/jquery/security/advisories/GHSA-jpcq-cgw6-v4j6,https://github.com/rails/jquery-rails/blob/master/CHANGELOG.md#440">https://github.com/jquery/jquery/security/advisories/GHSA-jpcq-cgw6-v4j6,https://github.com/rails/jquery-rails/blob/master/CHANGELOG.md#440</a></p> <p>Release Date: 2020-04-29</p> <p>Fix Resolution: jquery - 3.5.0;jquery-rails - 4.4.0</p> </p> <p></p> </details><details> <summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> CVE-2020-11022</summary> ### Vulnerable Library - <b>jquery-3.4.1.min.js</b></p> <p>JavaScript library for DOM operations</p> <p>Library home page: <a href="https://cdnjs.cloudflare.com/ajax/libs/jquery/3.4.1/jquery.min.js">https://cdnjs.cloudflare.com/ajax/libs/jquery/3.4.1/jquery.min.js</a></p> <p>Path to vulnerable library: /src/main/resources/webgoat/static/js/libs/jquery.min.js</p> <p> Dependency Hierarchy: - :x: **jquery-3.4.1.min.js** (Vulnerable Library) <p>Found in HEAD commit: <a href="https://github.com/therealjeffbeck/WebGoat/commit/b3784b7dd97b4bce5eb110f3778736a195d5407f">b3784b7dd97b4bce5eb110f3778736a195d5407f</a></p> <p>Found in base branch: <b>develop</b></p> </p> <p></p> ### Vulnerability Details <p> In jQuery versions greater than or equal to 1.2 and before 3.5.0, passing HTML from untrusted sources - even after sanitizing it - to one of jQuery's DOM manipulation methods (i.e. .html(), .append(), and others) may execute untrusted code. This problem is patched in jQuery 3.5.0. 
<p>Publish Date: 2020-04-29 <p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2020-11022>CVE-2020-11022</a></p> </p> <p></p> ### CVSS 3 Score Details (<b>6.1</b>) <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: Low - Privileges Required: None - User Interaction: Required - Scope: Changed - Impact Metrics: - Confidentiality Impact: Low - Integrity Impact: Low - Availability Impact: None </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. </p> <p></p> ### Suggested Fix <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2020-11022">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2020-11022</a></p> <p>Release Date: 2020-04-29</p> <p>Fix Resolution: jQuery - 3.5.0</p> </p> <p></p> </details>
True
jquery-3.4.1.min.js: 2 vulnerabilities (highest severity is: 6.1) - autoclosed - <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>jquery-3.4.1.min.js</b></p></summary> <p>JavaScript library for DOM operations</p> <p>Library home page: <a href="https://cdnjs.cloudflare.com/ajax/libs/jquery/3.4.1/jquery.min.js">https://cdnjs.cloudflare.com/ajax/libs/jquery/3.4.1/jquery.min.js</a></p> <p>Path to vulnerable library: /src/main/resources/webgoat/static/js/libs/jquery.min.js</p> <p> <p>Found in HEAD commit: <a href="https://github.com/therealjeffbeck/WebGoat/commit/b3784b7dd97b4bce5eb110f3778736a195d5407f">b3784b7dd97b4bce5eb110f3778736a195d5407f</a></p></details> ## Vulnerabilities | CVE | Severity | <img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS | Dependency | Type | Fixed in (jquery version) | Remediation Available | | ------------- | ------------- | ----- | ----- | ----- | ------------- | --- | | [CVE-2020-11023](https://www.mend.io/vulnerability-database/CVE-2020-11023) | <img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Medium | 6.1 | jquery-3.4.1.min.js | Direct | jquery - 3.5.0;jquery-rails - 4.4.0 | &#10060; | | [CVE-2020-11022](https://www.mend.io/vulnerability-database/CVE-2020-11022) | <img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Medium | 6.1 | jquery-3.4.1.min.js | Direct | jQuery - 3.5.0 | &#10060; | ## Details <details> <summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> CVE-2020-11023</summary> ### Vulnerable Library - <b>jquery-3.4.1.min.js</b></p> <p>JavaScript library for DOM operations</p> <p>Library home page: <a 
href="https://cdnjs.cloudflare.com/ajax/libs/jquery/3.4.1/jquery.min.js">https://cdnjs.cloudflare.com/ajax/libs/jquery/3.4.1/jquery.min.js</a></p> <p>Path to vulnerable library: /src/main/resources/webgoat/static/js/libs/jquery.min.js</p> <p> Dependency Hierarchy: - :x: **jquery-3.4.1.min.js** (Vulnerable Library) <p>Found in HEAD commit: <a href="https://github.com/therealjeffbeck/WebGoat/commit/b3784b7dd97b4bce5eb110f3778736a195d5407f">b3784b7dd97b4bce5eb110f3778736a195d5407f</a></p> <p>Found in base branch: <b>develop</b></p> </p> <p></p> ### Vulnerability Details <p> In jQuery versions greater than or equal to 1.0.3 and before 3.5.0, passing HTML containing <option> elements from untrusted sources - even after sanitizing it - to one of jQuery's DOM manipulation methods (i.e. .html(), .append(), and others) may execute untrusted code. This problem is patched in jQuery 3.5.0. <p>Publish Date: 2020-04-29 <p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2020-11023>CVE-2020-11023</a></p> </p> <p></p> ### CVSS 3 Score Details (<b>6.1</b>) <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: Low - Privileges Required: None - User Interaction: Required - Scope: Changed - Impact Metrics: - Confidentiality Impact: Low - Integrity Impact: Low - Availability Impact: None </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. 
</p> <p></p> ### Suggested Fix <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://github.com/jquery/jquery/security/advisories/GHSA-jpcq-cgw6-v4j6,https://github.com/rails/jquery-rails/blob/master/CHANGELOG.md#440">https://github.com/jquery/jquery/security/advisories/GHSA-jpcq-cgw6-v4j6,https://github.com/rails/jquery-rails/blob/master/CHANGELOG.md#440</a></p> <p>Release Date: 2020-04-29</p> <p>Fix Resolution: jquery - 3.5.0;jquery-rails - 4.4.0</p> </p> <p></p> </details><details> <summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> CVE-2020-11022</summary> ### Vulnerable Library - <b>jquery-3.4.1.min.js</b></p> <p>JavaScript library for DOM operations</p> <p>Library home page: <a href="https://cdnjs.cloudflare.com/ajax/libs/jquery/3.4.1/jquery.min.js">https://cdnjs.cloudflare.com/ajax/libs/jquery/3.4.1/jquery.min.js</a></p> <p>Path to vulnerable library: /src/main/resources/webgoat/static/js/libs/jquery.min.js</p> <p> Dependency Hierarchy: - :x: **jquery-3.4.1.min.js** (Vulnerable Library) <p>Found in HEAD commit: <a href="https://github.com/therealjeffbeck/WebGoat/commit/b3784b7dd97b4bce5eb110f3778736a195d5407f">b3784b7dd97b4bce5eb110f3778736a195d5407f</a></p> <p>Found in base branch: <b>develop</b></p> </p> <p></p> ### Vulnerability Details <p> In jQuery versions greater than or equal to 1.2 and before 3.5.0, passing HTML from untrusted sources - even after sanitizing it - to one of jQuery's DOM manipulation methods (i.e. .html(), .append(), and others) may execute untrusted code. This problem is patched in jQuery 3.5.0. 
<p>Publish Date: 2020-04-29 <p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2020-11022>CVE-2020-11022</a></p> </p> <p></p> ### CVSS 3 Score Details (<b>6.1</b>) <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: Low - Privileges Required: None - User Interaction: Required - Scope: Changed - Impact Metrics: - Confidentiality Impact: Low - Integrity Impact: Low - Availability Impact: None </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. </p> <p></p> ### Suggested Fix <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2020-11022">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2020-11022</a></p> <p>Release Date: 2020-04-29</p> <p>Fix Resolution: jQuery - 3.5.0</p> </p> <p></p> </details>
non_process
jquery min js vulnerabilities highest severity is autoclosed vulnerable library jquery min js javascript library for dom operations library home page a href path to vulnerable library src main resources webgoat static js libs jquery min js found in head commit a href vulnerabilities cve severity cvss dependency type fixed in jquery version remediation available medium jquery min js direct jquery jquery rails medium jquery min js direct jquery details cve vulnerable library jquery min js javascript library for dom operations library home page a href path to vulnerable library src main resources webgoat static js libs jquery min js dependency hierarchy x jquery min js vulnerable library found in head commit a href found in base branch develop vulnerability details in jquery versions greater than or equal to and before passing html containing elements from untrusted sources even after sanitizing it to one of jquery s dom manipulation methods i e html append and others may execute untrusted code this problem is patched in jquery publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction required scope changed impact metrics confidentiality impact low integrity impact low availability impact none for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution jquery jquery rails cve vulnerable library jquery min js javascript library for dom operations library home page a href path to vulnerable library src main resources webgoat static js libs jquery min js dependency hierarchy x jquery min js vulnerable library found in head commit a href found in base branch develop vulnerability details in jquery versions greater than or equal to and before passing html from untrusted sources even after sanitizing it to one of jquery s dom manipulation methods i e html append and others may execute untrusted code 
this problem is patched in jquery publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction required scope changed impact metrics confidentiality impact low integrity impact low availability impact none for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution jquery
0
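Both CVEs in the record above are fixed in jQuery 3.5.0, so triage reduces to comparing the detected version against the version that ships the fix. A minimal dotted-version comparison in stdlib Python (illustrative only; scanners like the one quoted use their own dependency resolvers):

```python
def parse_version(v):
    """Turn a simple dotted version string like '3.4.1' into a tuple."""
    return tuple(int(part) for part in v.split("."))

def is_vulnerable(detected, fixed_in):
    """True when the detected version predates the fixed version."""
    return parse_version(detected) < parse_version(fixed_in)

flag = is_vulnerable("3.4.1", "3.5.0")
```

Tuple comparison handles multi-digit components correctly (so `"3.10.0"` sorts after `"3.9.0"`), which naive string comparison would get wrong; pre-release suffixes would need a richer parser.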
88,873
10,582,170,188
IssuesEvent
2019-10-08 10:51:52
SenseNet/sensenet.github.io
https://api.github.com/repos/SenseNet/sensenet.github.io
closed
Port documentation from the Wiki: Date Time Field
documentation hacktoberfest help wanted
Port old article from wiki and refresh them for sensenet 7. The new article should be an .md file in the /_docs folder. Old screenshots should be re-created on the new admin-ui if it is possible and placed into the /docs/images folder. [admin.sensenet.com](https://admin.sensenet.com) name: BuiltIn\admin password: admin repository url: https://dev.demo.sensenet.com The .md file should have the following header: ``` --- title: %title% source_url: %url of the file on github% category: Development version: v7.0.0 tags: %possible tags as an array% --- ``` (you can use [the docs here](https://github.com/SenseNet/sensenet.github.io/tree/master/_docs) as example) [Date Time field](http://wiki.sensenet.com/DateTime_Field) Please change the old wiki urls as if they were available on github (e.g. https://wiki.sensenet.com/content_templates should be changed to /content-templates)
1.0
Port documentation from the Wiki: Date Time Field - Port old article from wiki and refresh them for sensenet 7. The new article should be an .md file in the /_docs folder. Old screenshots should be re-created on the new admin-ui if it is possible and placed into the /docs/images folder. [admin.sensenet.com](https://admin.sensenet.com) name: BuiltIn\admin password: admin repository url: https://dev.demo.sensenet.com The .md file should have the following header: ``` --- title: %title% source_url: %url of the file on github% category: Development version: v7.0.0 tags: %possible tags as an array% --- ``` (you can use [the docs here](https://github.com/SenseNet/sensenet.github.io/tree/master/_docs) as example) [Date Time field](http://wiki.sensenet.com/DateTime_Field) Please change the old wiki urls as if they were available on github (e.g. https://wiki.sensenet.com/content_templates should be changed to /content-templates)
non_process
port documentation from the wiki date time field port old article from wiki and refresh them for sensenet the new article should be an md file in the docs folder old screenshots should be re created on the new admin ui if it is possible and placed into the docs images folder name builtin admin password admin repository url the md file should have the following header title title source url url of the file on github category development version tags possible tags as an array you can use as example please change the old wiki urls as if they were available on github e g should be changed to content templates
0
21,482
29,521,103,603
IssuesEvent
2023-06-05 02:00:07
lizhihao6/get-daily-arxiv-noti
https://api.github.com/repos/lizhihao6/get-daily-arxiv-noti
opened
New submissions for Mon, 5 Jun 23
event camera white balance isp compression image signal processing image signal process raw raw image events camera color contrast events AWB
## Keyword: events There is no result ## Keyword: event camera There is no result ## Keyword: events camera There is no result ## Keyword: white balance There is no result ## Keyword: color contrast There is no result ## Keyword: AWB There is no result ## Keyword: ISP There is no result ## Keyword: image signal processing There is no result ## Keyword: image signal process There is no result ## Keyword: compression ### 4DSR-GCN: 4D Video Point Cloud Upsampling using Graph Convolutional Networks - **Authors:** Lorenzo Berlincioni, Stefano Berretti, Marco Bertini, Alberto Del Bimbo - **Subjects:** Computer Vision and Pattern Recognition (cs.CV); Artificial Intelligence (cs.AI); Multimedia (cs.MM) - **Arxiv link:** https://arxiv.org/abs/2306.01081 - **Pdf link:** https://arxiv.org/pdf/2306.01081 - **Abstract** Time varying sequences of 3D point clouds, or 4D point clouds, are now being acquired at an increasing pace in several applications (e.g., LiDAR in autonomous or assisted driving). In many cases, such volume of data is transmitted, thus requiring that proper compression tools are applied to either reduce the resolution or the bandwidth. In this paper, we propose a new solution for upscaling and restoration of time-varying 3D video point clouds after they have been heavily compressed. In consideration of recent growing relevance of 3D applications, %We focused on a model allowing user-side upscaling and artifact removal for 3D video point clouds, a real-time stream of which would require . Our model consists of a specifically designed Graph Convolutional Network (GCN) that combines Dynamic Edge Convolution and Graph Attention Networks for feature aggregation in a Generative Adversarial setting. By taking inspiration PointNet++, We present a different way to sample dense point clouds with the intent to make these modules work in synergy to provide each node enough features about its neighbourhood in order to later on generate new vertices. 
  Compared to other solutions in the literature that address the same task, our proposed model obtains comparable reconstruction quality while using a substantially lower number of parameters (about 300KB), making our solution deployable in edge computing devices such as LiDAR.

### Reconstruction Distortion of Learned Image Compression with Imperceptible Perturbations

- **Authors:** Yang Sui, Zhuohang Li, Ding Ding, Xiang Pan, Xiaozhong Xu, Shan Liu, Zhenzhong Chen
- **Subjects:** Computer Vision and Pattern Recognition (cs.CV); Artificial Intelligence (cs.AI)
- **Arxiv link:** https://arxiv.org/abs/2306.01125
- **Pdf link:** https://arxiv.org/pdf/2306.01125
- **Abstract** Learned Image Compression (LIC) has recently become the trending technique for image transmission due to its notable performance. Despite its popularity, the robustness of LIC with respect to the quality of image reconstruction remains under-explored. In this paper, we introduce an imperceptible attack approach designed to effectively degrade the reconstruction quality of LIC, resulting in the reconstructed image being severely disrupted by noise, in which any object is virtually impossible to discern. More specifically, we generate adversarial examples by introducing a Frobenius-norm-based loss function to maximize the discrepancy between original images and reconstructed adversarial examples. Further, leveraging the insensitivity of high-frequency components to human vision, we introduce an Imperceptibility Constraint (IC) to ensure that the perturbations remain inconspicuous. Experiments conducted on the Kodak dataset with various LIC models demonstrate the effectiveness of the attack. In addition, we provide several findings and suggestions for designing future defenses.

### DWT-CompCNN: Deep Image Classification Network for High Throughput JPEG 2000 Compressed Documents

- **Authors:** Tejasvee Bisen, Mohammed Javed, Shashank Kirtania, P. Nagabhushan
- **Subjects:** Computer Vision and Pattern Recognition (cs.CV); Information Retrieval (cs.IR); Machine Learning (cs.LG); Image and Video Processing (eess.IV)
- **Arxiv link:** https://arxiv.org/abs/2306.01359
- **Pdf link:** https://arxiv.org/pdf/2306.01359
- **Abstract** For any digital application with document images, such as retrieval, the classification of document images becomes an essential stage. Conventionally, the full (uncompressed) versions of the documents make up the input dataset, which poses a challenge due to the large volume required to accommodate them. It would therefore be novel if the same classification task could be accomplished directly (with some partial decompression) on the compressed representation of documents, making the whole process computationally more efficient. In this research work, a novel deep learning model, DWT-CompCNN, is proposed for the classification of documents compressed using the High Throughput JPEG 2000 (HTJ2K) algorithm. The proposed DWT-CompCNN comprises five convolutional layers with filter sizes of 16, 32, 64, 128, and 256, increasing layer by layer, to improve learning from the wavelet coefficients extracted from the compressed images. Experiments are performed on two benchmark datasets, Tobacco-3482 and RVL-CDIP, which demonstrate that the proposed model is time- and space-efficient, and also achieves better classification accuracy in the compressed domain.
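DWT-CompCNN learns directly from wavelet coefficients of HTJ2K-compressed documents. As a rough illustration of the kind of input such a classifier consumes, here is a minimal single-level 2D Haar transform in NumPy. This is a simplification and an assumption for illustration only: HTJ2K uses its own reversible/irreversible wavelet filters, and the 16–256 filter sizes refer to the CNN layers, not the transform.

```python
import numpy as np

def haar_dwt2(img: np.ndarray):
    """Single-level 2D Haar DWT (illustrative, not the HTJ2K filter bank).

    Returns the four subbands (LL, LH, HL, HH), each half the input size
    in both dimensions. Assumes img has even height and width.
    """
    # Transform rows: average (low-pass) and difference (high-pass).
    lo = (img[:, 0::2] + img[:, 1::2]) / 2.0
    hi = (img[:, 0::2] - img[:, 1::2]) / 2.0
    # Transform columns of each row-transformed half.
    ll = (lo[0::2, :] + lo[1::2, :]) / 2.0
    lh = (lo[0::2, :] - lo[1::2, :]) / 2.0
    hl = (hi[0::2, :] + hi[1::2, :]) / 2.0
    hh = (hi[0::2, :] - hi[1::2, :]) / 2.0
    return ll, lh, hl, hh

img = np.arange(16, dtype=float).reshape(4, 4)
ll, lh, hl, hh = haar_dwt2(img)
print(ll.shape)  # each subband is (2, 2)
```

The LL band is a blurred half-resolution copy of the document; the LH/HL/HH bands carry edge detail, which is why a CNN stacked on these coefficients can classify without full decompression.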
### Group channel pruning and spatial attention distilling for object detection

- **Authors:** Yun Chu, Pu Li, Yong Bai, Zhuhua Hu, Yongqing Chen, Jiafeng Lu
- **Subjects:** Computer Vision and Pattern Recognition (cs.CV); Artificial Intelligence (cs.AI)
- **Arxiv link:** https://arxiv.org/abs/2306.01526
- **Pdf link:** https://arxiv.org/pdf/2306.01526
- **Abstract** Due to the over-parameterization of neural networks, many model compression methods based on pruning and quantization have emerged. They are remarkable in reducing the size, parameter count, and computational complexity of the model. However, most models compressed by such methods need the support of special hardware and software, which increases the deployment cost. Moreover, these methods are mainly used in classification tasks and are rarely applied directly to detection tasks. To address these issues, we introduce a three-stage model compression method for object detection networks: dynamic sparse training, group channel pruning, and spatial attention distilling. First, to select the unimportant channels in the network while maintaining a good balance between sparsity and accuracy, we put forward a dynamic sparse training method that introduces a variable sparse rate, which changes with the training process of the network. Second, to reduce the effect of pruning on network accuracy, we propose a novel pruning method called group channel pruning. In particular, we divide the network into multiple groups according to the scales of the feature layers and the similarity of module structure in the network, and then use different pruning thresholds to prune the channels in each group. Finally, to recover the accuracy of the pruned network, we use an improved knowledge distillation method, extracting spatial attention information from the feature maps at specific scales in each group as knowledge for distillation.
  In the experiments, we use YOLOv4 as the object detection network and PASCAL VOC as the training dataset. Our method reduces the parameters of the model by 64.7% and the computation by 34.9%.

## Keyword: RAW

### Pedestrian Crossing Action Recognition and Trajectory Prediction with 3D Human Keypoints

- **Authors:** Jiachen Li, Xinwei Shi, Feiyu Chen, Jonathan Stroud, Zhishuai Zhang, Tian Lan, Junhua Mao, Jeonhyung Kang, Khaled S. Refaat, Weilong Yang, Eugene Ie, Congcong Li
- **Subjects:** Computer Vision and Pattern Recognition (cs.CV); Artificial Intelligence (cs.AI); Machine Learning (cs.LG); Robotics (cs.RO)
- **Arxiv link:** https://arxiv.org/abs/2306.01075
- **Pdf link:** https://arxiv.org/pdf/2306.01075
- **Abstract** Accurate understanding and prediction of human behaviors are critical prerequisites for autonomous vehicles, especially in highly dynamic and interactive scenarios such as intersections in dense urban areas. In this work, we aim to identify crossing pedestrians and predict their future trajectories. To achieve these goals, we need not only the context information of road geometry and other traffic participants but also fine-grained information about the human pose, motion, and activity, which can be inferred from human keypoints. In this paper, we propose a novel multi-task learning framework for pedestrian crossing action recognition and trajectory prediction, which utilizes 3D human keypoints extracted from raw sensor data to capture rich information on human pose and activity. Moreover, we propose to apply two auxiliary tasks and contrastive learning to enable auxiliary supervision that improves the learned keypoint representation, further enhancing the performance of the major tasks. We validate our approach on a large-scale in-house dataset, as well as a public benchmark dataset, and show that our approach achieves state-of-the-art performance on a wide range of evaluation metrics.
  The effectiveness of each model component is validated in a detailed ablation study.

### Quantifying Sample Anonymity in Score-Based Generative Models with Adversarial Fingerprinting

- **Authors:** Mischa Dombrowski, Bernhard Kainz
- **Subjects:** Computer Vision and Pattern Recognition (cs.CV)
- **Arxiv link:** https://arxiv.org/abs/2306.01363
- **Pdf link:** https://arxiv.org/pdf/2306.01363
- **Abstract** Recent advances in score-based generative models have led to a huge spike in the development of downstream applications using generative models, ranging from data augmentation over image and video generation to anomaly detection. Despite publicly available trained models, their potential to be used for privacy-preserving data sharing has not been fully explored yet. Training diffusion models on private data and disseminating the models and weights rather than the raw dataset paves the way for innovative large-scale data-sharing strategies, particularly in healthcare, where safeguarding patients' personal health information is paramount. However, publishing such models without the individual consent of, e.g., the patients from whom the data was acquired necessitates guarantees that identifiable training samples will never be reproduced, thus protecting personal health data and satisfying the requirements of policymakers and regulatory bodies. This paper introduces a method for estimating the upper bound of the probability of reproducing identifiable training images during the sampling process. This is achieved by designing an adversarial approach that searches for anatomic fingerprints, such as medical devices or dermal art, which could potentially be employed to re-identify training images. Our method harnesses the learned score-based model to estimate the probability of the entire subspace of the score function that may be utilized for one-to-one reproduction of training samples.
  To validate our estimates, we generate anomalies containing a fingerprint and investigate whether generated samples from trained generative models can be uniquely mapped to the original training samples. Overall, our results show that privacy-breaching images are reproduced at sampling time if the models were trained without care.

### Learning Landmarks Motion from Speech for Speaker-Agnostic 3D Talking Heads Generation

- **Authors:** Federico Nocentini, Claudio Ferrari, Stefano Berretti
- **Subjects:** Computer Vision and Pattern Recognition (cs.CV)
- **Arxiv link:** https://arxiv.org/abs/2306.01415
- **Pdf link:** https://arxiv.org/pdf/2306.01415
- **Abstract** This paper presents a novel approach for generating 3D talking heads from raw audio inputs. Our method is grounded in the idea that speech-related movements can be comprehensively and efficiently described by the motion of a few control points located on the movable parts of the face, i.e., landmarks. The underlying musculoskeletal structure then allows us to learn how their motion influences the geometrical deformations of the whole face. The proposed method employs two distinct models to this aim: the first learns to generate the motion of a sparse set of landmarks from the given audio; the second expands this landmark motion to a dense motion field, which is used to animate a given 3D mesh in a neutral state. Additionally, we introduce a novel loss function, named Cosine Loss, which minimizes the angle between the generated motion vectors and the ground-truth ones. Using landmarks in 3D talking head generation offers various advantages such as consistency, reliability, and obviating the need for manual annotation. Our approach is designed to be identity-agnostic, enabling high-quality facial animation for any user without additional data or training.
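The Cosine Loss above penalizes the angle between predicted and ground-truth landmark motion vectors. One plausible reading is one minus the mean cosine similarity over landmarks; this is an assumption for illustration, since the abstract does not give the exact normalization.

```python
import numpy as np

def cosine_loss(pred: np.ndarray, gt: np.ndarray, eps: float = 1e-8) -> float:
    """1 - mean cosine similarity between per-landmark motion vectors.

    pred, gt: arrays of shape (num_landmarks, 3). The loss is ~0 when
    every predicted vector points in the same direction as the ground
    truth, and approaches 2 as they become opposed. Magnitude is
    ignored, so it would typically be paired with a position loss.
    """
    dot = np.sum(pred * gt, axis=1)
    norms = np.linalg.norm(pred, axis=1) * np.linalg.norm(gt, axis=1)
    cos = dot / (norms + eps)  # eps guards against zero-length vectors
    return float(1.0 - cos.mean())

gt = np.array([[1.0, 0.0, 0.0], [0.0, 1.0, 0.0]])
aligned = cosine_loss(2.0 * gt, gt)  # same directions -> loss ~0
opposed = cosine_loss(-gt, gt)       # opposite directions -> loss ~2
print(aligned, opposed)
```

Because the loss is scale-invariant, it directly targets the *direction* of landmark motion, which matches the paper's stated goal of minimizing the angle rather than the Euclidean error alone.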
### SASMU: boost the performance of generalized recognition model using synthetic face dataset

- **Authors:** Chia-Chun Chung, Pei-Chun Chang, Yong-Sheng Chen, HaoYuan He, Chinson Yeh
- **Subjects:** Computer Vision and Pattern Recognition (cs.CV)
- **Arxiv link:** https://arxiv.org/abs/2306.01449
- **Pdf link:** https://arxiv.org/pdf/2306.01449
- **Abstract** Nowadays, deploying a robust face recognition product has become easy thanks to decades of development in face recognition techniques. Not only profile image verification, but even in-the-wild images can be handled almost perfectly by state-of-the-art methods. However, privacy concerns are rising rapidly, since mainstream research results are powered by tons of web-crawled data, which raises privacy invasion issues. The community has tried to escape this predicament entirely by training face recognition models with synthetic data, but faces severe domain gap issues that still require access to real images and identity labels to fine-tune the model. In this paper, we propose SASMU, a simple, novel, and effective method for face recognition using a synthetic dataset. Our proposed method consists of spatial data augmentation (SA) and spectrum mixup (SMU). We first analyze the existing synthetic datasets for developing a face recognition system. Then, we reveal that heavy data augmentation is helpful for boosting performance when using synthetic data. By analyzing previous frequency-mixup studies, we propose a novel method for domain generalization. Extensive experimental results demonstrate the effectiveness of SASMU, achieving state-of-the-art performance on several common benchmarks, such as LFW, AgeDB-30, CA-LFW, CFP-FP, and CP-LFW.

## Keyword: raw image

There is no result
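The spectrum mixup (SMU) component of SASMU above is described only at a high level. Frequency-mixup methods in this family typically blend the Fourier amplitude spectra of two images while keeping the phase of the first; the sketch below shows that general idea, and is an assumption for illustration rather than the paper's exact formulation.

```python
import numpy as np

def spectrum_mixup(x: np.ndarray, y: np.ndarray, lam: float = 0.5) -> np.ndarray:
    """Blend the Fourier amplitudes of two same-shaped grayscale images.

    Keeps x's phase (which carries most structural content), mixes the
    amplitude spectra with weight lam, and inverts the FFT. A common
    trick in frequency-based domain generalization.
    """
    fx, fy = np.fft.fft2(x), np.fft.fft2(y)
    amp = lam * np.abs(fx) + (1.0 - lam) * np.abs(fy)
    mixed = amp * np.exp(1j * np.angle(fx))
    return np.real(np.fft.ifft2(mixed))

rng = np.random.default_rng(0)
a, b = rng.random((8, 8)), rng.random((8, 8))
out = spectrum_mixup(a, b, lam=1.0)  # lam=1.0 reproduces a exactly
print(np.allclose(out, a))
```

Intuitively, injecting real-image amplitude statistics into synthetic faces this way narrows the frequency-domain gap without touching identity-bearing phase structure.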
process this is achieved by designing an adversarial approach that searches for anatomic fingerprints such as medical devices or dermal art which could potentially be employed to re identify training images our method harnesses the learned score based model to estimate the probability of the entire subspace of the score function that may be utilized for one to one reproduction of training samples to validate our estimates we generate anomalies containing a fingerprint and investigate whether generated samples from trained generative models can be uniquely mapped to the original training samples overall our results show that privacy breaching images are reproduced at sampling time if the models were trained without care learning landmarks motion from speech for speaker agnostic talking heads generation authors federico nocentini claudio ferrari stefano berretti subjects computer vision and pattern recognition cs cv arxiv link pdf link abstract this paper presents a novel approach for generating talking heads from raw audio inputs our method grounds on the idea that speech related movements can be comprehensively and efficiently described by the motion of a few control points located on the movable parts of the face i e landmarks the underlying musculoskeletal structure then allows us to learn how their motion influences the geometrical deformations of the whole face the proposed method employs two distinct models to this aim the first one learns to generate the motion of a sparse set of landmarks from the given audio the second model expands such landmarks motion to a dense motion field which is utilized to animate a given mesh in neutral state additionally we introduce a novel loss function named cosine loss which minimizes the angle between the generated motion vectors and the ground truth ones using landmarks in talking head generation offers various advantages such as consistency reliability and obviating the need for manual annotation our approach is designed 
to be identity agnostic enabling high quality facial animations for any users without additional data or training sasmu boost the performance of generalized recognition model using synthetic face dataset authors chia chun chung pei chun chang yong sheng chen haoyuan he chinson yeh subjects computer vision and pattern recognition cs cv arxiv link pdf link abstract nowadays deploying a robust face recognition product becomes easy with the development of face recognition techniques for decades not only profile image verification but also the state of the art method can handle the in the wild image almost perfectly however the concern of privacy issues raise rapidly since mainstream research results are powered by tons of web crawled data which faces the privacy invasion issue the community tries to escape this predicament completely by training the face recognition model with synthetic data but faces severe domain gap issues which still need to access real images and identity labels to fine tune the model in this paper we propose sasmu a simple novel and effective method for face recognition using a synthetic dataset our proposed method consists of spatial data augmentation sa and spectrum mixup smu we first analyze the existing synthetic datasets for developing a face recognition system then we reveal that heavy data augmentation is helpful for boosting performance when using synthetic data by analyzing the previous frequency mixup studies we proposed a novel method for domain generalization extensive experimental results have demonstrated the effectiveness of sasmu achieving state of the art performance on several common benchmarks such as lfw agedb ca lfw cfp fp and cp lfw keyword raw image there is no result
1
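The group channel pruning idea summarized in the abstracts above (score channels by importance, prune each group of layers with its own threshold) can be sketched in plain Python; the scoring rule (L1 norm) and the group names are illustrative assumptions, not the paper's exact method:

```python
def channel_scores(weights):
    """L1 norm of each output channel; `weights` is a nested list shaped
    (out_channels, in_channels, kernel_h, kernel_w)."""
    return [sum(abs(v) for plane in ch for row in plane for v in row)
            for ch in weights]

def prune_channels(weights, keep_ratio):
    """Keep the `keep_ratio` strongest channels by L1 norm -- a simplified,
    static stand-in for the paper's variable sparse rate."""
    scores = channel_scores(weights)
    n_keep = max(1, round(keep_ratio * len(scores)))
    keep = sorted(sorted(range(len(scores)), key=scores.__getitem__)[-n_keep:])
    return [weights[i] for i in keep], keep

def group_prune(groups, ratios):
    """Apply a different keep ratio to each group of layers, mirroring the
    per-group thresholds of group channel pruning (group names hypothetical)."""
    return {name: prune_channels(w, ratios[name]) for name, w in groups.items()}
```

For example, pruning a 4-channel layer with `keep_ratio=0.5` keeps the two channels with the largest L1 norms.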
4,363
7,260,514,426
IssuesEvent
2018-02-18 10:53:32
qgis/QGIS-Documentation
https://api.github.com/repos/qgis/QGIS-Documentation
closed
[FEATURE][processing] Extract nodes algorithm now saves node index, distance along line and angle at node
Automatic new feature Processing
Original commit: https://github.com/qgis/QGIS/commit/9fa4e776dbf2ba5004583be1bcb0e8eb8c343301 by nyalldawson Also correctly handles null geometries
1.0
[FEATURE][processing] Extract nodes algorithm now saves node index, distance along line and angle at node - Original commit: https://github.com/qgis/QGIS/commit/9fa4e776dbf2ba5004583be1bcb0e8eb8c343301 by nyalldawson Also correctly handles null geometries
process
extract nodes algorithm now saves node index distance along line and angle at node original commit by nyalldawson also correctly handles null geometries
1
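The three per-node attributes named in this changelog entry (node index, distance along the line, angle at the node) can be computed for a simple polyline as follows; this is an illustrative sketch, not the QGIS C++ implementation, and the interior-angle convention (mean of the incoming and outgoing compass bearings) is an assumption:

```python
import math

def extract_nodes(line):
    """For each vertex of a polyline [(x, y), ...] with at least two
    vertices, return (index, distance along the line from the start,
    angle at the vertex in degrees). Endpoints take the bearing of their
    single adjacent segment; interior vertices average the incoming and
    outgoing bearings (one plausible convention)."""
    def bearing(a, b):
        # Compass-style bearing: 0 deg = +y axis, measured clockwise.
        return math.degrees(math.atan2(b[0] - a[0], b[1] - a[1])) % 360.0

    nodes, dist = [], 0.0
    for i, p in enumerate(line):
        if i > 0:
            dist += math.hypot(p[0] - line[i - 1][0], p[1] - line[i - 1][1])
        if i == 0:
            ang = bearing(line[0], line[1])
        elif i == len(line) - 1:
            ang = bearing(line[-2], line[-1])
        else:
            ang = (bearing(line[i - 1], p) + bearing(p, line[i + 1])) / 2.0
        nodes.append((i, dist, ang))
    return nodes
```

Running it on an L-shaped line like `[(0, 0), (0, 1), (1, 1)]` yields cumulative distances 0, 1, 2 and angles 0, 45, and 90 degrees.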
693,078
23,762,230,378
IssuesEvent
2022-09-01 09:48:52
Toma400/The_Isle_of_Ansur
https://api.github.com/repos/Toma400/The_Isle_of_Ansur
opened
Visual HUD
feature request high priority bulk topic
This topic is to organise all changes that should come with visual HUD, be it in upcoming pre-alpha 3 update or later. Most important: - Graphical menu, settings and character creation (#16 connected) - Code changes allowing for adding assets to modules (so they can be used in character creation, both in choosing class/race, as well as for avatar presets) Big priority: - Characters will be able to have avatar, either chosen from presets or custom one - Music addition* *not graphical, but switching from terminal adds also huge possibilities for music addition
1.0
Visual HUD - This topic is to organise all changes that should come with visual HUD, be it in upcoming pre-alpha 3 update or later. Most important: - Graphical menu, settings and character creation (#16 connected) - Code changes allowing for adding assets to modules (so they can be used in character creation, both in choosing class/race, as well as for avatar presets) Big priority: - Characters will be able to have avatar, either chosen from presets or custom one - Music addition* *not graphical, but switching from terminal adds also huge possibilities for music addition
non_process
visual hud this topic is to organise all changes that should come with visual hud be it in upcoming pre alpha update or later most important graphical menu settings and character creation connected code changes allowing for adding assets to modules so they can be used in character creation both in choosing class race as well as for avatar presets big priority characters will be able to have avatar either chosen from presets or custom one music addition not graphical but switching from terminal adds also huge possibilities for music addition
0
80,308
15,586,276,160
IssuesEvent
2021-03-18 01:34:18
venkateshreddypala/aircraft
https://api.github.com/repos/venkateshreddypala/aircraft
opened
CVE-2021-25329 (High) detected in tomcat-embed-core-8.5.28.jar
security vulnerability
## CVE-2021-25329 - High Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>tomcat-embed-core-8.5.28.jar</b></p></summary> <p>Core Tomcat implementation</p> <p>Library home page: <a href="http://tomcat.apache.org/">http://tomcat.apache.org/</a></p> <p>Path to dependency file: /aircraft/pom.xml</p> <p>Path to vulnerable library: /root/.m2/repository/org/apache/tomcat/embed/tomcat-embed-core/8.5.28/tomcat-embed-core-8.5.28.jar</p> <p> Dependency Hierarchy: - spring-cloud-config-server-2.0.0.M9.jar (Root Library) - spring-boot-starter-web-2.0.0.RELEASE.jar - spring-boot-starter-tomcat-2.0.0.RELEASE.jar - :x: **tomcat-embed-core-8.5.28.jar** (Vulnerable Library) </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary> <p> The fix for CVE-2020-9484 was incomplete. When using Apache Tomcat 10.0.0-M1 to 10.0.0, 9.0.0.M1 to 9.0.41, 8.5.0 to 8.5.61 or 7.0.0. to 7.0.107 with a configuration edge case that was highly unlikely to be used, the Tomcat instance was still vulnerable to CVE-2020-9494. Note that both the previously published prerequisites for CVE-2020-9484 and the previously published mitigations for CVE-2020-9484 also apply to this issue. 
<p>Publish Date: 2021-03-01 <p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2021-25329>CVE-2021-25329</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>7.0</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Local - Attack Complexity: High - Privileges Required: Low - User Interaction: None - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: High - Integrity Impact: High - Availability Impact: High </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://lists.apache.org/thread.html/rfe62fbf9d4c314f166fe8c668e50e5d9dd882a99447f26f0367474bf%40%3Cannounce.tomcat.apache.org%3E">https://lists.apache.org/thread.html/rfe62fbf9d4c314f166fe8c668e50e5d9dd882a99447f26f0367474bf%40%3Cannounce.tomcat.apache.org%3E</a></p> <p>Release Date: 2021-03-01</p> <p>Fix Resolution: org.apache.tomcat:tomcat:7.0.108, org.apache.tomcat:tomcat:8.5.63, org.apache.tomcat:tomcat:9.0.43,org.apache.tomcat:tomcat:10.0.2</p> </p> </details> <p></p> *** Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
True
CVE-2021-25329 (High) detected in tomcat-embed-core-8.5.28.jar - ## CVE-2021-25329 - High Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>tomcat-embed-core-8.5.28.jar</b></p></summary> <p>Core Tomcat implementation</p> <p>Library home page: <a href="http://tomcat.apache.org/">http://tomcat.apache.org/</a></p> <p>Path to dependency file: /aircraft/pom.xml</p> <p>Path to vulnerable library: /root/.m2/repository/org/apache/tomcat/embed/tomcat-embed-core/8.5.28/tomcat-embed-core-8.5.28.jar</p> <p> Dependency Hierarchy: - spring-cloud-config-server-2.0.0.M9.jar (Root Library) - spring-boot-starter-web-2.0.0.RELEASE.jar - spring-boot-starter-tomcat-2.0.0.RELEASE.jar - :x: **tomcat-embed-core-8.5.28.jar** (Vulnerable Library) </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary> <p> The fix for CVE-2020-9484 was incomplete. When using Apache Tomcat 10.0.0-M1 to 10.0.0, 9.0.0.M1 to 9.0.41, 8.5.0 to 8.5.61 or 7.0.0. to 7.0.107 with a configuration edge case that was highly unlikely to be used, the Tomcat instance was still vulnerable to CVE-2020-9494. Note that both the previously published prerequisites for CVE-2020-9484 and the previously published mitigations for CVE-2020-9484 also apply to this issue. 
<p>Publish Date: 2021-03-01 <p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2021-25329>CVE-2021-25329</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>7.0</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Local - Attack Complexity: High - Privileges Required: Low - User Interaction: None - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: High - Integrity Impact: High - Availability Impact: High </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://lists.apache.org/thread.html/rfe62fbf9d4c314f166fe8c668e50e5d9dd882a99447f26f0367474bf%40%3Cannounce.tomcat.apache.org%3E">https://lists.apache.org/thread.html/rfe62fbf9d4c314f166fe8c668e50e5d9dd882a99447f26f0367474bf%40%3Cannounce.tomcat.apache.org%3E</a></p> <p>Release Date: 2021-03-01</p> <p>Fix Resolution: org.apache.tomcat:tomcat:7.0.108, org.apache.tomcat:tomcat:8.5.63, org.apache.tomcat:tomcat:9.0.43,org.apache.tomcat:tomcat:10.0.2</p> </p> </details> <p></p> *** Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
non_process
cve high detected in tomcat embed core jar cve high severity vulnerability vulnerable library tomcat embed core jar core tomcat implementation library home page a href path to dependency file aircraft pom xml path to vulnerable library root repository org apache tomcat embed tomcat embed core tomcat embed core jar dependency hierarchy spring cloud config server jar root library spring boot starter web release jar spring boot starter tomcat release jar x tomcat embed core jar vulnerable library vulnerability details the fix for cve was incomplete when using apache tomcat to to to or to with a configuration edge case that was highly unlikely to be used the tomcat instance was still vulnerable to cve note that both the previously published prerequisites for cve and the previously published mitigations for cve also apply to this issue publish date url a href cvss score details base score metrics exploitability metrics attack vector local attack complexity high privileges required low user interaction none scope unchanged impact metrics confidentiality impact high integrity impact high availability impact high for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution org apache tomcat tomcat org apache tomcat tomcat org apache tomcat tomcat org apache tomcat tomcat step up your open source security game with whitesource
0
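The vulnerable version ranges quoted in the advisory above can be checked mechanically; a minimal sketch, assuming plain numeric versions (milestone suffixes such as 9.0.0.M1 are ignored here for simplicity):

```python
def parse_ver(s):
    """'8.5.28' -> (8, 5, 28) so versions compare tuple-wise."""
    return tuple(int(p) for p in s.split("."))

# Inclusive (low, high) ranges taken from the advisory text above.
VULNERABLE = [("7.0.0", "7.0.107"), ("8.5.0", "8.5.61"),
              ("9.0.0", "9.0.41"), ("10.0.0", "10.0.0")]

def is_vulnerable(version):
    """True if `version` falls in any range affected by CVE-2021-25329."""
    v = parse_ver(version)
    return any(parse_ver(lo) <= v <= parse_ver(hi) for lo, hi in VULNERABLE)
```

The bundled 8.5.28 falls inside the 8.5.0-8.5.61 range, while the fixed 8.5.63 does not, matching the suggested upgrade.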
10,052
13,044,161,666
IssuesEvent
2020-07-29 03:47:25
tikv/tikv
https://api.github.com/repos/tikv/tikv
closed
UCP: Migrate scalar function `SubDateIntReal` from TiDB
challenge-program-2 component/coprocessor difficulty/easy sig/coprocessor
## Description Port the scalar function `SubDateIntReal` from TiDB to coprocessor. ## Score * 50 ## Mentor(s) * @breeswish ## Recommended Skills * Rust programming ## Learning Materials Already implemented expressions ported from TiDB - https://github.com/tikv/tikv/tree/master/components/tidb_query/src/rpn_expr) - https://github.com/tikv/tikv/tree/master/components/tidb_query/src/expr)
2.0
UCP: Migrate scalar function `SubDateIntReal` from TiDB - ## Description Port the scalar function `SubDateIntReal` from TiDB to coprocessor. ## Score * 50 ## Mentor(s) * @breeswish ## Recommended Skills * Rust programming ## Learning Materials Already implemented expressions ported from TiDB - https://github.com/tikv/tikv/tree/master/components/tidb_query/src/rpn_expr) - https://github.com/tikv/tikv/tree/master/components/tidb_query/src/expr)
process
ucp migrate scalar function subdateintreal from tidb description port the scalar function subdateintreal from tidb to coprocessor score mentor s breeswish recommended skills rust programming learning materials already implemented expressions ported from tidb
1
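The function being ported, `SubDateIntReal`, roughly means "subtract a REAL-valued day interval from an INT-encoded date"; here is a simplified Python sketch under that assumption (the rounding rule and the day-only unit are illustrative choices, not TiDB's exact semantics):

```python
from datetime import date, timedelta

def sub_date_int_real(date_int, interval_days):
    """Subtract a real number of days from a date encoded as the integer
    yyyymmdd, returning the result in the same encoding. Fractional days
    are rounded to the nearest whole day here; TiDB's real rounding and
    unit handling may differ."""
    y, m, d = date_int // 10000, (date_int // 100) % 100, date_int % 100
    result = date(y, m, d) - timedelta(days=round(interval_days))
    return result.year * 10000 + result.month * 100 + result.day

print(sub_date_int_real(20200301, 1.0))  # 20200229 (2020 is a leap year)
```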
515,292
14,959,172,687
IssuesEvent
2021-01-27 02:32:01
zephyrproject-rtos/zephyr
https://api.github.com/repos/zephyrproject-rtos/zephyr
opened
intel_adsp_cavs15: Cannot download firmware of kernel testcases
bug priority: high
**Describe the bug** Use west to flash zephyr image on ADSP failed **To Reproduce** Steps to reproduce the behavior: 1. west build -b intel_adsp_cavs15 tests/kernel/thread 2. west sign -t rimage -p ../modules/audio/sof/zephyr/ext/rimage/build/rimage -D ../modules/audio/sof/zephyr/ext/rimage/config/ -- -k ../modules/audio/sof/keys/otc_private_key.pem 3. /home/ztest/work/zephyrproject/zephyr/boards/xtensa/intel_adsp_cavs15/tools/fw_loader.py -f /home/ztest/work/zephyrproject/zephyr/build/zephyr/zephyr.ri **Expected behavior** Firmware load successful **Logs and console output** Start firmware downloading... Open HDA device: /dev/hda Reset DSP... Firmware Status Register (0xFFFFFFFF) Boot: 0xFFFFFF (UNKNOWN) Wait: 0x0F (UNKNOWN) Module: 0x07 Error: 0x01 IPC CMD : 0xFFFFFFFF IPC LEN : 0xFFFFFFFF Booting up DSP... Firmware Status Register (0x05000001) Boot: 0x000001 (INIT_DONE) Wait: 0x05 (DMA_BUFFER_FULL) Module: 0x00 Error: 0x00 Downloading firmware... Traceback (most recent call last): File "/home/ztest/work/zephyrproject/zephyr/boards/xtensa/intel_adsp_cavs15/tools/fw_loader.py", line 81, in <module> main() File "/home/ztest/work/zephyrproject/zephyr/boards/xtensa/intel_adsp_cavs15/tools/fw_loader.py", line 63, in main fw_loader.download_firmware(args.firmware) File "/home/ztest/work/zephyrproject/zephyr/boards/xtensa/intel_adsp_cavs15/tools/lib/loader.py", line 187, in download_firmware sd = self.load_firmware(fw_file) File "/home/ztest/work/zephyrproject/zephyr/boards/xtensa/intel_adsp_cavs15/tools/lib/loader.py", line 174, in load_firmware sd.buf.copy(data, len(data)) File "/home/ztest/work/zephyrproject/zephyr/boards/xtensa/intel_adsp_cavs15/tools/lib/stream_desc.py", line 41, in copy self.buf[:] = data[:] **ValueError: Can only assign sequence of same size** **Environment** (please complete the following information): OS: Fedora28 Toolchain: Zephyr-sdk-0.12.1 Commit ID: fc03bd2b863d543bd9
1.0
intel_adsp_cavs15: Cannot download firmware of kernel testcases - **Describe the bug** Use west to flash zephyr image on ADSP failed **To Reproduce** Steps to reproduce the behavior: 1. west build -b intel_adsp_cavs15 tests/kernel/thread 2. west sign -t rimage -p ../modules/audio/sof/zephyr/ext/rimage/build/rimage -D ../modules/audio/sof/zephyr/ext/rimage/config/ -- -k ../modules/audio/sof/keys/otc_private_key.pem 3. /home/ztest/work/zephyrproject/zephyr/boards/xtensa/intel_adsp_cavs15/tools/fw_loader.py -f /home/ztest/work/zephyrproject/zephyr/build/zephyr/zephyr.ri **Expected behavior** Firmware load successful **Logs and console output** Start firmware downloading... Open HDA device: /dev/hda Reset DSP... Firmware Status Register (0xFFFFFFFF) Boot: 0xFFFFFF (UNKNOWN) Wait: 0x0F (UNKNOWN) Module: 0x07 Error: 0x01 IPC CMD : 0xFFFFFFFF IPC LEN : 0xFFFFFFFF Booting up DSP... Firmware Status Register (0x05000001) Boot: 0x000001 (INIT_DONE) Wait: 0x05 (DMA_BUFFER_FULL) Module: 0x00 Error: 0x00 Downloading firmware... Traceback (most recent call last): File "/home/ztest/work/zephyrproject/zephyr/boards/xtensa/intel_adsp_cavs15/tools/fw_loader.py", line 81, in <module> main() File "/home/ztest/work/zephyrproject/zephyr/boards/xtensa/intel_adsp_cavs15/tools/fw_loader.py", line 63, in main fw_loader.download_firmware(args.firmware) File "/home/ztest/work/zephyrproject/zephyr/boards/xtensa/intel_adsp_cavs15/tools/lib/loader.py", line 187, in download_firmware sd = self.load_firmware(fw_file) File "/home/ztest/work/zephyrproject/zephyr/boards/xtensa/intel_adsp_cavs15/tools/lib/loader.py", line 174, in load_firmware sd.buf.copy(data, len(data)) File "/home/ztest/work/zephyrproject/zephyr/boards/xtensa/intel_adsp_cavs15/tools/lib/stream_desc.py", line 41, in copy self.buf[:] = data[:] **ValueError: Can only assign sequence of same size** **Environment** (please complete the following information): OS: Fedora28 Toolchain: Zephyr-sdk-0.12.1 Commit ID: fc03bd2b863d543bd9
non_process
intel adsp cannot download firmware of kernel testcases describe the bug use west to flash zephyr image on adsp failed to reproduce steps to reproduce the behavior west build b intel adsp tests kernel thread west sign t rimage p modules audio sof zephyr ext rimage build rimage d modules audio sof zephyr ext rimage config k modules audio sof keys otc private key pem home ztest work zephyrproject zephyr boards xtensa intel adsp tools fw loader py f home ztest work zephyrproject zephyr build zephyr zephyr ri expected behavior firmware load successful logs and console output start firmware downloading open hda device dev hda reset dsp firmware status register boot unknown wait unknown module error ipc cmd ipc len booting up dsp firmware status register boot init done wait dma buffer full module error downloading firmware traceback most recent call last file home ztest work zephyrproject zephyr boards xtensa intel adsp tools fw loader py line in main file home ztest work zephyrproject zephyr boards xtensa intel adsp tools fw loader py line in main fw loader download firmware args firmware file home ztest work zephyrproject zephyr boards xtensa intel adsp tools lib loader py line in download firmware sd self load firmware fw file file home ztest work zephyrproject zephyr boards xtensa intel adsp tools lib loader py line in load firmware sd buf copy data len data file home ztest work zephyrproject zephyr boards xtensa intel adsp tools lib stream desc py line in copy self buf data valueerror can only assign sequence of same size environment please complete the following information os toolchain zephyr sdk commit id
0
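The final `ValueError` in the trace above is the generic failure mode of whole-slice assignment into a fixed-size buffer when the firmware image and the stream buffer differ in length. A minimal reproduction and a defensive copy follow; the loader's actual buffer type is not shown in the trace, so a `memoryview` over a `bytearray` stands in for it:

```python
buf = memoryview(bytearray(8))        # fixed-size stand-in for the DMA stream buffer
try:
    buf[:] = b"\x01\x02\x03"          # whole-slice assignment: sizes must match
    ok = True
except ValueError:
    ok = False                        # the loader's failure mode

def safe_copy(dst, data):
    """Copy only as many bytes as both sides can hold and report the count,
    instead of assuming len(dst) == len(data)."""
    n = min(len(dst), len(data))
    dst[:n] = bytes(data[:n])
    return n
```

With this guard the loader could copy the firmware chunk by chunk instead of crashing on a size mismatch.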
814
3,288,421,409
IssuesEvent
2015-10-29 15:07:15
dita-ot/dita-ot
https://api.github.com/repos/dita-ot/dita-ot
closed
Branch filtering and keyrefs [DOT 2.x]
bug preprocess/filtering won't fix
If in the DITA Map has in it a construct like: <topicref keyref="topicKey"> <ditavalref href="filter.ditaval"/> </topicref> the DITAVAL filter will not get applied on the referenced topic. From what I've seen the branch filtering stage seems to be applied before resolving keys and this might be the reason for it. If you want and if it is not a known issue I can try to provide a complete sample for this.
1.0
Branch filtering and keyrefs [DOT 2.x] - If in the DITA Map has in it a construct like: <topicref keyref="topicKey"> <ditavalref href="filter.ditaval"/> </topicref> the DITAVAL filter will not get applied on the referenced topic. From what I've seen the branch filtering stage seems to be applied before resolving keys and this might be the reason for it. If you want and if it is not a known issue I can try to provide a complete sample for this.
process
branch filtering and keyrefs if in the dita map has in it a construct like the ditaval filter will not get applied on the referenced topic from what i ve seen the branch filtering stage seems to be applied before resolving keys and this might be the reason for it if you want and if it is not a known issue i can try to provide a complete sample for this
1
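The ordering problem described in this report (branch filtering running before key resolution, so the ditavalref never reaches the key-referenced topic) can be modeled with a toy pipeline; the function names and dict shapes are hypothetical, not actual DITA-OT stages:

```python
def preprocess(topicref, keymap, ditaval, resolve_keys_first=True):
    """Toy model: if branch filtering runs before keys are resolved, a
    <topicref keyref="..."> has no href yet, so its ditavalref filter is
    never applied to the referenced topic."""
    steps = []

    def resolve(ref):
        if "keyref" in ref:
            ref = dict(ref, href=keymap[ref["keyref"]])
            steps.append("keyref-resolved")
        return ref

    def branch_filter(ref):
        # Filtering only fires once the reference points at a real topic.
        if "href" in ref and ref.get("ditavalref"):
            ref = dict(ref, filtered_by=ditaval)
            steps.append("filtered")
        return ref

    if resolve_keys_first:
        ref = branch_filter(resolve(topicref))
    else:
        ref = resolve(branch_filter(topicref))
    return ref, steps
```

Running the two orderings on the report's construct shows that only the keys-first pipeline ever applies the filter.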
16,708
21,868,742,673
IssuesEvent
2022-05-19 02:18:22
qgis/QGIS
https://api.github.com/repos/qgis/QGIS
closed
converttocurves algorithm crashes if fed already curved polygons
Processing Bug Crash/Data Corruption
### What is the bug or the crash? The native:converttocurves algorithm crashes if fed already curved polygons ### Steps to reproduce the issue 1. Create a curved feature on a curved polygon. 2. Run the convert to curves algorithm 3. See the crash Here's a minimal script that does exactly that ``` WKT = 'CurvePolygon (CompoundCurve (CircularString (2613627 1178798, 2613639 1178805, 2613648 1178794),(2613648 1178794, 2613627 1178798)))' # Create a feature f = QgsFeature() f.setGeometry(QgsGeometry.fromWkt(WKT)) # Add it to a memory layer vl = QgsVectorLayer("CurvePolygon?crs=epsg:2056", "temp", "memory") vl.dataProvider().addFeature(f) # Run the algorithm processing.runAndLoadResults( "native:converttocurves", {"INPUT": vl, "DISTANCE": 1e-6, "ANGLE": 1e-6, "OUTPUT": "memory:output"} ) ``` ### Versions Happens on both 3.22.5 and 3.24.1 ### Supported QGIS version - [X] I'm running a supported QGIS version according to the roadmap.
PyFunction_Vectorcall : PyEval_EvalFrameDefault : PyObject_GC_Del : PyEval_EvalCodeWithName : PyEval_EvalCodeEx : PyEval_EvalCode : PyEval_GetBuiltins : PyEval_GetBuiltins : PyEval_EvalFrameDefault : PyFloat_FromDouble : PyEval_EvalFrameDefault : PyObject_GC_Del : PyFloat_FromDouble : PyEval_EvalFrameDefault : PyObject_GC_Del : PyFloat_FromDouble : PyEval_EvalFrameDefault : PyFloat_FromDouble : PyEval_EvalFrameDefault : PyFloat_FromDouble : PyEval_EvalFrameDefault : PyFunction_Vectorcall : PyFloat_FromDouble : PyVectorcall_Call : PyObject_Call : PyInit__gui : QWidget::event : QFrame::event : QAbstractScrollArea::event : PyInit__gui : QApplicationPrivate::notify_helper : QApplication::notify : QgsApplication::notify qgsapplication.cpp:514 QCoreApplication::notifyInternal2 : QSizePolicy::QSizePolicy : QApplicationPrivate::notify_helper : QApplication::notify : QgsApplication::notify qgsapplication.cpp:514 QCoreApplication::notifyInternal2 : QGuiApplicationPrivate::processKeyEvent : QWindowSystemInterface::sendWindowSystemEvents : QEventDispatcherWin32::processEvents : qt_plugin_query_metadata : QEventLoop::exec : QCoreApplication::exec : main main.cpp:1667 BaseThreadInitThunk : RtlUserThreadStart : QGIS Info QGIS Version: 3.22.5-Bia?owie?a QGIS code revision: c27231782f Compiled against Qt: 5.15.2 Running against Qt: 5.15.2 Compiled against GDAL: 3.4.2 Running against GDAL: 3.4.2 System Info CPU Type: x86_64 Kernel Type: winnt Kernel Version: 10.0.19043 ``` Stack trace (3.24) ``` Python Stack Trace Windows fatal exception: access violation Current thread 0x00003f48 (most recent call first): File "C:\OSGeo4W/apps/qgis/./python/plugins\processing\gui\AlgorithmExecutor.py", line 72 in execute results, ok = alg.run(parameters, context, feedback, {}, False) File "C:\OSGeo4W/apps/qgis/./python/plugins\processing\core\Processing.py", line 187 in runAlgorithm ret, results = execute(alg, parameters, context, feedback, catch_exceptions=False) File 
"C:\OSGeo4W/apps/qgis/./python/plugins\processing\tools\general.py", line 151 in runAndLoadResults return Processing.runAlgorithm(alg, parameters=parameters, onFinish=handleAlgorithmResults, feedback=feedback, File "C:\OSGeo4W\apps\Python39\lib\code.py", line 90 in runcode exec(code, self.locals) File "C:\OSGeo4W\apps\Python39\lib\code.py", line 74 in runsource self.runcode(code) File "C:\OSGeo4W/apps/qgis/./python\console\console_sci.py", line 559 in runsource return super(ShellScintilla, self).runsource(source, filename, symbol) File "C:\OSGeo4W/apps/qgis/./python\console\console_sci.py", line 526 in runCommand more = self.runsource(src) File "C:\OSGeo4W/apps/qgis/./python\console\console_sci.py", line 506 in entered self.runCommand(self.text()) File "C:\OSGeo4W/apps/qgis/./python\console\console_sci.py", line 344 in keyPressEvent self.entered() Stack Trace No stack trace is available. QGIS Info QGIS Version: 3.24.1-Tisler QGIS code revision: 5709b82461 Compiled against Qt: 5.15.2 Running against Qt: 5.15.2 Compiled against GDAL: 3.4.2 Running against GDAL: 3.4.2 System Info CPU Type: x86_64 Kernel Type: winnt Kernel Version: 10.0.19043 ```
1.0
converttocurves algorithm crashes if fed already curved polygons - ### What is the bug or the crash? The native:converttocurves algorithm crashes if fed already curved polygons ### Steps to reproduce the issue 1. Create a curved feature on a curved polygon. 2. Run the convert to curves algorithm 3. See the crash Here's a minimal script that does exatly that ``` WKT = 'CurvePolygon (CompoundCurve (CircularString (2613627 1178798, 2613639 1178805, 2613648 1178794),(2613648 1178794, 2613627 1178798)))' # Create a feature f = QgsFeature() f.setGeometry(QgsGeometry.fromWkt(WKT)) # Add it to a memory layer vl = QgsVectorLayer("CurvePolygon?crs=epsg:2056", "temp", "memory") vl.dataProvider().addFeature(f) # Run the algorithm processing.runAndLoadResults( "native:converttocurves", {"INPUT": vl, "DISTANCE": 1e-6, "ANGLE": 1e-6, "OUTPUT": "memory:output"} ) ``` ### Versions Happens on both 3.22.5 and 3.24.1 ### Supported QGIS version - [X] I'm running a supported QGIS version according to the roadmap. 
### New profile - [X] I tried with a new QGIS profile ### Additional context Stack trace (3.22) ``` Crash ID: [ffc5f2890057b6ef24820e31ef351f62da962daa](https://github.com/qgis/QGIS/search?q=ffc5f2890057b6ef24820e31ef351f62da962daa&type=Issues) Stack Trace QgsLineString::pointN qgslinestring.cpp:974 lineToCurve qgsinternalgeometryengine.cpp:1399 convertGeometryToCurves qgsinternalgeometryengine.cpp:1531 QgsInternalGeometryEngine::convertToCurves qgsinternalgeometryengine.cpp:1575 QgsGeometry::convertToCurves qgsgeometry.cpp:2250 QgsConvertToCurvesAlgorithm::processFeature qgsalgorithmconverttocurves.cpp:115 QgsProcessingFeatureBasedAlgorithm::processAlgorithm qgsprocessingalgorithm.cpp:1039 QgsProcessingAlgorithm::runPrepared qgsprocessingalgorithm.cpp:533 QgsProcessingAlgorithm::run qgsprocessingalgorithm.cpp:462 PyInit__core : PyArg_ParseTuple_SizeT : PyEval_EvalFrameDefault : PyObject_GC_Del : PyFunction_Vectorcall : PyEval_EvalFrameDefault : PyObject_GC_Del : PyFunction_Vectorcall : PyEval_EvalFrameDefault : PyObject_GC_Del : PyFunction_Vectorcall : PyEval_EvalFrameDefault : PyObject_GC_Del : PyEval_EvalCodeWithName : PyEval_EvalCodeEx : PyEval_EvalCode : PyEval_GetBuiltins : PyEval_GetBuiltins : PyEval_EvalFrameDefault : PyFloat_FromDouble : PyEval_EvalFrameDefault : PyObject_GC_Del : PyFloat_FromDouble : PyEval_EvalFrameDefault : PyObject_GC_Del : PyFloat_FromDouble : PyEval_EvalFrameDefault : PyFloat_FromDouble : PyEval_EvalFrameDefault : PyFloat_FromDouble : PyEval_EvalFrameDefault : PyFunction_Vectorcall : PyFloat_FromDouble : PyVectorcall_Call : PyObject_Call : PyInit__gui : QWidget::event : QFrame::event : QAbstractScrollArea::event : PyInit__gui : QApplicationPrivate::notify_helper : QApplication::notify : QgsApplication::notify qgsapplication.cpp:514 QCoreApplication::notifyInternal2 : QSizePolicy::QSizePolicy : QApplicationPrivate::notify_helper : QApplication::notify : QgsApplication::notify qgsapplication.cpp:514 QCoreApplication::notifyInternal2 : 
QGuiApplicationPrivate::processKeyEvent : QWindowSystemInterface::sendWindowSystemEvents : QEventDispatcherWin32::processEvents : qt_plugin_query_metadata : QEventLoop::exec : QCoreApplication::exec : main main.cpp:1667 BaseThreadInitThunk : RtlUserThreadStart : QGIS Info QGIS Version: 3.22.5-Białowieża QGIS code revision: c27231782f Compiled against Qt: 5.15.2 Running against Qt: 5.15.2 Compiled against GDAL: 3.4.2 Running against GDAL: 3.4.2 System Info CPU Type: x86_64 Kernel Type: winnt Kernel Version: 10.0.19043 ``` Stack trace (3.24) ``` Python Stack Trace Windows fatal exception: access violation Current thread 0x00003f48 (most recent call first): File "C:\OSGeo4W/apps/qgis/./python/plugins\processing\gui\AlgorithmExecutor.py", line 72 in execute results, ok = alg.run(parameters, context, feedback, {}, False) File "C:\OSGeo4W/apps/qgis/./python/plugins\processing\core\Processing.py", line 187 in runAlgorithm ret, results = execute(alg, parameters, context, feedback, catch_exceptions=False) File "C:\OSGeo4W/apps/qgis/./python/plugins\processing\tools\general.py", line 151 in runAndLoadResults return Processing.runAlgorithm(alg, parameters=parameters, onFinish=handleAlgorithmResults, feedback=feedback, File "C:\OSGeo4W\apps\Python39\lib\code.py", line 90 in runcode exec(code, self.locals) File "C:\OSGeo4W\apps\Python39\lib\code.py", line 74 in runsource self.runcode(code) File "C:\OSGeo4W/apps/qgis/./python\console\console_sci.py", line 559 in runsource return super(ShellScintilla, self).runsource(source, filename, symbol) File "C:\OSGeo4W/apps/qgis/./python\console\console_sci.py", line 526 in runCommand more = self.runsource(src) File "C:\OSGeo4W/apps/qgis/./python\console\console_sci.py", line 506 in entered self.runCommand(self.text()) File "C:\OSGeo4W/apps/qgis/./python\console\console_sci.py", line 344 in keyPressEvent self.entered() Stack Trace No stack trace is available. 
QGIS Info QGIS Version: 3.24.1-Tisler QGIS code revision: 5709b82461 Compiled against Qt: 5.15.2 Running against Qt: 5.15.2 Compiled against GDAL: 3.4.2 Running against GDAL: 3.4.2 System Info CPU Type: x86_64 Kernel Type: winnt Kernel Version: 10.0.19043 ```
process
converttocurves algorithm crashes if fed already curved polygons what is the bug or the crash the native converttocurves algorithm crashes if fed already curved polygons steps to reproduce the issue create a curved feature on a curved polygon run the convert to curves algorithm see the crash here s a minimal script that does exatly that wkt curvepolygon compoundcurve circularstring create a feature f qgsfeature f setgeometry qgsgeometry fromwkt wkt add it to a memory layer vl qgsvectorlayer curvepolygon crs epsg temp memory vl dataprovider addfeature f run the algorithm processing runandloadresults native converttocurves input vl distance angle output memory output versions happens on both and supported qgis version i m running a supported qgis version according to the roadmap new profile i tried with a new qgis profile additional context stack trace crash id stack trace qgslinestring pointn qgslinestring cpp linetocurve qgsinternalgeometryengine cpp convertgeometrytocurves qgsinternalgeometryengine cpp qgsinternalgeometryengine converttocurves qgsinternalgeometryengine cpp qgsgeometry converttocurves qgsgeometry cpp qgsconverttocurvesalgorithm processfeature qgsalgorithmconverttocurves cpp qgsprocessingfeaturebasedalgorithm processalgorithm qgsprocessingalgorithm cpp qgsprocessingalgorithm runprepared qgsprocessingalgorithm cpp qgsprocessingalgorithm run qgsprocessingalgorithm cpp pyinit core pyarg parsetuple sizet pyeval evalframedefault pyobject gc del pyfunction vectorcall pyeval evalframedefault pyobject gc del pyfunction vectorcall pyeval evalframedefault pyobject gc del pyfunction vectorcall pyeval evalframedefault pyobject gc del pyeval evalcodewithname pyeval evalcodeex pyeval evalcode pyeval getbuiltins pyeval getbuiltins pyeval evalframedefault pyfloat fromdouble pyeval evalframedefault pyobject gc del pyfloat fromdouble pyeval evalframedefault pyobject gc del pyfloat fromdouble pyeval evalframedefault pyfloat fromdouble pyeval evalframedefault pyfloat 
fromdouble pyeval evalframedefault pyfunction vectorcall pyfloat fromdouble pyvectorcall call pyobject call pyinit gui qwidget event qframe event qabstractscrollarea event pyinit gui qapplicationprivate notify helper qapplication notify qgsapplication notify qgsapplication cpp qcoreapplication qsizepolicy qsizepolicy qapplicationprivate notify helper qapplication notify qgsapplication notify qgsapplication cpp qcoreapplication qguiapplicationprivate processkeyevent qwindowsysteminterface sendwindowsystemevents processevents qt plugin query metadata qeventloop exec qcoreapplication exec main main cpp basethreadinitthunk rtluserthreadstart qgis info qgis version bia owie a qgis code revision compiled against qt running against qt compiled against gdal running against gdal system info cpu type kernel type winnt kernel version stack trace python stack trace windows fatal exception access violation current thread most recent call first file c apps qgis python plugins processing gui algorithmexecutor py line in execute results ok alg run parameters context feedback false file c apps qgis python plugins processing core processing py line in runalgorithm ret results execute alg parameters context feedback catch exceptions false file c apps qgis python plugins processing tools general py line in runandloadresults return processing runalgorithm alg parameters parameters onfinish handlealgorithmresults feedback feedback file c apps lib code py line in runcode exec code self locals file c apps lib code py line in runsource self runcode code file c apps qgis python console console sci py line in runsource return super shellscintilla self runsource source filename symbol file c apps qgis python console console sci py line in runcommand more self runsource src file c apps qgis python console console sci py line in entered self runcommand self text file c apps qgis python console console sci py line in keypressevent self entered stack trace no stack trace is available qgis info 
qgis version tisler qgis code revision compiled against qt running against qt compiled against gdal running against gdal system info cpu type kernel type winnt kernel version
1
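The QGIS crash record above involves `native:converttocurves`, which tries to replace runs of segmentized vertices with CircularString arcs. As background for readers of this record: the core geometric primitive of such arc detection is fitting a circle through three points, since a CircularString segment is defined by a start, intermediate, and end point lying on one circle. The sketch below illustrates only that primitive; it is not the QGIS `lineToCurve` implementation, and the function name is made up for this example.

```python
# Sketch of the geometric primitive behind "convert to curves"
# style arc detection: find the circle through three points.
# Collinear points have no circumscribed circle and stay a
# straight segment. This is an illustration, not QGIS code.
def circumcircle(p1, p2, p3):
    """Return (center, radius) of the circle through three points,
    or None if the points are (near-)collinear."""
    (x1, y1), (x2, y2), (x3, y3) = p1, p2, p3
    d = 2 * (x1 * (y2 - y3) + x2 * (y3 - y1) + x3 * (y1 - y2))
    if abs(d) < 1e-12:
        return None  # collinear: no arc can be fitted
    ux = ((x1**2 + y1**2) * (y2 - y3) + (x2**2 + y2**2) * (y3 - y1)
          + (x3**2 + y3**2) * (y1 - y2)) / d
    uy = ((x1**2 + y1**2) * (x3 - x2) + (x2**2 + y2**2) * (x1 - x3)
          + (x3**2 + y3**2) * (x2 - x1)) / d
    r = ((x1 - ux) ** 2 + (y1 - uy) ** 2) ** 0.5
    return (ux, uy), r

# The arc through (0,0), (1,1), (2,0) is centered at (1,0), radius 1.
print(circumcircle((0, 0), (1, 1), (2, 0)))
```

A real converter would additionally check that *consecutive* point triples agree on the same center and radius within a tolerance before merging them into one arc.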
100,999
16,490,736,222
IssuesEvent
2021-05-25 03:10:09
valdisiljuconoks/AlloyTech
https://api.github.com/repos/valdisiljuconoks/AlloyTech
opened
CVE-2019-13173 (High) detected in fstream-0.1.31.tgz
security vulnerability
## CVE-2019-13173 - High Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>fstream-0.1.31.tgz</b></p></summary> <p>Advanced file system stream things</p> <p>Library home page: <a href="https://registry.npmjs.org/fstream/-/fstream-0.1.31.tgz">https://registry.npmjs.org/fstream/-/fstream-0.1.31.tgz</a></p> <p>Path to dependency file: AlloyTech/AlloyTechEpi10/modules/_protected/Shell/Shell/10.1.0.0/ClientResources/lib/xstyle/package.json</p> <p>Path to vulnerable library: AlloyTech/AlloyTechEpi10/modules/_protected/Shell/Shell/10.1.0.0/ClientResources/lib/xstyle/node_modules/fstream/package.json</p> <p> Dependency Hierarchy: - intern-geezer-2.2.3.tgz (Root Library) - digdug-1.4.0.tgz - decompress-0.2.3.tgz - tar-0.1.20.tgz - :x: **fstream-0.1.31.tgz** (Vulnerable Library) </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary> <p> fstream before 1.0.12 is vulnerable to Arbitrary File Overwrite. Extracting tarballs containing a hardlink to a file that already exists in the system, and a file that matches the hardlink, will overwrite the system's file with the contents of the extracted file. The fstream.DirWriter() function is vulnerable. 
<p>Publish Date: 2019-07-02 <p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2019-13173>CVE-2019-13173</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>7.5</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: Low - Privileges Required: None - User Interaction: None - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: None - Integrity Impact: High - Availability Impact: None </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2019-13173">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2019-13173</a></p> <p>Release Date: 2019-07-02</p> <p>Fix Resolution: 1.0.12</p> </p> </details> <p></p> *** Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
True
CVE-2019-13173 (High) detected in fstream-0.1.31.tgz - ## CVE-2019-13173 - High Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>fstream-0.1.31.tgz</b></p></summary> <p>Advanced file system stream things</p> <p>Library home page: <a href="https://registry.npmjs.org/fstream/-/fstream-0.1.31.tgz">https://registry.npmjs.org/fstream/-/fstream-0.1.31.tgz</a></p> <p>Path to dependency file: AlloyTech/AlloyTechEpi10/modules/_protected/Shell/Shell/10.1.0.0/ClientResources/lib/xstyle/package.json</p> <p>Path to vulnerable library: AlloyTech/AlloyTechEpi10/modules/_protected/Shell/Shell/10.1.0.0/ClientResources/lib/xstyle/node_modules/fstream/package.json</p> <p> Dependency Hierarchy: - intern-geezer-2.2.3.tgz (Root Library) - digdug-1.4.0.tgz - decompress-0.2.3.tgz - tar-0.1.20.tgz - :x: **fstream-0.1.31.tgz** (Vulnerable Library) </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary> <p> fstream before 1.0.12 is vulnerable to Arbitrary File Overwrite. Extracting tarballs containing a hardlink to a file that already exists in the system, and a file that matches the hardlink, will overwrite the system's file with the contents of the extracted file. The fstream.DirWriter() function is vulnerable. 
<p>Publish Date: 2019-07-02 <p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2019-13173>CVE-2019-13173</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>7.5</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: Low - Privileges Required: None - User Interaction: None - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: None - Integrity Impact: High - Availability Impact: None </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2019-13173">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2019-13173</a></p> <p>Release Date: 2019-07-02</p> <p>Fix Resolution: 1.0.12</p> </p> </details> <p></p> *** Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
non_process
cve high detected in fstream tgz cve high severity vulnerability vulnerable library fstream tgz advanced file system stream things library home page a href path to dependency file alloytech modules protected shell shell clientresources lib xstyle package json path to vulnerable library alloytech modules protected shell shell clientresources lib xstyle node modules fstream package json dependency hierarchy intern geezer tgz root library digdug tgz decompress tgz tar tgz x fstream tgz vulnerable library vulnerability details fstream before is vulnerable to arbitrary file overwrite extracting tarballs containing a hardlink to a file that already exists in the system and a file that matches the hardlink will overwrite the system s file with the contents of the extracted file the fstream dirwriter function is vulnerable publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction none scope unchanged impact metrics confidentiality impact none integrity impact high availability impact none for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution step up your open source security game with whitesource
0
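The fstream advisory in the record above describes extracting a tarball that contains a hardlink to an existing system file plus a file matching the hardlink, which overwrites the link target. As an illustration of the attack shape (unrelated to fstream's own Node.js code), the Python sketch below builds an in-memory archive with an absolute hardlink member and flags any link members whose target resolves outside the extraction root; the `unsafe_link_members` helper is hypothetical, not part of any library.

```python
# Pre-extraction check for the class of bug in the advisory above:
# hard/symlink members of a tar archive whose target escapes the
# destination directory. Illustration only; not fstream code.
import io
import os
import tarfile

def unsafe_link_members(tar, dest="."):
    """Return names of members whose link target resolves outside dest."""
    dest = os.path.realpath(dest)
    bad = []
    for m in tar.getmembers():
        if m.islnk() or m.issym():
            target = os.path.realpath(os.path.join(dest, m.linkname))
            if os.path.commonpath([dest, target]) != dest:
                bad.append(m.name)
    return bad

# Build an in-memory archive mimicking the attack: a hardlink
# member pointing at a file that already exists on the system.
buf = io.BytesIO()
with tarfile.open(fileobj=buf, mode="w") as tar:
    evil = tarfile.TarInfo("evil")
    evil.type = tarfile.LNKTYPE      # hardlink member
    evil.linkname = "/etc/passwd"    # target outside the dest dir
    tar.addfile(evil)

buf.seek(0)
with tarfile.open(fileobj=buf) as tar:
    print(unsafe_link_members(tar, dest="extract_dir"))  # → ['evil']
```

The fix in fstream 1.0.12 takes a different route inside `DirWriter()`, but the underlying rule is the same: never let an archive member create a link whose target the extractor did not choose.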
13,261
3,698,863,772
IssuesEvent
2016-02-28 16:30:16
tfg-2016-jjmr/tfg
https://api.github.com/repos/tfg-2016-jjmr/tfg
opened
Create analysis document of the sprint "Release 1"
documentation
Quote from the tutor: " Scope Initial planning with comments Final planning Analysis Deviations / causes Possible improvements. "
1.0
Create analysis document of the sprint "Release 1" - Quote from the tutor: " Scope Initial planning with comments Final planning Analysis Deviations / causes Possible improvements. "
non_process
create analysis document of the sprint release quote from the tutor scope initial planning with comments final planning analysis deviations causes possible improvements
0
4,910
7,785,666,008
IssuesEvent
2018-06-06 16:29:26
MattDeReno/SAE-KnowledgeHubs
https://api.github.com/repos/MattDeReno/SAE-KnowledgeHubs
opened
As a KH Manager, I have identified 3 part-time content curators
Editorial Process
I need content curators for the following content types: - [ ] News - [ ] Training, Events - [ ] Directory
1.0
As a KH Manager, I have identified 3 part-time content curators - I need content curators for the following content types: - [ ] News - [ ] Training, Events - [ ] Directory
process
as a kh manager i have identified part time content curators i need content curators for the following content types news training events directory
1
107,900
9,247,745,618
IssuesEvent
2019-03-15 02:19:12
rancher/rancher
https://api.github.com/repos/rancher/rancher
closed
Editing GDNS entry (associated with project) and adding a new project , results in GDNS entry associated with only the new project.
kind/bug-qa status/blocker status/ready-for-review status/resolved status/to-test status/working version/2.0
Rancher version - Build from master Steps to reproduce the problem: Create a GDNS entry associated with project say p1. Edit GDNS entry and add another project say p2. After Edit is done , GDNS entry gets associated with only p2.
1.0
Editing GDNS entry (associated with project) and adding a new project , results in GDNS entry associated with only the new project. - Rancher version - Build from master Steps to reproduce the problem: Create a GDNS entry associated with project say p1. Edit GDNS entry and add another project say p2. After Edit is done , GDNS entry gets associated with only p2.
non_process
editing gdns entry associated with project and adding a new project results in gdns entry associated with only the new project rancher version build from master steps to reproduce the problem create a gdns entry associated with project say edit gdns entry and add another project say after edit is done gdns entry gets associated with only
0
7,173
10,316,256,026
IssuesEvent
2019-08-30 09:34:12
google/data-transfer-project
https://api.github.com/repos/google/data-transfer-project
opened
Automatically check google-java-format on pull requests
process
We recommend google-java-format in the developer documentation for the project but we do not currently automatically check contributions for code style. We should add a Travis CI step to check the formatting on each pull request.
1.0
Automatically check google-java-format on pull requests - We recommend google-java-format in the developer documentation for the project but we do not currently automatically check contributions for code style. We should add a Travis CI step to check the formatting on each pull request.
process
automatically check google java format on pull requests we recommend google java format in the developer documentation for the project but we do not currently automatically check contributions for code style we should add a travis ci step to check the formatting on each pull request
1
55,195
7,965,191,334
IssuesEvent
2018-07-14 04:29:16
numpy/numpy
https://api.github.com/repos/numpy/numpy
closed
Testing guideline page is obsolete and not built in main docs
03 - Documentation component: Documentation
As per @rgommers's advice, I am moving scipy/scipy#8606 to here. The numpy testing guideline does not appear to be built with the other docs. It renders adequately on GitHub (https://github.com/numpy/numpy/blob/master/doc/TESTS.rst.txt), but seems out of place. This is noticeable in particular because the scipy docs link to the GitHub page. The section on unit tests under the "Contributing New Code" section of the "Contributing to SciPy" page (http://scipy.github.io/devdocs/hacking.html#contributing-new-code) has a link to the "testing guidelines", which points here. Given that this page (probably among others) is important enough to link across projects, it would probably be a good idea to include it in the generated docs. On top of the fact that the page is missing, it still refers to the old `nose` testing setup, and should probably be updated to include the switchover to `py.test`.
2.0
Testing guideline page is obsolete and not built in main docs - As per @rgommers's advice, I am moving scipy/scipy#8606 to here. The numpy testing guideline does not appear to be built with the other docs. It renders adequately on GitHub (https://github.com/numpy/numpy/blob/master/doc/TESTS.rst.txt), but seems out of place. This is noticeable in particular because the scipy docs link to the GitHub page. The section on unit tests under the "Contributing New Code" section of the "Contributing to SciPy" page (http://scipy.github.io/devdocs/hacking.html#contributing-new-code) has a link to the "testing guidelines", which points here. Given that this page (probably among others) is important enough to link across projects, it would probably be a good idea to include it in the generated docs. On top of the fact that the page is missing, it still refers to the old `nose` testing setup, and should probably be updated to include the switchover to `py.test`.
non_process
testing guideline page is obsolete and not built in main docs as per rgommers s advice i am moving scipy scipy to here the numpy testing guideline does not appear to be built with the other docs it renders adequately on github but seems out of place this is noticeable in particular because the scipy docs link to the github page the section on unit tests under the contributing new code section of the contributing to scipy page has a link to the testing guidelines which points here given that this page probably among others is important enough to link across projects it would probably be a good idea to include it in the generated docs on top of the fact that the page is missing it still refers to the old nose testing setup and should probably be updated to include the switchover to py test
0
11,746
14,583,257,310
IssuesEvent
2020-12-18 13:43:46
GoogleCloudPlatform/fda-mystudies
https://api.github.com/repos/GoogleCloudPlatform/fda-mystudies
closed
Make clear what text is a link within participant manager
Feature request P1 Participant manager Process: Fixed Process: Tested QA Process: Tested dev UI
Many of the tables within the participant manager UI have one column (typically the 'first name' column) that are links to the detailed information about that entry. However, there is no visual indication that this is a link and is not discoverable. We should add some visual indicator that this is a link and is the place that you click to edit this entry. Example below. ![Screenshot 2020-12-15 at 1 04 51 PM](https://user-images.githubusercontent.com/35972680/102254278-6c8c7580-3ed6-11eb-9df3-57fb4240a73d.png)
3.0
Make clear what text is a link within participant manager - Many of the tables within the participant manager UI have one column (typically the 'first name' column) that are links to the detailed information about that entry. However, there is no visual indication that this is a link and is not discoverable. We should add some visual indicator that this is a link and is the place that you click to edit this entry. Example below. ![Screenshot 2020-12-15 at 1 04 51 PM](https://user-images.githubusercontent.com/35972680/102254278-6c8c7580-3ed6-11eb-9df3-57fb4240a73d.png)
process
make clear what text is a link within participant manager many of the tables within the participant manager ui have one column typically the first name column that are links to the detailed information about that entry however there is no visual indication that this is a link and is not discoverable we should add some visual indicator that this is a link and is the place that you click to edit this entry example below
1
61,293
14,621,063,937
IssuesEvent
2020-12-22 20:53:34
SmartBear/idea-collaborator-plugin
https://api.github.com/repos/SmartBear/idea-collaborator-plugin
opened
CVE-2020-8840 (High) detected in jackson-databind-2.5.0.jar
security vulnerability
## CVE-2020-8840 - High Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>jackson-databind-2.5.0.jar</b></p></summary> <p>General data-binding functionality for Jackson: works on core streaming API</p> <p>Library home page: <a href="http://github.com/FasterXML/jackson">http://github.com/FasterXML/jackson</a></p> <p>Path to vulnerable library: idea-collaborator-plugin/client/lib/jackson-databind-2.5.0.jar,idea-collaborator-plugin/collabplugin/collaborator/collaborator/lib/jackson-databind-2.5.0.jar,idea-collaborator-plugin/collaborator-0_7-BETA/collaborator/lib/jackson-databind-2.5.0.jar</p> <p> Dependency Hierarchy: - :x: **jackson-databind-2.5.0.jar** (Vulnerable Library) <p>Found in HEAD commit: <a href="https://github.com/SmartBear/idea-collaborator-plugin/commit/3e67fb2d437ffeadf07751b7979f4e35dbc282a2">3e67fb2d437ffeadf07751b7979f4e35dbc282a2</a></p> <p>Found in base branch: <b>master</b></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary> <p> FasterXML jackson-databind 2.0.0 through 2.9.10.2 lacks certain xbean-reflect/JNDI blocking, as demonstrated by org.apache.xbean.propertyeditor.JndiConverter. 
<p>Publish Date: 2020-02-10 <p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-8840>CVE-2020-8840</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>9.8</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: Low - Privileges Required: None - User Interaction: None - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: High - Integrity Impact: High - Availability Impact: High </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://github.com/FasterXML/jackson-databind/issues/2620">https://github.com/FasterXML/jackson-databind/issues/2620</a></p> <p>Release Date: 2020-02-10</p> <p>Fix Resolution: com.fasterxml.jackson.core:jackson-databind:2.9.10.3</p> </p> </details> <p></p> *** <!-- REMEDIATE-OPEN-PR-START --> - [ ] Check this box to open an automated fix PR <!-- REMEDIATE-OPEN-PR-END --> <!-- <REMEDIATE>{"isOpenPROnVulnerability":false,"isPackageBased":true,"isDefaultBranch":true,"packages":[{"packageType":"Java","groupId":"com.fasterxml.jackson.core","packageName":"jackson-databind","packageVersion":"2.5.0","isTransitiveDependency":false,"dependencyTree":"com.fasterxml.jackson.core:jackson-databind:2.5.0","isMinimumFixVersionAvailable":true,"minimumFixVersion":"com.fasterxml.jackson.core:jackson-databind:2.9.10.3"}],"vulnerabilityIdentifier":"CVE-2020-8840","vulnerabilityDetails":"FasterXML jackson-databind 2.0.0 through 2.9.10.2 lacks certain xbean-reflect/JNDI blocking, as demonstrated by 
org.apache.xbean.propertyeditor.JndiConverter.","vulnerabilityUrl":"https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-8840","cvss3Severity":"high","cvss3Score":"9.8","cvss3Metrics":{"A":"High","AC":"Low","PR":"None","S":"Unchanged","C":"High","UI":"None","AV":"Network","I":"High"},"extraData":{}}</REMEDIATE> -->
True
CVE-2020-8840 (High) detected in jackson-databind-2.5.0.jar - ## CVE-2020-8840 - High Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>jackson-databind-2.5.0.jar</b></p></summary> <p>General data-binding functionality for Jackson: works on core streaming API</p> <p>Library home page: <a href="http://github.com/FasterXML/jackson">http://github.com/FasterXML/jackson</a></p> <p>Path to vulnerable library: idea-collaborator-plugin/client/lib/jackson-databind-2.5.0.jar,idea-collaborator-plugin/collabplugin/collaborator/collaborator/lib/jackson-databind-2.5.0.jar,idea-collaborator-plugin/collaborator-0_7-BETA/collaborator/lib/jackson-databind-2.5.0.jar</p> <p> Dependency Hierarchy: - :x: **jackson-databind-2.5.0.jar** (Vulnerable Library) <p>Found in HEAD commit: <a href="https://github.com/SmartBear/idea-collaborator-plugin/commit/3e67fb2d437ffeadf07751b7979f4e35dbc282a2">3e67fb2d437ffeadf07751b7979f4e35dbc282a2</a></p> <p>Found in base branch: <b>master</b></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary> <p> FasterXML jackson-databind 2.0.0 through 2.9.10.2 lacks certain xbean-reflect/JNDI blocking, as demonstrated by org.apache.xbean.propertyeditor.JndiConverter. 
<p>Publish Date: 2020-02-10 <p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-8840>CVE-2020-8840</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>9.8</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: Low - Privileges Required: None - User Interaction: None - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: High - Integrity Impact: High - Availability Impact: High </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://github.com/FasterXML/jackson-databind/issues/2620">https://github.com/FasterXML/jackson-databind/issues/2620</a></p> <p>Release Date: 2020-02-10</p> <p>Fix Resolution: com.fasterxml.jackson.core:jackson-databind:2.9.10.3</p> </p> </details> <p></p> *** <!-- REMEDIATE-OPEN-PR-START --> - [ ] Check this box to open an automated fix PR <!-- REMEDIATE-OPEN-PR-END --> <!-- <REMEDIATE>{"isOpenPROnVulnerability":false,"isPackageBased":true,"isDefaultBranch":true,"packages":[{"packageType":"Java","groupId":"com.fasterxml.jackson.core","packageName":"jackson-databind","packageVersion":"2.5.0","isTransitiveDependency":false,"dependencyTree":"com.fasterxml.jackson.core:jackson-databind:2.5.0","isMinimumFixVersionAvailable":true,"minimumFixVersion":"com.fasterxml.jackson.core:jackson-databind:2.9.10.3"}],"vulnerabilityIdentifier":"CVE-2020-8840","vulnerabilityDetails":"FasterXML jackson-databind 2.0.0 through 2.9.10.2 lacks certain xbean-reflect/JNDI blocking, as demonstrated by 
org.apache.xbean.propertyeditor.JndiConverter.","vulnerabilityUrl":"https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-8840","cvss3Severity":"high","cvss3Score":"9.8","cvss3Metrics":{"A":"High","AC":"Low","PR":"None","S":"Unchanged","C":"High","UI":"None","AV":"Network","I":"High"},"extraData":{}}</REMEDIATE> -->
non_process
cve high detected in jackson databind jar cve high severity vulnerability vulnerable library jackson databind jar general data binding functionality for jackson works on core streaming api library home page a href path to vulnerable library idea collaborator plugin client lib jackson databind jar idea collaborator plugin collabplugin collaborator collaborator lib jackson databind jar idea collaborator plugin collaborator beta collaborator lib jackson databind jar dependency hierarchy x jackson databind jar vulnerable library found in head commit a href found in base branch master vulnerability details fasterxml jackson databind through lacks certain xbean reflect jndi blocking as demonstrated by org apache xbean propertyeditor jndiconverter publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction none scope unchanged impact metrics confidentiality impact high integrity impact high availability impact high for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution com fasterxml jackson core jackson databind check this box to open an automated fix pr isopenpronvulnerability false ispackagebased true isdefaultbranch true packages vulnerabilityidentifier cve vulnerabilitydetails fasterxml jackson databind through lacks certain xbean reflect jndi blocking as demonstrated by org apache xbean propertyeditor jndiconverter vulnerabilityurl
0
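Both WhiteSource records in this dump report CVSS 3 base metrics together with a score (7.5 for CVE-2019-13173, 9.8 for CVE-2020-8840). For readers who want to check those numbers, the sketch below reproduces the CVSS v3.0 base-score arithmetic for the "Scope: Unchanged" case only, with just the metric weights these two records use (weights taken from the FIRST CVSS v3.0 specification). It is an illustration, not a general CVSS calculator.

```python
import math

# CVSS v3.0 base score, Scope: Unchanged case only. Weight tables
# are trimmed to the values the two advisories above actually use.
AV = {"Network": 0.85}
AC = {"Low": 0.77}
PR = {"None": 0.85}   # Privileges Required weight when scope is unchanged
UI = {"None": 0.85}
CIA = {"None": 0.0, "Low": 0.22, "High": 0.56}

def roundup(x):
    """CVSS 'round up to one decimal' helper."""
    return math.ceil(x * 10) / 10

def base_score_unchanged(av, ac, pr, ui, c, i, a):
    iss = 1 - (1 - CIA[c]) * (1 - CIA[i]) * (1 - CIA[a])
    impact = 6.42 * iss
    exploitability = 8.22 * AV[av] * AC[ac] * PR[pr] * UI[ui]
    if impact <= 0:
        return 0.0
    return roundup(min(impact + exploitability, 10))

# CVE-2019-13173 (C:None / I:High / A:None)
print(base_score_unchanged("Network", "Low", "None", "None",
                           "None", "High", "None"))   # → 7.5
# CVE-2020-8840 (C:High / I:High / A:High)
print(base_score_unchanged("Network", "Low", "None", "None",
                           "High", "High", "High"))   # → 9.8
```

Note that integrity-only impact (the fstream case) already reaches 7.5 with a network vector and no privileges, which is why an "arbitrary file overwrite" rates High severity despite leaving confidentiality and availability untouched.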
8,653
11,790,982,894
IssuesEvent
2020-03-17 20:04:19
prisma/prisma-client-js
https://api.github.com/repos/prisma/prisma-client-js
reopened
Casing for Prisma Client model and field names
bug/2-confirmed kind/bug process/product topic: dx
Similar to https://github.com/prisma/prisma2/issues/1851, but Client level > Upper case models produce Prisma client call with a weird casing: > > ```prisma > model ATOMS { > ATOM_TYPE Int? > CHARGE Float? > ELEMENT String? > ID Int @default(0) @id > DRUG_ID DRUGS? > bONDS_ATOMSToBONDS_ATOM1_ID BONDS[] @relation("ATOMSToBONDS_ATOM1_ID") > bONDS_ATOMSToBONDS_ATOM2_ID BONDS[] @relation("ATOMSToBONDS_ATOM2_ID") > rING_ATOM RING_ATOM[] > > @@index([DRUG_ID], name: "ATOMS_DRUG_ID") > } > ``` > > Casing in generated client: > > ![image](https://user-images.githubusercontent.com/746482/75041435-dd345280-54bc-11ea-8239-59bcb7c76581.png) > > Same applies for field names like `bONDS_ATOMSToBONDS_ATOM1_ID`. Right now this exists: https://github.com/prisma/prisma-client-js/blob/ffa2f18bd5dc9390380f01d3f68a9c81adeb1086/packages/photon/src/runtime/getPrismaClient.ts#L349 This should be removed, and everything that is similar: The schema is the source of truth for capitalization of model and field names.
1.0
Casing for Prisma Client model and field names - Similar to https://github.com/prisma/prisma2/issues/1851, but at the Client level > Upper case models produce a Prisma Client call with weird casing: > > ```prisma > model ATOMS { > ATOM_TYPE Int? > CHARGE Float? > ELEMENT String? > ID Int @default(0) @id > DRUG_ID DRUGS? > bONDS_ATOMSToBONDS_ATOM1_ID BONDS[] @relation("ATOMSToBONDS_ATOM1_ID") > bONDS_ATOMSToBONDS_ATOM2_ID BONDS[] @relation("ATOMSToBONDS_ATOM2_ID") > rING_ATOM RING_ATOM[] > > @@index([DRUG_ID], name: "ATOMS_DRUG_ID") > } > ``` > > Casing in generated client: > > ![image](https://user-images.githubusercontent.com/746482/75041435-dd345280-54bc-11ea-8239-59bcb7c76581.png) > > The same applies to field names like `bONDS_ATOMSToBONDS_ATOM1_ID`. Right now this exists: https://github.com/prisma/prisma-client-js/blob/ffa2f18bd5dc9390380f01d3f68a9c81adeb1086/packages/photon/src/runtime/getPrismaClient.ts#L349 This should be removed, along with everything similar: the schema is the source of truth for the capitalization of model and field names.
process
casing for prisma client model and field names similar to but at the client level upper case models produce a prisma client call with weird casing prisma model atoms atom type int charge float element string id int default id drug id drugs bonds atomstobonds id bonds relation atomstobonds id bonds atomstobonds id bonds relation atomstobonds id ring atom ring atom index name atoms drug id casing in generated client the same applies to field names like bonds atomstobonds id right now this exists this should be removed along with everything similar the schema is the source of truth for the capitalization of model and field names
1
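The casing behavior described in the record above (`ATOMS` → `aTOMS`, `BONDS_ATOMSToBONDS_ATOM1_ID` → `bONDS_ATOMSToBONDS_ATOM1_ID`) can be sketched as a simple first-letter lowercasing. This is a minimal illustration of the reported behavior, not the actual Prisma source; `lowerCaseFirst` is a hypothetical helper name:

```typescript
// Hypothetical sketch of the first-letter lowercasing that yields the
// "weird casing" reported in the issue above. The actual Prisma logic
// lives in getPrismaClient.ts; this only reproduces the observed effect.
function lowerCaseFirst(name: string): string {
  // Lowercase only the first character; the rest of the name is untouched,
  // which is why all-caps schema names come out looking mangled.
  return name.charAt(0).toLowerCase() + name.slice(1);
}

console.log(lowerCaseFirst("ATOMS"));                       // aTOMS
console.log(lowerCaseFirst("BONDS_ATOMSToBONDS_ATOM1_ID")); // bONDS_ATOMSToBONDS_ATOM1_ID
```

Removing this conversion, as the issue proposes, would leave the schema's own spelling intact in the generated client.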
1,718
4,366,769,861
IssuesEvent
2016-08-03 15:14:18
SIMEXP/niak
https://api.github.com/repos/SIMEXP/niak
closed
replace structural processing pipeline by minc-toolkit
enhancement preprocessing
currently NIAK has its own structural processing pipeline, forked from CIVET with a few added bits (notably a rudimentary brain extraction tool). This causes many problems, and is too large a project to be maintained by SIMEXP. It needs to be replaced (urgently) by the minc-toolkit pipeline.
1.0
replace structural processing pipeline by minc-toolkit - currently NIAK has its own structural processing pipeline, forked from CIVET with a few added bits (notably a rudimentary brain extraction tool). This causes many problems, and is too large a project to be maintained by SIMEXP. It needs to be replaced (urgently) by the minc-toolkit pipeline.
process
replace structural processing pipeline by minc toolkit currently niak has its own structural processing pipeline forked from civet with a few added bits notably a rudimentary brain extraction tool this causes many problems and is too large a project to be maintained by simexp it needs to be replaced urgently by the minc toolkit pipeline
1