| Column | Type | Min | Max |
|---|---|---|---|
| Unnamed: 0 | int64 | 0 | 832k |
| id | float64 | 2.49B | 32.1B |
| type | stringclasses | 1 value | |
| created_at | stringlengths | 19 | 19 |
| repo | stringlengths | 7 | 112 |
| repo_url | stringlengths | 36 | 141 |
| action | stringclasses | 3 values | |
| title | stringlengths | 1 | 744 |
| labels | stringlengths | 4 | 574 |
| body | stringlengths | 9 | 211k |
| index | stringclasses | 10 values | |
| text_combine | stringlengths | 96 | 211k |
| label | stringclasses | 2 values | |
| text | stringlengths | 96 | 188k |
| binary_label | int64 | 0 | 1 |
Unnamed: 0: 410,637
id: 27,795,493,546
type: IssuesEvent
created_at: 2023-03-17 12:13:42
repo: mindsdb/mindsdb
repo_url: https://api.github.com/repos/mindsdb/mindsdb
action: reopened
title: [Docs] Update a title of the `MySQL Question Answering` tutorial
labels: help wanted good first issue documentation first-timers-only
## Instructions :page_facing_up: Here are the step-by-step instructions: 1. Go to the file `/docs/nlp/question-answering-inside-mysql-with-openai.mdx`. 2. Change the values of `title` and `sidebarTitle` to read as follows: ``` --- title: Question Answering with MindsDB and OpenAI using SQL sidebarTitle: Question Answering using SQL --- ``` 3. Save the changes and create a PR. ## Hackathon Issue :loudspeaker: MindsDB has organized a hackathon to let in more contributors to the in-database ML world. Each hackathon issue is worth a certain amount of credits that will bring you prizes by the end of the MindsDB Hackathon. Check out the [MindsDB Hackathon rules](https://mindsdb.com/mindsdb-hackathon)! ## The https://github.com/mindsdb/mindsdb/labels/first-timers-only Label We are happy to welcome you on board! Please take a look at the rules below for first-time contributors. 1. You can solve only one issue labeled as https://github.com/mindsdb/mindsdb/labels/first-timers-only. After that, please look at other issues labeled as https://github.com/mindsdb/mindsdb/labels/good%20first%20issue, https://github.com/mindsdb/mindsdb/labels/help%20wanted, or https://github.com/mindsdb/mindsdb/labels/integration. 2. After you create your first PR in the MindsDB repository, please sign our CLA to become a MindsDB contributor. You can do that by leaving a comment that contains the following: `I have read the CLA Document and I hereby sign the CLA` Thank you for contributing to MindsDB!
index: 1.0
[Docs] Update a title of the `MySQL Question Answering` tutorial - ## Instructions :page_facing_up: Here are the step-by-step instructions: 1. Go to the file `/docs/nlp/question-answering-inside-mysql-with-openai.mdx`. 2. Change the values of `title` and `sidebarTitle` to read as follows: ``` --- title: Question Answering with MindsDB and OpenAI using SQL sidebarTitle: Question Answering using SQL --- ``` 3. Save the changes and create a PR. ## Hackathon Issue :loudspeaker: MindsDB has organized a hackathon to let in more contributors to the in-database ML world. Each hackathon issue is worth a certain amount of credits that will bring you prizes by the end of the MindsDB Hackathon. Check out the [MindsDB Hackathon rules](https://mindsdb.com/mindsdb-hackathon)! ## The https://github.com/mindsdb/mindsdb/labels/first-timers-only Label We are happy to welcome you on board! Please take a look at the rules below for first-time contributors. 1. You can solve only one issue labeled as https://github.com/mindsdb/mindsdb/labels/first-timers-only. After that, please look at other issues labeled as https://github.com/mindsdb/mindsdb/labels/good%20first%20issue, https://github.com/mindsdb/mindsdb/labels/help%20wanted, or https://github.com/mindsdb/mindsdb/labels/integration. 2. After you create your first PR in the MindsDB repository, please sign our CLA to become a MindsDB contributor. You can do that by leaving a comment that contains the following: `I have read the CLA Document and I hereby sign the CLA` Thank you for contributing to MindsDB!
label: non_process
update a title of the mysql question answering tutorial instructions page facing up here are the step by step instructions go to the file docs nlp question answering inside mysql with openai mdx change the values of title and sidebartitle to read as follows title question answering with mindsdb and openai using sql sidebartitle question answering using sql save the changes and create a pr hackathon issue loudspeaker mindsdb has organized a hackathon to let in more contributors to the in database ml world each hackathon issue is worth a certain amount of credits that will bring you prizes by the end of the mindsdb hackathon check out the the label we are happy to welcome you on board please take a look at the rules below for first time contributors you can solve only one issue labeled as after that please look at other issues labeled as or after you create your first pr in the mindsdb repository please sign our cla to become a mindsdb contributor you can do that by leaving a comment that contains the following i have read the cla document and i hereby sign the cla thank you for contributing to mindsdb
binary_label: 0
Unnamed: 0: 624,131
id: 19,687,437,674
type: IssuesEvent
created_at: 2022-01-12 00:31:33
repo: apcountryman/picolibrary
repo_url: https://api.github.com/repos/apcountryman/picolibrary
action: opened
title: Add unit testing pseudo-random value generation
labels: priority-normal status-awaiting_development type-feature
Add unit testing pseudo-random value generation (`::picolibrary::Testing::Unit::random()`). - [ ] The `random()` function should be defined in the `include/picolibrary/testing/unit/random.h`/`source/picolibrary/testing/unit/random.cc` header/source file pair - [ ] The `random()` function should have the following overloads: - [ ] `template<typename T> auto random( T min, T max ) -> T;`: Generate a pseudo-random value within the specified range - [ ] `template<typename T> auto random( T min ) -> T;`: Generate a pseudo-random value greater than or equal to a minimum value - [ ] `template<typename T> auto random() -> T;`: Generate a pseudo-random value - [ ] The `random()` function should have the following specializations: - [ ] `template<> auto random<bool>() -> bool;`: Generate a pseudo-random bool - [ ] `template<> auto random<char>() -> char;`: Generate a pseudo-random character in the range [' ','~']
index: 1.0
Add unit testing pseudo-random value generation - Add unit testing pseudo-random value generation (`::picolibrary::Testing::Unit::random()`). - [ ] The `random()` function should be defined in the `include/picolibrary/testing/unit/random.h`/`source/picolibrary/testing/unit/random.cc` header/source file pair - [ ] The `random()` function should have the following overloads: - [ ] `template<typename T> auto random( T min, T max ) -> T;`: Generate a pseudo-random value within the specified range - [ ] `template<typename T> auto random( T min ) -> T;`: Generate a pseudo-random value greater than or equal to a minimum value - [ ] `template<typename T> auto random() -> T;`: Generate a pseudo-random value - [ ] The `random()` function should have the following specializations: - [ ] `template<> auto random<bool>() -> bool;`: Generate a pseudo-random bool - [ ] `template<> auto random<char>() -> char;`: Generate a pseudo-random character in the range [' ','~']
label: non_process
add unit testing pseudo random value generation add unit testing pseudo random value generation picolibrary testing unit random the random function should be defined in the include picolibrary testing unit random h source picolibrary testing unit random cc header source file pair the random function should have the following overloads template auto random t min t max t generate a pseudo random value within the specified range template auto random t min t generate a pseudo random value greater than or equal to a minimum value template auto random t generate a pseudo random value the random function should have the following specializations template auto random bool generate a pseudo random bool template auto random char generate a pseudo random character in the range
binary_label: 0
Unnamed: 0: 14,224
id: 17,145,268,054
type: IssuesEvent
created_at: 2021-07-13 14:01:33
repo: qgis/QGIS
repo_url: https://api.github.com/repos/qgis/QGIS
action: closed
title: qgis:executesql cannot find input layer if it is a temporary layer with accentuated characters
labels: Bug Processing
Hi, When using qgis:executesql algorithm from processing toolbox, if input layer is a temporary layer created by another algorithm with accentuated characters, such as "Reprojeté" created by reprojectlayer algorithm in french, I get the following error message : <p><b>virtual:</b> Cannot find layer Reprojet%C3%A9_e83003db_00e8_4abd_8b81_365ecd89c9a4 "Reprojeté" is the name set automatically for output temporary layers created by reprojectlayer in french. When using executesql with a temporary layer without any accentuated characters, everything goes fine. I'm using QGIS 3.20 with Ubuntu 20.04. Hope this is clear, thanks for your time !
index: 1.0
qgis:executesql cannot find input layer if it is a temporary layer with accentuated characters - Hi, When using qgis:executesql algorithm from processing toolbox, if input layer is a temporary layer created by another algorithm with accentuated characters, such as "Reprojeté" created by reprojectlayer algorithm in french, I get the following error message : <p><b>virtual:</b> Cannot find layer Reprojet%C3%A9_e83003db_00e8_4abd_8b81_365ecd89c9a4 "Reprojeté" is the name set automatically for output temporary layers created by reprojectlayer in french. When using executesql with a temporary layer without any accentuated characters, everything goes fine. I'm using QGIS 3.20 with Ubuntu 20.04. Hope this is clear, thanks for your time !
label: process
qgis executesql cannot find input layer if it is a temporary layer with accentuated characters hi when using qgis executesql algorithm from processing toolbox if input layer is a temporary layer created by another algorithm with accentuated characters such as reprojeté created by reprojectlayer algorithm in french i get the following error message virtual cannot find layer reprojet reprojeté is the name set automatically for output temporary layers created by reprojectlayer in french when using executesql with a temporary layer without any accentuated characters everything goes fine i m using qgis with ubuntu hope this is clear thanks for your time
binary_label: 1
Unnamed: 0: 17,564
id: 23,377,430,485
type: IssuesEvent
created_at: 2022-08-11 05:44:52
repo: ClickHouse/ClickHouse
repo_url: https://api.github.com/repos/ClickHouse/ClickHouse
action: closed
title: Rows processed by the right side of IN operator are not counted
labels: st-fixed unexpected behaviour comp-processors
The first query uses a subselect for IN, and the second one just uses constants (disregard weird coding with `materialize` etc, this is to work around other bugs). Note that the number of processed rows is the same, despite the fact that the subselect processes another 81k rows. ``` /4/ :) SELECT UserID FROM hits WHERE (CounterID, EventTime) IN ( SELECT CounterID, max(EventTime) FROM hits WHERE (CounterID = 25703952) AND (EventDate < '2014-03-20') GROUP BY CounterID ) ┌─────────────UserID─┐ │ 285721602179270679 │ └────────────────────┘ 1 rows in set. Elapsed: 0.008 sec. Processed 155.65 thousand rows, 1.26 MB (20.21 million rows/s., 164.02 MB/s.) /4/ :) SELECT UserID FROM hits WHERE (CounterID, EventTime) IN ( SELECT materialize(toUInt32(25703952)) AS CounterID, materialize(toDateTime('2014-03-19 23:59:58')) AS EventTime FROM numbers(1) ) ┌─────────────UserID─┐ │ 285721602179270679 │ └────────────────────┘ 1 rows in set. Elapsed: 0.006 sec. Processed 155.65 thousand rows, 1.26 MB (28.14 million rows/s., 228.40 MB/s.) ```
index: 1.0
Rows processed by the right side of IN operator are not counted - The first query uses a subselect for IN, and the second one just uses constants (disregard weird coding with `materialize` etc, this is to work around other bugs). Note that the number of processed rows is the same, despite the fact that the subselect processes another 81k rows. ``` /4/ :) SELECT UserID FROM hits WHERE (CounterID, EventTime) IN ( SELECT CounterID, max(EventTime) FROM hits WHERE (CounterID = 25703952) AND (EventDate < '2014-03-20') GROUP BY CounterID ) ┌─────────────UserID─┐ │ 285721602179270679 │ └────────────────────┘ 1 rows in set. Elapsed: 0.008 sec. Processed 155.65 thousand rows, 1.26 MB (20.21 million rows/s., 164.02 MB/s.) /4/ :) SELECT UserID FROM hits WHERE (CounterID, EventTime) IN ( SELECT materialize(toUInt32(25703952)) AS CounterID, materialize(toDateTime('2014-03-19 23:59:58')) AS EventTime FROM numbers(1) ) ┌─────────────UserID─┐ │ 285721602179270679 │ └────────────────────┘ 1 rows in set. Elapsed: 0.006 sec. Processed 155.65 thousand rows, 1.26 MB (28.14 million rows/s., 228.40 MB/s.) ```
label: process
rows processed by the right side of in operator are not counted the first query uses a subselect for in and the second one just uses constants disregard weird coding with materialize etc this is to work around other bugs note that the number of processed rows is the same despite the fact that the subselect processes another rows select userid from hits where counterid eventtime in select counterid max eventtime from hits where counterid and eventdate group by counterid ┌─────────────userid─┐ │ │ └────────────────────┘ rows in set elapsed sec processed thousand rows mb million rows s mb s select userid from hits where counterid eventtime in select materialize as counterid materialize todatetime as eventtime from numbers ┌─────────────userid─┐ │ │ └────────────────────┘ rows in set elapsed sec processed thousand rows mb million rows s mb s
binary_label: 1
Unnamed: 0: 3,195
id: 6,261,571,104
type: IssuesEvent
created_at: 2017-07-15 01:13:36
repo: PHPSocialNetwork/phpfastcache
repo_url: https://api.github.com/repos/PHPSocialNetwork/phpfastcache
action: closed
title: performance regression with v6
labels: 6.0 [-_-] In Process ^_^ enhancement ^_^ Fixed
### Configuration: PhpFastCache version: 6.0.1 / 5.0.16 PHP version: 7.0 Operating system: Linux / Windows #### Issue description: I've updated a project to phpfastcache v6 and noticed a perfomance regression. The project is running on windows using the wincache driver with php 7.0. The perfomance regression compared to 5.0.16 is between 15%-25%. I've created a test case, which I've ran under Linux as well using the files backend. The numbers are comparable. v5.0.16: first run with empty cache: 14.666780948639 second run with full cache: 0.75569796562195 v6.0.1: first run with empty cache: 17.629030942917 second run with full cache: 1.1219041347504 The code I used writes 100.000 entries and then reads them again from the cache: ``` <?php require_once('vendor/autoload.php'); use \phpFastCache\CacheManager; $cache = CacheManager::getInstance('files'); $cycles = 100000; function compute_str($arg) { $str = __FUNCTION__ . '-' . $arg . '-' . 'michael@test.example.com'; $key = str_replace(array('{', '}', '(', ')', '/', '\\', '@', ':'), '_', $str); return $key; } function compute_md5($arg) { $str = __FUNCTION__ . '-' . $arg . '-' . 'michael@test.example.com'; $key = md5($str); return $key; } function compute_sha1($arg) { $str = __FUNCTION__ . '-' . $arg . '-' . 'michael@test.example.com'; $key = sha1($str); return $key; } function compute($arg) { return compute_sha1($arg); } $before = microtime(true); for ($i=0; $i<$cycles; $i++) { $key = compute($i); $cachedNum = $cache->getItem($key); if (is_null($cachedNum->get())) { $cachedNum->set($i*$i); $cache->save($cachedNum); } else { $num = $cachedNum->get(); if ($num != $i*$i) { print ("Something went wrong: " . $i . " -> " . $num); } } } $after = microtime(true); $duration = $after - $before; print ('first run with empty cache: ' . $duration . 
"\n"); $before = microtime(true); for ($i=0; $i<$cycles; $i++) { $key = compute($i); $cachedNum = $cache->getItem($key); if (is_null($cachedNum->get())) { $cachedNum->set($i*$i); $cache->save($cachedNum); } else { $num = $cachedNum->get(); if ($num != $i*$i) { print ("Something went wrong: " . $i . " -> " . $num); } } } $after = microtime(true ); $duration = $after - $before; print ('second run with full cache: ' . $duration . "\n"); $cache->clear(); ```
index: 1.0
performance regression with v6 - ### Configuration: PhpFastCache version: 6.0.1 / 5.0.16 PHP version: 7.0 Operating system: Linux / Windows #### Issue description: I've updated a project to phpfastcache v6 and noticed a perfomance regression. The project is running on windows using the wincache driver with php 7.0. The perfomance regression compared to 5.0.16 is between 15%-25%. I've created a test case, which I've ran under Linux as well using the files backend. The numbers are comparable. v5.0.16: first run with empty cache: 14.666780948639 second run with full cache: 0.75569796562195 v6.0.1: first run with empty cache: 17.629030942917 second run with full cache: 1.1219041347504 The code I used writes 100.000 entries and then reads them again from the cache: ``` <?php require_once('vendor/autoload.php'); use \phpFastCache\CacheManager; $cache = CacheManager::getInstance('files'); $cycles = 100000; function compute_str($arg) { $str = __FUNCTION__ . '-' . $arg . '-' . 'michael@test.example.com'; $key = str_replace(array('{', '}', '(', ')', '/', '\\', '@', ':'), '_', $str); return $key; } function compute_md5($arg) { $str = __FUNCTION__ . '-' . $arg . '-' . 'michael@test.example.com'; $key = md5($str); return $key; } function compute_sha1($arg) { $str = __FUNCTION__ . '-' . $arg . '-' . 'michael@test.example.com'; $key = sha1($str); return $key; } function compute($arg) { return compute_sha1($arg); } $before = microtime(true); for ($i=0; $i<$cycles; $i++) { $key = compute($i); $cachedNum = $cache->getItem($key); if (is_null($cachedNum->get())) { $cachedNum->set($i*$i); $cache->save($cachedNum); } else { $num = $cachedNum->get(); if ($num != $i*$i) { print ("Something went wrong: " . $i . " -> " . $num); } } } $after = microtime(true); $duration = $after - $before; print ('first run with empty cache: ' . $duration . 
"\n"); $before = microtime(true); for ($i=0; $i<$cycles; $i++) { $key = compute($i); $cachedNum = $cache->getItem($key); if (is_null($cachedNum->get())) { $cachedNum->set($i*$i); $cache->save($cachedNum); } else { $num = $cachedNum->get(); if ($num != $i*$i) { print ("Something went wrong: " . $i . " -> " . $num); } } } $after = microtime(true ); $duration = $after - $before; print ('second run with full cache: ' . $duration . "\n"); $cache->clear(); ```
label: process
performance regression with configuration phpfastcache version php version operating system linux windows issue description i ve updated a project to phpfastcache and noticed a perfomance regression the project is running on windows using the wincache driver with php the perfomance regression compared to is between i ve created a test case which i ve ran under linux as well using the files backend the numbers are comparable first run with empty cache second run with full cache first run with empty cache second run with full cache the code i used writes entries and then reads them again from the cache php require once vendor autoload php use phpfastcache cachemanager cache cachemanager getinstance files cycles function compute str arg str function arg michael test example com key str replace array str return key function compute arg str function arg michael test example com key str return key function compute arg str function arg michael test example com key str return key function compute arg return compute arg before microtime true for i i cycles i key compute i cachednum cache getitem key if is null cachednum get cachednum set i i cache save cachednum else num cachednum get if num i i print something went wrong i num after microtime true duration after before print first run with empty cache duration n before microtime true for i i cycles i key compute i cachednum cache getitem key if is null cachednum get cachednum set i i cache save cachednum else num cachednum get if num i i print something went wrong i num after microtime true duration after before print second run with full cache duration n cache clear
binary_label: 1
Unnamed: 0: 15,220
id: 19,088,283,017
type: IssuesEvent
created_at: 2021-11-29 09:14:14
repo: opensafely-core/job-server
repo_url: https://api.github.com/repos/opensafely-core/job-server
action: closed
title: Add Section Links to ApplicationDetail in Staff Area
labels: application-process
Mirroring the confirmation page a User sees we need to add `Change` links to each section of the ApplicationDetail page to let a Staff user jump to that section.
index: 1.0
Add Section Links to ApplicationDetail in Staff Area - Mirroring the confirmation page a User sees we need to add `Change` links to each section of the ApplicationDetail page to let a Staff user jump to that section.
label: process
add section links to applicationdetail in staff area mirroring the confirmation page a user sees we need to add change links to each section of the applicationdetail page to let a staff user jump to that section
binary_label: 1
Unnamed: 0: 35,433
id: 31,279,520,126
type: IssuesEvent
created_at: 2023-08-22 08:42:17
repo: arduino/arduino-fwuploader
repo_url: https://api.github.com/repos/arduino/arduino-fwuploader
action: closed
title: :open_umbrella: Umbrella: migrate `Nina` to plugin system
labels: type: enhancement topic: infrastructure
## Phase 1 - [x] #188 - [x] #191 - [x] #196 - [x] #193 ## Phase 2 - [x] #195 ## Phase 3 - [x] #202 - [ ] Make a new release of `arduino-fwuploader`
index: 1.0
:open_umbrella: Umbrella: migrate `Nina` to plugin system - ## Phase 1 - [x] #188 - [x] #191 - [x] #196 - [x] #193 ## Phase 2 - [x] #195 ## Phase 3 - [x] #202 - [ ] Make a new release of `arduino-fwuploader`
label: non_process
open umbrella umbrella migrate nina to plugin system phase phase phase make a new release of arduino fwuploader
binary_label: 0
Unnamed: 0: 14,750
id: 18,020,485,077
type: IssuesEvent
created_at: 2021-09-16 18:45:09
repo: googleapis/gax-go
repo_url: https://api.github.com/repos/googleapis/gax-go
action: closed
title: v2.1.0 is released but Version is still "2.0.5" ?
labels: type: process priority: p2
https://pkg.go.dev/github.com/googleapis/gax-go/v2 shows `Version: v2.1.0 Latest` but ``` const Version = "2.0.5" ``` tag 2.1.0 is unexpected one?
index: 1.0
v2.1.0 is released but Version is still "2.0.5" ? - https://pkg.go.dev/github.com/googleapis/gax-go/v2 shows `Version: v2.1.0 Latest` but ``` const Version = "2.0.5" ``` tag 2.1.0 is unexpected one?
label: process
is released but version is still shows version latest but const version tag is unexpected one
binary_label: 1
Unnamed: 0: 5,990
id: 8,805,374,729
type: IssuesEvent
created_at: 2018-12-26 19:14:03
repo: dita-ot/dita-ot
repo_url: https://api.github.com/repos/dita-ot/dita-ot
action: closed
title: Pre-processing of map <navtitle> elements different in DITA-OT >=2.0
labels: bug preprocess priority/medium stale
Pre-processing of map `<navtitle>` elements different in DITA-OT >=2.0 After a couple of hours deep-dive I'm pretty sure that the pre-processing behaviour of populating the `<navtitle>` elements inside the pre-processed DITA maps during mappull has changed. I'm almost sure that this is not intentional behaviour and hope it can be fixed or worked around easily. Given the following minimalistic input. map.ditamap: ``` xml <map> <topicref href="topic.dita"> <topicmeta> <navtitle>Topic navtitle</navtitle> </topicmeta> </topicref> </map> ``` topic.dita: ``` xml <concept id="topic";> <title>Concept title</title> <conbody> <p>My concept</p> </conbody> </concept> ``` This will generate the following pre-processed version of map.ditamap in the temp folder (stripped debug attributes etc....): ``` xml <map> <topicref href="topic.dita"> <topicmeta> <navtitle>Topic navtitle</navtitle> <?ditaot gentext?> <linktext>Concept title</linktext> <?ditaot genshortdesc?> <shortdesc /> </topicmeta> </topicref> </map> ``` In previous DITA-OT versions the content would be: ``` xml <map> <topicref href="topic.dita"> <topicmeta> <navtitle>Concept title</navtitle&gt; ........ ``` The result of this changes is that all navigational items in output like an index.html TOC file or a CHM HHC file show the `<navtitle>` of the map and not the `<title>` of the topic while the `<navtitle>` of the map should only be used if the `@locktitle` attribute was set to 'yes'. To rule out some possible causes I tried the following: - Remove the `<navtitle>` element from the map: then it works as expected - Set a `@locktitle` attribute to 'no': no change - Add an empty `<navtitle>` element: no change, the empty navtitle is preserved still Most prominent questions: 1. Is this a known issue? 2. Can you please confirm it is not intentionally so should be changed? 3. Are there any (known) workarounds Please, note that I've tested on different DITA-OT versions and on different systems. All with the same result. 
Doing a deep-dive in mappullImpl.xsl proofs that the behaviour is not as expected.
index: 1.0
Pre-processing of map <navtitle> elements different in DITA-OT >=2.0 - Pre-processing of map `<navtitle>` elements different in DITA-OT >=2.0 After a couple of hours deep-dive I'm pretty sure that the pre-processing behaviour of populating the `<navtitle>` elements inside the pre-processed DITA maps during mappull has changed. I'm almost sure that this is not intentional behaviour and hope it can be fixed or worked around easily. Given the following minimalistic input. map.ditamap: ``` xml <map> <topicref href="topic.dita"> <topicmeta> <navtitle>Topic navtitle</navtitle> </topicmeta> </topicref> </map> ``` topic.dita: ``` xml <concept id="topic";> <title>Concept title</title> <conbody> <p>My concept</p> </conbody> </concept> ``` This will generate the following pre-processed version of map.ditamap in the temp folder (stripped debug attributes etc....): ``` xml <map> <topicref href="topic.dita"> <topicmeta> <navtitle>Topic navtitle</navtitle> <?ditaot gentext?> <linktext>Concept title</linktext> <?ditaot genshortdesc?> <shortdesc /> </topicmeta> </topicref> </map> ``` In previous DITA-OT versions the content would be: ``` xml <map> <topicref href="topic.dita"> <topicmeta> <navtitle>Concept title</navtitle&gt; ........ ``` The result of this changes is that all navigational items in output like an index.html TOC file or a CHM HHC file show the `<navtitle>` of the map and not the `<title>` of the topic while the `<navtitle>` of the map should only be used if the `@locktitle` attribute was set to 'yes'. To rule out some possible causes I tried the following: - Remove the `<navtitle>` element from the map: then it works as expected - Set a `@locktitle` attribute to 'no': no change - Add an empty `<navtitle>` element: no change, the empty navtitle is preserved still Most prominent questions: 1. Is this a known issue? 2. Can you please confirm it is not intentionally so should be changed? 3. 
Are there any (known) workarounds Please, note that I've tested on different DITA-OT versions and on different systems. All with the same result. Doing a deep-dive in mappullImpl.xsl proofs that the behaviour is not as expected.
label: process
pre processing of map elements different in dita ot pre processing of map elements different in dita ot after a couple of hours deep dive i m pretty sure that the pre processing behaviour of populating the elements inside the pre processed dita maps during mappull has changed i m almost sure that this is not intentional behaviour and hope it can be fixed or worked around easily given the following minimalistic input map ditamap xml topic navtitle topic dita xml concept title my concept this will generate the following pre processed version of map ditamap in the temp folder stripped debug attributes etc xml topic navtitle concept title in previous dita ot versions the content would be xml concept title navtitle gt the result of this changes is that all navigational items in output like an index html toc file or a chm hhc file show the of the map and not the of the topic while the of the map should only be used if the locktitle attribute was set to yes to rule out some possible causes i tried the following remove the element from the map then it works as expected set a locktitle attribute to no no change add an empty element no change the empty navtitle is preserved still most prominent questions is this a known issue can you please confirm it is not intentionally so should be changed are there any known workarounds please note that i ve tested on different dita ot versions and on different systems all with the same result doing a deep dive in mappullimpl xsl proofs that the behaviour is not as expected
binary_label: 1
Unnamed: 0: 122,662
id: 10,229,005,419
type: IssuesEvent
created_at: 2019-08-17 08:32:44
repo: Joystream/joystream
repo_url: https://api.github.com/repos/Joystream/joystream
action: opened
title: Rome: Live Milestones
labels: release release-plan rome-testnet
## Live Milestones The Rome Release Plan contains a set of Milestones, with a target date for each. Experience have shown that: - we might not be able to follow these - we may want to add or remove certain items - we may want to adjust the scope of each milestone This approach, like with `Tracking Issues`, gives us some flexibility and will make it easier to keep track of our progress. <!-- [Rome Release Plan](https://github.com/Joystream/joystream/tree/master/testnets/rome) --> #### Live Milestone Table | Live Date | Event |Completed Date| |:---------:|------------------------------------------------------------------|:------------:| | 21.08.19 | [Rome Announced](#rome-announced) | NA | | 26.08.19 | [Spec Draft](#spec-draft) | NA | | 14.10.19 | [Produce Sub-System Test Checklist](#produce-sub-system-test-checklist) | NA | | 18.10.19 | [Perform Sub-system Test](#perform-sub-system-test) | NA | | 21.10.19 | [Produce Final Test Checklist](#produce-final-test-checklist) | NA | | 24.10.19 | [Spec Release](#spec-release) | NA | | 25.10.19 | [Perform Final Test](#perform-final-test) | NA | | 28.10.19 | [Produce Release Checklist](#produce-release-checklist) | NA | | 28.10.19 | [Snapshot and kill Acropolis](#snapshot-and-kill-acropolis) | NA | | 01.11.19 | [Launch Ready](#launch-ready) | NA | | 04.11.19 | [Release](#release) | NA | ## Milestones #### Rome Announced - **Description:** Announce the Rome upcoming testnet, with release plan and theme, and website update. 
- **Tasks:** - See [Tracking Issue 6.](https://github.com/Joystream/joystream/issues/100) - **Target Date:** 21.08.19 - [ ] **Completed Date:** NA #### Spec Draft - **Description:** Create and mark for review, a PR for the Rome specs - **Tasks:** - See [Tracking Issue 1.](https://github.com/Joystream/joystream/issues/95) - See [Tracking Issue 2.](https://github.com/Joystream/joystream/issues/96) - **Target Date:** 26.08.19 - [ ] **Completed Date:** NA #### Produce Sub-System Test Checklist - **Description:** Finalize and publish the requirements, procedure and checklists for the [Sub-System Test](#perform-sub-system-test) - **Tasks:** - See [Tracking Issue 7.](https://github.com/Joystream/joystream/issues/101) - **Target Date:** 14.10.19 - [ ] **Completed Date:** NA #### Perform Sub-system Test - **Description:** Test all sub-systems/software separately on the [`staging-reckless`](#staging-testnets) testnet(s) - **Manager:** **Martin** - **Tasks:** - See [Tracking Issue 7.](https://github.com/Joystream/joystream/issues/101) - See Sub-System Test Checklist (Add link to Issue) - **Target Date:** 18.10.19 - [ ] **Completed Date:** NA **Note** After this test, only bugfix PRs can be merged to master branch of relevant repos. 
#### Produce Final Test Checklist - **Description:** Finalize and publish the requirements, procedure and checklists for the [Final Test](#perform-final-test) - **Manager:** **Martin** - **Tasks:** - See [Tracking Issue 7.](https://github.com/Joystream/joystream/issues/101) - **Target Date:** 21.10.19 - [ ] **Completed Date:** NA #### Spec Release - **Description:** Release/merge the PR for the Rome specs - **Manager:** **Bedeho** - **Tasks:** - See [Tracking Issue 1.](https://github.com/Joystream/joystream/issues/95) - See [Tracking Issue 2.](https://github.com/Joystream/joystream/issues/96) - See [Tracking Issue 6.](https://github.com/Joystream/joystream/issues/100) - **Target Date:** 24.10.19 - [ ] **Completed Date:** NA #### Perform Final Test - **Description:** Upgrade the [`staging-lts`](#staging-testnets) testnet runtime to "Rome", and perform a full feature test. - **Manager:** **Martin** - **Tasks:** - See [Tracking Issue 7.](https://github.com/Joystream/joystream/issues/101) - See Final Test Checklist (Add link to Issue) - **Target Date:** 25.10.19 - [ ] **Completed Date:** NA #### Produce Release Checklist - **Description:** Finalize and publish the checklist for the [Release](#release) - **Manager:** **Martin** - **Tasks:** - See [Tracking Issue 7.](https://github.com/Joystream/joystream/issues/101) - See Release Checklist (Add link to Issue) - **Target Date:** 28.10.19 - [ ] **Completed Date:** NA #### Snapshot and Kill Acropolis - **Description:** Take a snapshot of the Acropolis Chain State, and stop all Jsgenesis hosted infrastructure. - **Manager:** **Martin** - **Tasks:** - See [Tracking Issue 6.](https://github.com/Joystream/joystream/issues/100) - See [Tracking Issue 7.](https://github.com/Joystream/joystream/issues/101) - See Release Checklist (Add link to Issue) - **Target Date:** 28.10.19 - [ ] **Completed Date:** NA #### Launch Ready - **Description:** Prepare and finalize **all** communications, website, guides and infrastructure. 
- **Manager:** **Martin** - **Tasks:** - See Release Checklist (Add link to Issue) - **Target Date:** 01.11.19 - [ ] **Completed Date:** NA #### Release - **Description:** Launch and deploy all necessary nodes/tools/infrastructure - **Manager:** **Martin** - **Tasks:** - See Release Checklist (Add link to Issue) - **Target Date:** 04.11.19 - [ ] **Completed Date:** NA
1.0
Rome: Live Milestones - ## Live Milestones The Rome Release Plan contains a set of Milestones, with a target date for each. Experience have shown that: - we might not be able to follow these - we may want to add or remove certain items - we may want to adjust the scope of each milestone This approach, like with `Tracking Issues`, gives us some flexibility and will make it easier to keep track of our progress. <!-- [Rome Release Plan](https://github.com/Joystream/joystream/tree/master/testnets/rome) --> #### Live Milestone Table | Live Date | Event |Completed Date| |:---------:|------------------------------------------------------------------|:------------:| | 21.08.19 | [Rome Announced](#rome-announced) | NA | | 26.08.19 | [Spec Draft](#spec-draft) | NA | | 14.10.19 | [Produce Sub-System Test Checklist](#produce-sub-system-test-checklist) | NA | | 18.10.19 | [Perform Sub-system Test](#perform-sub-system-test) | NA | | 21.10.19 | [Produce Final Test Checklist](#produce-final-test-checklist) | NA | | 24.10.19 | [Spec Release](#spec-release) | NA | | 25.10.19 | [Perform Final Test](#perform-final-test) | NA | | 28.10.19 | [Produce Release Checklist](#produce-release-checklist) | NA | | 28.10.19 | [Snapshot and kill Acropolis](#snapshot-and-kill-acropolis) | NA | | 01.11.19 | [Launch Ready](#launch-ready) | NA | | 04.11.19 | [Release](#release) | NA | ## Milestones #### Rome Announced - **Description:** Announce the Rome upcoming testnet, with release plan and theme, and website update. 
- **Tasks:** - See [Tracking Issue 6.](https://github.com/Joystream/joystream/issues/100) - **Target Date:** 21.08.19 - [ ] **Completed Date:** NA #### Spec Draft - **Description:** Create and mark for review, a PR for the Rome specs - **Tasks:** - See [Tracking Issue 1.](https://github.com/Joystream/joystream/issues/95) - See [Tracking Issue 2.](https://github.com/Joystream/joystream/issues/96) - **Target Date:** 26.08.19 - [ ] **Completed Date:** NA #### Produce Sub-System Test Checklist - **Description:** Finalize and publish the requirements, procedure and checklists for the [Sub-System Test](#perform-sub-system-test) - **Tasks:** - See [Tracking Issue 7.](https://github.com/Joystream/joystream/issues/101) - **Target Date:** 14.10.19 - [ ] **Completed Date:** NA #### Perform Sub-system Test - **Description:** Test all sub-systems/software separately on the [`staging-reckless`](#staging-testnets) testnet(s) - **Manager:** **Martin** - **Tasks:** - See [Tracking Issue 7.](https://github.com/Joystream/joystream/issues/101) - See Sub-System Test Checklist (Add link to Issue) - **Target Date:** 18.10.19 - [ ] **Completed Date:** NA **Note** After this test, only bugfix PRs can be merged to master branch of relevant repos. 
#### Produce Final Test Checklist - **Description:** Finalize and publish the requirements, procedure and checklists for the [Final Test](#perform-final-test) - **Manager:** **Martin** - **Tasks:** - See [Tracking Issue 7.](https://github.com/Joystream/joystream/issues/101) - **Target Date:** 21.10.19 - [ ] **Completed Date:** NA #### Spec Release - **Description:** Release/merge the PR for the Rome specs - **Manager:** **Bedeho** - **Tasks:** - See [Tracking Issue 1.](https://github.com/Joystream/joystream/issues/95) - See [Tracking Issue 2.](https://github.com/Joystream/joystream/issues/96) - See [Tracking Issue 6.](https://github.com/Joystream/joystream/issues/100) - **Target Date:** 24.10.19 - [ ] **Completed Date:** NA #### Perform Final Test - **Description:** Upgrade the [`staging-lts`](#staging-testnets) testnet runtime to "Rome", and perform a full feature test. - **Manager:** **Martin** - **Tasks:** - See [Tracking Issue 7.](https://github.com/Joystream/joystream/issues/101) - See Final Test Checklist (Add link to Issue) - **Target Date:** 25.10.19 - [ ] **Completed Date:** NA #### Produce Release Checklist - **Description:** Finalize and publish the checklist for the [Release](#release) - **Manager:** **Martin** - **Tasks:** - See [Tracking Issue 7.](https://github.com/Joystream/joystream/issues/101) - See Release Checklist (Add link to Issue) - **Target Date:** 28.10.19 - [ ] **Completed Date:** NA #### Snapshot and Kill Acropolis - **Description:** Take a snapshot of the Acropolis Chain State, and stop all Jsgenesis hosted infrastructure. - **Manager:** **Martin** - **Tasks:** - See [Tracking Issue 6.](https://github.com/Joystream/joystream/issues/100) - See [Tracking Issue 7.](https://github.com/Joystream/joystream/issues/101) - See Release Checklist (Add link to Issue) - **Target Date:** 28.10.19 - [ ] **Completed Date:** NA #### Launch Ready - **Description:** Prepare and finalize **all** communications, website, guides and infrastructure. 
- **Manager:** **Martin** - **Tasks:** - See Release Checklist (Add link to Issue) - **Target Date:** 01.11.19 - [ ] **Completed Date:** NA #### Release - **Description:** Launch and deploy all necessary nodes/tools/infrastructure - **Manager:** **Martin** - **Tasks:** - See Release Checklist (Add link to Issue) - **Target Date:** 04.11.19 - [ ] **Completed Date:** NA
non_process
rome live milestones live milestones the rome release plan contains a set of milestones with a target date for each experience have shown that we might not be able to follow these we may want to add or remove certain items we may want to adjust the scope of each milestone this approach like with tracking issues gives us some flexibility and will make it easier to keep track of our progress live milestone table live date event completed date rome announced na spec draft na produce sub system test checklist na perform sub system test na produce final test checklist na spec release na perform final test na produce release checklist na snapshot and kill acropolis na launch ready na release na milestones rome announced description announce the rome upcoming testnet with release plan and theme and website update tasks see target date completed date na spec draft description create and mark for review a pr for the rome specs tasks see see target date completed date na produce sub system test checklist description finalize and publish the requirements procedure and checklists for the perform sub system test tasks see target date completed date na perform sub system test description test all sub systems software separately on the staging testnets testnet s manager martin tasks see see sub system test checklist add link to issue target date completed date na note after this test only bugfix prs can be merged to master branch of relevant repos produce final test checklist description finalize and publish the requirements procedure and checklists for the perform final test manager martin tasks see target date completed date na spec release description release merge the pr for the rome specs manager bedeho tasks see see see target date completed date na perform final test description upgrade the staging testnets testnet runtime to rome and perform a full feature test manager martin tasks see see final test checklist add link to issue target date completed date na produce 
release checklist description finalize and publish the checklist for the release manager martin tasks see see release checklist add link to issue target date completed date na snapshot and kill acropolis description take a snapshot of the acropolis chain state and stop all jsgenesis hosted infrastructure manager martin tasks see see see release checklist add link to issue target date completed date na launch ready description prepare and finalize all communications website guides and infrastructure manager martin tasks see release checklist add link to issue target date completed date na release description launch and deploy all necessary nodes tools infrastructure manager martin tasks see release checklist add link to issue target date completed date na
0
61,344
3,144,707,336
IssuesEvent
2015-09-14 14:40:41
andresriancho/w3af
https://api.github.com/repos/andresriancho/w3af
closed
TypeError: gdbm key must be string, not unicode
bug gui priority:medium
## Version Information ``` Python version: 2.7.10 (default, Jul 13 2015, 12:05:58) [GCC 4.2.1 Compatible Apple LLVM 6.1.0 (clang-602.0.53)] GTK version: 2.24.28 PyGTK version: 2.24.0 w3af version: w3af - Web Application Attack and Audit Framework Version: 1.7.6 Revision: 6e48526969 - 12 Ağu 2015 11:11 Branch: master Local changes: Yes Author: Andres Riancho and the w3af team. ``` ## Traceback ```pytb Traceback (most recent call last): File "/Users/ismailkeskin/bin/Web Application Attack and Audit Framework/w3af/w3af/core/ui/gui/main.py", line 546, in _scan_director func() File "/Users/ismailkeskin/bin/Web Application Attack and Audit Framework/w3af/w3af/core/ui/gui/main.py", line 645, in _scan_start self.set_tabs(True) File "/Users/ismailkeskin/bin/Web Application Attack and Audit Framework/w3af/w3af/core/ui/gui/main.py", line 792, in set_tabs self._set_tab(sensit, _("Results"), scanrun.ScanRunBody) File "/Users/ismailkeskin/bin/Web Application Attack and Audit Framework/w3af/w3af/core/ui/gui/main.py", line 798, in _set_tab newone = realWidget(self.w3af) File "/Users/ismailkeskin/bin/Web Application Attack and Audit Framework/w3af/w3af/core/ui/gui/scanrun.py", line 677, in __init__ kbbrowser = KBBrowser(w3af) File "/Users/ismailkeskin/bin/Web Application Attack and Audit Framework/w3af/w3af/core/ui/gui/scanrun.py", line 208, in __init__ super(KBBrowser, self).__init__(w3af, "pane-kbbrowser", 250) File "/Users/ismailkeskin/bin/Web Application Attack and Audit Framework/w3af/w3af/core/ui/gui/entries.py", line 943, in __init__ _RememberingPane.__init__(self, w3af, widgname, 0, defPos) File "/Users/ismailkeskin/bin/Web Application Attack and Audit Framework/w3af/w3af/core/ui/gui/entries.py", line 893, in __init__ widgname in self.winconfig File "/usr/local/Cellar/python/2.7.10_2/Frameworks/Python.framework/Versions/2.7/lib/python2.7/shelve.py", line 110, in __contains__ return key in self.dict TypeError: gdbm key must be string, not unicode ```
1.0
TypeError: gdbm key must be string, not unicode - ## Version Information ``` Python version: 2.7.10 (default, Jul 13 2015, 12:05:58) [GCC 4.2.1 Compatible Apple LLVM 6.1.0 (clang-602.0.53)] GTK version: 2.24.28 PyGTK version: 2.24.0 w3af version: w3af - Web Application Attack and Audit Framework Version: 1.7.6 Revision: 6e48526969 - 12 Ağu 2015 11:11 Branch: master Local changes: Yes Author: Andres Riancho and the w3af team. ``` ## Traceback ```pytb Traceback (most recent call last): File "/Users/ismailkeskin/bin/Web Application Attack and Audit Framework/w3af/w3af/core/ui/gui/main.py", line 546, in _scan_director func() File "/Users/ismailkeskin/bin/Web Application Attack and Audit Framework/w3af/w3af/core/ui/gui/main.py", line 645, in _scan_start self.set_tabs(True) File "/Users/ismailkeskin/bin/Web Application Attack and Audit Framework/w3af/w3af/core/ui/gui/main.py", line 792, in set_tabs self._set_tab(sensit, _("Results"), scanrun.ScanRunBody) File "/Users/ismailkeskin/bin/Web Application Attack and Audit Framework/w3af/w3af/core/ui/gui/main.py", line 798, in _set_tab newone = realWidget(self.w3af) File "/Users/ismailkeskin/bin/Web Application Attack and Audit Framework/w3af/w3af/core/ui/gui/scanrun.py", line 677, in __init__ kbbrowser = KBBrowser(w3af) File "/Users/ismailkeskin/bin/Web Application Attack and Audit Framework/w3af/w3af/core/ui/gui/scanrun.py", line 208, in __init__ super(KBBrowser, self).__init__(w3af, "pane-kbbrowser", 250) File "/Users/ismailkeskin/bin/Web Application Attack and Audit Framework/w3af/w3af/core/ui/gui/entries.py", line 943, in __init__ _RememberingPane.__init__(self, w3af, widgname, 0, defPos) File "/Users/ismailkeskin/bin/Web Application Attack and Audit Framework/w3af/w3af/core/ui/gui/entries.py", line 893, in __init__ widgname in self.winconfig File "/usr/local/Cellar/python/2.7.10_2/Frameworks/Python.framework/Versions/2.7/lib/python2.7/shelve.py", line 110, in __contains__ return key in self.dict TypeError: gdbm key must 
be string, not unicode ```
non_process
typeerror gdbm key must be string not unicode version information python version default jul gtk version pygtk version version web application attack and audit framework version revision ağu branch master local changes yes author andres riancho and the team traceback pytb traceback most recent call last file users ismailkeskin bin web application attack and audit framework core ui gui main py line in scan director func file users ismailkeskin bin web application attack and audit framework core ui gui main py line in scan start self set tabs true file users ismailkeskin bin web application attack and audit framework core ui gui main py line in set tabs self set tab sensit results scanrun scanrunbody file users ismailkeskin bin web application attack and audit framework core ui gui main py line in set tab newone realwidget self file users ismailkeskin bin web application attack and audit framework core ui gui scanrun py line in init kbbrowser kbbrowser file users ismailkeskin bin web application attack and audit framework core ui gui scanrun py line in init super kbbrowser self init pane kbbrowser file users ismailkeskin bin web application attack and audit framework core ui gui entries py line in init rememberingpane init self widgname defpos file users ismailkeskin bin web application attack and audit framework core ui gui entries py line in init widgname in self winconfig file usr local cellar python frameworks python framework versions lib shelve py line in contains return key in self dict typeerror gdbm key must be string not unicode
0
677,612
23,167,734,880
IssuesEvent
2022-07-30 07:50:12
ObsidianMC/Obsidian
https://api.github.com/repos/ObsidianMC/Obsidian
closed
Implement new chat message/command changes (1.19 branch)
enhancement help wanted good first issue priority: high networking
As the title suggests I'm opening up this issue for someone who wants to do this before I get to it myself. Helpful Links: https://wiki.vg/images/f/f4/MinecraftChat.drawio4.png https://wiki.vg/Chat#Processing_chat Packets that should be looked at (Classes for these packets are created already) https://wiki.vg/Protocol#Chat_Message https://wiki.vg/Protocol#Player_Chat_Message https://wiki.vg/Protocol#Chat_Command This should work reliably with online mode enabled/disabled
1.0
Implement new chat message/command changes (1.19 branch) - As the title suggests I'm opening up this issue for someone who wants to do this before I get to it myself. Helpful Links: https://wiki.vg/images/f/f4/MinecraftChat.drawio4.png https://wiki.vg/Chat#Processing_chat Packets that should be looked at (Classes for these packets are created already) https://wiki.vg/Protocol#Chat_Message https://wiki.vg/Protocol#Player_Chat_Message https://wiki.vg/Protocol#Chat_Command This should work reliably with online mode enabled/disabled
non_process
implement new chat message command changes branch as the title suggests i m opening up this issue for someone who wants to do this before i get to it myself helpful links packets that should be looked at classes for these packets are created already this should work reliably with online mode enabled disabled
0
20,624
27,295,616,569
IssuesEvent
2023-02-23 20:04:00
googleapis/google-cloudevents-python
https://api.github.com/repos/googleapis/google-cloudevents-python
opened
Warning: a recent release failed
type: process
The following release PRs may have failed: * #187 - The release job was triggered, but has not reported back success. * #185 - The release job was triggered, but has not reported back success. * #183 - The release job was triggered, but has not reported back success. * #181 - The release job was triggered, but has not reported back success. * #136 - The release job was triggered, but has not reported back success.
1.0
Warning: a recent release failed - The following release PRs may have failed: * #187 - The release job was triggered, but has not reported back success. * #185 - The release job was triggered, but has not reported back success. * #183 - The release job was triggered, but has not reported back success. * #181 - The release job was triggered, but has not reported back success. * #136 - The release job was triggered, but has not reported back success.
process
warning a recent release failed the following release prs may have failed the release job was triggered but has not reported back success the release job was triggered but has not reported back success the release job was triggered but has not reported back success the release job was triggered but has not reported back success the release job was triggered but has not reported back success
1
117,270
15,085,387,511
IssuesEvent
2021-02-05 18:35:31
uswds/uswds-team
https://api.github.com/repos/uswds/uswds-team
closed
Valentine's- draft comms plan and graphic updates
outreach skill: communications skill: ux design
Pre-existing designed cards will be pushed socially for Valentine's Day: Feb 10, 11, and 12. Graphics need to be updated, comms plan developed. Materials for approval located [here ](https://drive.google.com/drive/folders/1aFmUfT7uTGuZu8EhMNtA13b5dMgRqHys?usp=sharing). posted for [review ](https://gsa-tts.slack.com/archives/C050HRGN7/p1612371472063400 ) 2/3/21 Tasks to complete per Ammie 1/28: - [x] Design: K.Carr to update existing cards. (name changes from design `standard` to `system`, add logo. no new cards need to be created. for the "violet" card, replace missing image w royalty free new version, use color tokens color instead of what is there now. Sync up on James w this as needed. - [x] Comms: Katie/Kathryn will create a comms plan that indicates platforms and includes copy associated with posts on Feb, 10, 11, 12. Comms copy should link to site or video content, or link to the component. For example, the color card could link to a video that Dan had previously done on color tokens. - [x] approval: EOD Tuesday 2/2 or early Wed morning 2/3 so that it can go into comms approval process. DoD - [x] all tasks complete and delivered to Ammie - [x] Comms plan execution issue created #61
1.0
Valentine's- draft comms plan and graphic updates - Pre-existing designed cards will be pushed socially for Valentine's Day: Feb 10, 11, and 12. Graphics need to be updated, comms plan developed. Materials for approval located [here ](https://drive.google.com/drive/folders/1aFmUfT7uTGuZu8EhMNtA13b5dMgRqHys?usp=sharing). posted for [review ](https://gsa-tts.slack.com/archives/C050HRGN7/p1612371472063400 ) 2/3/21 Tasks to complete per Ammie 1/28: - [x] Design: K.Carr to update existing cards. (name changes from design `standard` to `system`, add logo. no new cards need to be created. for the "violet" card, replace missing image w royalty free new version, use color tokens color instead of what is there now. Sync up on James w this as needed. - [x] Comms: Katie/Kathryn will create a comms plan that indicates platforms and includes copy associated with posts on Feb, 10, 11, 12. Comms copy should link to site or video content, or link to the component. For example, the color card could link to a video that Dan had previously done on color tokens. - [x] approval: EOD Tuesday 2/2 or early Wed morning 2/3 so that it can go into comms approval process. DoD - [x] all tasks complete and delivered to Ammie - [x] Comms plan execution issue created #61
non_process
valentine s draft comms plan and graphic updates pre existing designed cards will be pushed socially for valentine s day feb and graphics need to be updated comms plan developed materials for approval located posted for tasks to complete per ammie design k carr to update existing cards name changes from design standard to system add logo no new cards need to be created for the violet card replace missing image w royalty free new version use color tokens color instead of what is there now sync up on james w this as needed comms katie kathryn will create a comms plan that indicates platforms and includes copy associated with posts on feb comms copy should link to site or video content or link to the component for example the color card could link to a video that dan had previously done on color tokens approval eod tuesday or early wed morning so that it can go into comms approval process dod all tasks complete and delivered to ammie comms plan execution issue created
0
45,430
13,112,516,799
IssuesEvent
2020-08-05 02:27:00
ElliotChen/spring_boot_example
https://api.github.com/repos/ElliotChen/spring_boot_example
opened
CVE-2019-8331 (Medium) detected in bootstrap-3.3.6.min.js, bootstrap-3.3.6.js
security vulnerability
## CVE-2019-8331 - Medium Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Libraries - <b>bootstrap-3.3.6.min.js</b>, <b>bootstrap-3.3.6.js</b></p></summary> <p> <details><summary><b>bootstrap-3.3.6.min.js</b></p></summary> <p>The most popular front-end framework for developing responsive, mobile first projects on the web.</p> <p>Library home page: <a href="https://cdnjs.cloudflare.com/ajax/libs/twitter-bootstrap/3.3.6/js/bootstrap.min.js">https://cdnjs.cloudflare.com/ajax/libs/twitter-bootstrap/3.3.6/js/bootstrap.min.js</a></p> <p>Path to vulnerable library: /spring_boot_example/08adminlte/src/main/resources/static/bootstrap/js/bootstrap.min.js</p> <p> Dependency Hierarchy: - :x: **bootstrap-3.3.6.min.js** (Vulnerable Library) </details> <details><summary><b>bootstrap-3.3.6.js</b></p></summary> <p>The most popular front-end framework for developing responsive, mobile first projects on the web.</p> <p>Library home page: <a href="https://cdnjs.cloudflare.com/ajax/libs/twitter-bootstrap/3.3.6/js/bootstrap.js">https://cdnjs.cloudflare.com/ajax/libs/twitter-bootstrap/3.3.6/js/bootstrap.js</a></p> <p>Path to vulnerable library: /spring_boot_example/08adminlte/src/main/resources/static/bootstrap/js/bootstrap.js</p> <p> Dependency Hierarchy: - :x: **bootstrap-3.3.6.js** (Vulnerable Library) </details> <p>Found in HEAD commit: <a href="https://github.com/ElliotChen/spring_boot_example/commit/a17305aeab18750ac1778b2cf27d7352a737ba33">a17305aeab18750ac1778b2cf27d7352a737ba33</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary> <p> In Bootstrap before 3.4.1 and 4.3.x before 4.3.1, XSS is possible in the tooltip or popover data-template attribute. 
<p>Publish Date: 2019-02-20 <p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2019-8331>CVE-2019-8331</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>6.1</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: Low - Privileges Required: None - User Interaction: Required - Scope: Changed - Impact Metrics: - Confidentiality Impact: Low - Integrity Impact: Low - Availability Impact: None </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://github.com/twbs/bootstrap/pull/28236">https://github.com/twbs/bootstrap/pull/28236</a></p> <p>Release Date: 2019-02-20</p> <p>Fix Resolution: bootstrap - 3.4.1,4.3.1;bootstrap-sass - 3.4.1,4.3.1</p> </p> </details> <p></p> *** Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
True
CVE-2019-8331 (Medium) detected in bootstrap-3.3.6.min.js, bootstrap-3.3.6.js - ## CVE-2019-8331 - Medium Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Libraries - <b>bootstrap-3.3.6.min.js</b>, <b>bootstrap-3.3.6.js</b></p></summary> <p> <details><summary><b>bootstrap-3.3.6.min.js</b></p></summary> <p>The most popular front-end framework for developing responsive, mobile first projects on the web.</p> <p>Library home page: <a href="https://cdnjs.cloudflare.com/ajax/libs/twitter-bootstrap/3.3.6/js/bootstrap.min.js">https://cdnjs.cloudflare.com/ajax/libs/twitter-bootstrap/3.3.6/js/bootstrap.min.js</a></p> <p>Path to vulnerable library: /spring_boot_example/08adminlte/src/main/resources/static/bootstrap/js/bootstrap.min.js</p> <p> Dependency Hierarchy: - :x: **bootstrap-3.3.6.min.js** (Vulnerable Library) </details> <details><summary><b>bootstrap-3.3.6.js</b></p></summary> <p>The most popular front-end framework for developing responsive, mobile first projects on the web.</p> <p>Library home page: <a href="https://cdnjs.cloudflare.com/ajax/libs/twitter-bootstrap/3.3.6/js/bootstrap.js">https://cdnjs.cloudflare.com/ajax/libs/twitter-bootstrap/3.3.6/js/bootstrap.js</a></p> <p>Path to vulnerable library: /spring_boot_example/08adminlte/src/main/resources/static/bootstrap/js/bootstrap.js</p> <p> Dependency Hierarchy: - :x: **bootstrap-3.3.6.js** (Vulnerable Library) </details> <p>Found in HEAD commit: <a href="https://github.com/ElliotChen/spring_boot_example/commit/a17305aeab18750ac1778b2cf27d7352a737ba33">a17305aeab18750ac1778b2cf27d7352a737ba33</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary> <p> In Bootstrap before 3.4.1 and 4.3.x before 4.3.1, XSS is possible in the tooltip or popover data-template attribute. 
<p>Publish Date: 2019-02-20 <p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2019-8331>CVE-2019-8331</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>6.1</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: Low - Privileges Required: None - User Interaction: Required - Scope: Changed - Impact Metrics: - Confidentiality Impact: Low - Integrity Impact: Low - Availability Impact: None </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://github.com/twbs/bootstrap/pull/28236">https://github.com/twbs/bootstrap/pull/28236</a></p> <p>Release Date: 2019-02-20</p> <p>Fix Resolution: bootstrap - 3.4.1,4.3.1;bootstrap-sass - 3.4.1,4.3.1</p> </p> </details> <p></p> *** Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
non_process
cve medium detected in bootstrap min js bootstrap js cve medium severity vulnerability vulnerable libraries bootstrap min js bootstrap js bootstrap min js the most popular front end framework for developing responsive mobile first projects on the web library home page a href path to vulnerable library spring boot example src main resources static bootstrap js bootstrap min js dependency hierarchy x bootstrap min js vulnerable library bootstrap js the most popular front end framework for developing responsive mobile first projects on the web library home page a href path to vulnerable library spring boot example src main resources static bootstrap js bootstrap js dependency hierarchy x bootstrap js vulnerable library found in head commit a href vulnerability details in bootstrap before and x before xss is possible in the tooltip or popover data template attribute publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction required scope changed impact metrics confidentiality impact low integrity impact low availability impact none for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution bootstrap bootstrap sass step up your open source security game with whitesource
0
5,231
8,030,837,037
IssuesEvent
2018-07-27 21:16:05
brucemiller/LaTeXML
https://api.github.com/repos/brucemiller/LaTeXML
closed
bold and italic markup is sometimes lost in index terms
bug postprocessing
When converting ```tex \documentclass{article} \begin{document} \textbf{BOLD\index{NORMAL}} \textbf{BOLD\index{\textbf{BOLD}}} \textbf{BOLD\index{\textit{ITALIC}}} \textit{ITALIC\index{NORMAL}} \textit{ITALIC\index{\textbf{BOLD}}} \textit{ITALIC\index{\textit{ITALIC}}} {NORMAL\index{NORMAL}} {NORMAL\index{\textbf{BOLD}}} {NORMAL\index{\textit{ITALIC}}} \end{document} ``` with LaTeXML version 0.8.2; revision 48de2c4 ```sh latexml --nocomments test.tex ``` I get ```xml [...] <para xml:id="p1"> <p><text font="bold">BOLD<indexmark> <indexphrase key="NORMAL">NORMAL</indexphrase> </indexmark></text></p> </para> <para xml:id="p2"> <p><text font="bold">BOLD<indexmark> <indexphrase key="BOLD">BOLD</indexphrase> </indexmark></text></p> </para> <para xml:id="p3"> <p><text font="bold">BOLD<indexmark> <indexphrase key="ITALIC"><text font="italic">ITALIC</text></indexphrase> </indexmark></text></p> </para> <para xml:id="p4"> <p><text font="italic">ITALIC<indexmark> <indexphrase key="NORMAL">NORMAL</indexphrase> </indexmark></text></p> </para> <para xml:id="p5"> <p><text font="italic">ITALIC<indexmark> <indexphrase key="BOLD"><text font="bold">BOLD</text></indexphrase> </indexmark></text></p> </para> <para xml:id="p6"> <p><text font="italic">ITALIC<indexmark> <indexphrase key="ITALIC">ITALIC</indexphrase> </indexmark></text></p> </para> <para xml:id="p7"> <p>NORMAL <indexmark> <indexphrase key="NORMAL">NORMAL</indexphrase> </indexmark></p> </para> <para xml:id="p8"> <p>NORMAL <indexmark> <indexphrase key="BOLD"><text font="bold">BOLD</text></indexphrase> </indexmark></p> </para> <para xml:id="p9"> <p>NORMAL <indexmark> <indexphrase key="ITALIC"><text font="italic">ITALIC</text></indexphrase> </indexmark></p> </para> [...] 
``` In `<para xml:id="p2">` we have ```xml <indexphrase key="BOLD">BOLD</indexphrase> ``` while in `<para xml:id="p5">` and `<para xml:id="p8">` we have ```xml <indexphrase key="BOLD"><text font="bold">BOLD</text></indexphrase> ``` Also how would one distinguish `<para xml:id="p1">` from `<para xml:id="p2">`? I am not sure what the expected behaviour would be. Depending on whether `text[@font='bold']/indexmark/indexphrase/text()` is supposed to inherit the boldness from the `text` ancestor I would expect ```xml <indexphrase key="NORMAL"><text font="medium">NORMAL</text></indexphrase> ``` in `<para xml:id="p1">` or ```xml <indexphrase key="BOLD"><text font="bold">BOLD</text></indexphrase> ``` in `<para xml:id="p2">`. italic markup shows an analog problem. I have not tested with `\emph{}`
1.0
bold and italic markup is sometimes lost in index terms - When converting ```tex \documentclass{article} \begin{document} \textbf{BOLD\index{NORMAL}} \textbf{BOLD\index{\textbf{BOLD}}} \textbf{BOLD\index{\textit{ITALIC}}} \textit{ITALIC\index{NORMAL}} \textit{ITALIC\index{\textbf{BOLD}}} \textit{ITALIC\index{\textit{ITALIC}}} {NORMAL\index{NORMAL}} {NORMAL\index{\textbf{BOLD}}} {NORMAL\index{\textit{ITALIC}}} \end{document} ``` with LaTeXML version 0.8.2; revision 48de2c4 ```sh latexml --nocomments test.tex ``` I get ```xml [...] <para xml:id="p1"> <p><text font="bold">BOLD<indexmark> <indexphrase key="NORMAL">NORMAL</indexphrase> </indexmark></text></p> </para> <para xml:id="p2"> <p><text font="bold">BOLD<indexmark> <indexphrase key="BOLD">BOLD</indexphrase> </indexmark></text></p> </para> <para xml:id="p3"> <p><text font="bold">BOLD<indexmark> <indexphrase key="ITALIC"><text font="italic">ITALIC</text></indexphrase> </indexmark></text></p> </para> <para xml:id="p4"> <p><text font="italic">ITALIC<indexmark> <indexphrase key="NORMAL">NORMAL</indexphrase> </indexmark></text></p> </para> <para xml:id="p5"> <p><text font="italic">ITALIC<indexmark> <indexphrase key="BOLD"><text font="bold">BOLD</text></indexphrase> </indexmark></text></p> </para> <para xml:id="p6"> <p><text font="italic">ITALIC<indexmark> <indexphrase key="ITALIC">ITALIC</indexphrase> </indexmark></text></p> </para> <para xml:id="p7"> <p>NORMAL <indexmark> <indexphrase key="NORMAL">NORMAL</indexphrase> </indexmark></p> </para> <para xml:id="p8"> <p>NORMAL <indexmark> <indexphrase key="BOLD"><text font="bold">BOLD</text></indexphrase> </indexmark></p> </para> <para xml:id="p9"> <p>NORMAL <indexmark> <indexphrase key="ITALIC"><text font="italic">ITALIC</text></indexphrase> </indexmark></p> </para> [...] 
``` In `<para xml:id="p2">` we have ```xml <indexphrase key="BOLD">BOLD</indexphrase> ``` while in `<para xml:id="p5">` and `<para xml:id="p8">` we have ```xml <indexphrase key="BOLD"><text font="bold">BOLD</text></indexphrase> ``` Also, how would one distinguish `<para xml:id="p1">` from `<para xml:id="p2">`? I am not sure what the expected behaviour would be. Depending on whether `text[@font='bold']/indexmark/indexphrase/text()` is supposed to inherit the boldness from the `text` ancestor, I would expect ```xml <indexphrase key="NORMAL"><text font="medium">NORMAL</text></indexphrase> ``` in `<para xml:id="p1">` or ```xml <indexphrase key="BOLD"><text font="bold">BOLD</text></indexphrase> ``` in `<para xml:id="p2">`. Italic markup shows an analogous problem. I have not tested with `\emph{}`.
process
bold and italic markup is sometimes lost in index terms when converting tex documentclass article begin document textbf bold index normal textbf bold index textbf bold textbf bold index textit italic textit italic index normal textit italic index textbf bold textit italic index textit italic normal index normal normal index textbf bold normal index textit italic end document with latexml version revision sh latexml nocomments test tex i get xml bold normal bold bold bold italic italic normal italic bold italic italic normal normal normal bold normal italic in we have xml bold while in and we have xml bold also how would one distinguish from i am not sure what the expected behaviour would be depending on whether text indexmark indexphrase text is supposed to inherit the boldness from the text ancestor i would expect xml normal in or xml bold in italic markup shows an analog problem i have not tested with emph
1
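The asymmetry reported above (the `<indexphrase>` in `p2` loses its `<text font="bold">` wrapper while `p5` and `p8` keep it) can be illustrated with a small check. This is an illustrative sketch, not LaTeXML code: a real pipeline would use an XML parser, but a regex is enough to show the inconsistency on these small snippets.

```javascript
// The two <indexphrase> variants quoted in the report: p2 dropped the
// <text font="bold"> wrapper, p5 kept it.
const p2 = '<indexphrase key="BOLD">BOLD</indexphrase>';
const p5 = '<indexphrase key="BOLD"><text font="bold">BOLD</text></indexphrase>';

// True when the index phrase still carries an inner <text font="..."> element.
const keepsFontMarkup = (s) => /<indexphrase[^>]*><text font="/.test(s);

console.log(keepsFontMarkup(p2)); // false - the bold wrapper was dropped
console.log(keepsFontMarkup(p5)); // true  - the bold wrapper survived
```

If the output were consistent, both calls would return the same value for the same `\textbf{...}` input, which is exactly what the report says does not happen.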
264,540
23,123,697,585
IssuesEvent
2022-07-28 01:51:57
milvus-io/milvus
https://api.github.com/repos/milvus-io/milvus
closed
[Bug]: [benchmark][cluster]Milvus 2 replicas concurrently with 2 clients, search, query, load latency becomes higher
kind/bug priority/critical-urgent needs-triage test/benchmark
### Is there an existing issue for this? - [X] I have searched the existing issues ### Environment ```markdown - Milvus version:2.1.0-20220723-1e038a75 - Deployment mode(standalone or cluster): cluster - SDK version(e.g. pymilvus v2.0.0rc2):pymilvus 2.1.0dev103 - OS(Ubuntu or CentOS): - CPU/Memory: - GPU: - Others: ``` ### Current Behavior server-instance fouram-tag-no-clean-xrq2m-1 server-configmap server-cluster-8c64m-querynode2 client-configmap client-random-locust-search-filter-100m-ddl-r8-w2-rep2-36h ``` fouram-tag-no-clean-xrq2m-1-etcd-0 1/1 Running 0 72m 10.104.6.42 4am-node13 <none> <none> fouram-tag-no-clean-xrq2m-1-etcd-1 1/1 Running 0 73m 10.104.5.6 4am-node12 <none> <none> fouram-tag-no-clean-xrq2m-1-etcd-2 1/1 Running 0 74m 10.104.4.17 4am-node11 <none> <none> fouram-tag-no-clean-xrq2m-1-milvus-datacoord-586955f8cd-9gvlc 1/1 Running 0 42h 10.104.6.112 4am-node13 <none> <none> fouram-tag-no-clean-xrq2m-1-milvus-datanode-796458bcf5-r9m2x 1/1 Running 0 42h 10.104.6.111 4am-node13 <none> <none> fouram-tag-no-clean-xrq2m-1-milvus-indexcoord-5cf55b44c9-whgp2 1/1 Running 0 42h 10.104.1.91 4am-node10 <none> <none> fouram-tag-no-clean-xrq2m-1-milvus-indexnode-9d6cfc9c8-h47x6 1/1 Running 0 42h 10.104.4.47 4am-node11 <none> <none> fouram-tag-no-clean-xrq2m-1-milvus-proxy-dc78686c-rjknq 1/1 Running 0 42h 10.104.1.96 4am-node10 <none> <none> fouram-tag-no-clean-xrq2m-1-milvus-querycoord-865599ff4d-nlxl8 1/1 Running 0 42h 10.104.6.114 4am-node13 <none> <none> fouram-tag-no-clean-xrq2m-1-milvus-querynode-668fc4776-bh5gl 1/1 Running 0 42h 10.104.1.94 4am-node10 <none> <none> fouram-tag-no-clean-xrq2m-1-milvus-querynode-668fc4776-llbxp 1/1 Running 0 42h 10.104.9.225 4am-node14 <none> <none> fouram-tag-no-clean-xrq2m-1-milvus-rootcoord-5f8cfc9dbb-l7rcd 1/1 Running 0 42h 10.104.1.90 4am-node10 <none> <none> fouram-tag-no-clean-xrq2m-1-minio-0 1/1 Running 0 42h 10.104.6.118 4am-node13 <none> <none> fouram-tag-no-clean-xrq2m-1-minio-1 1/1 Running 0 42h 10.104.5.200 
4am-node12 <none> <none> fouram-tag-no-clean-xrq2m-1-minio-2 1/1 Running 0 42h 10.104.4.52 4am-node11 <none> <none> fouram-tag-no-clean-xrq2m-1-minio-3 1/1 Running 0 42h 10.104.9.227 4am-node14 <none> <none> fouram-tag-no-clean-xrq2m-1-pulsar-bookie-0 1/1 Running 0 42h 10.104.6.120 4am-node13 <none> <none> fouram-tag-no-clean-xrq2m-1-pulsar-bookie-1 1/1 Running 0 42h 10.104.4.53 4am-node11 <none> <none> fouram-tag-no-clean-xrq2m-1-pulsar-bookie-2 1/1 Running 0 42h 10.104.5.205 4am-node12 <none> <none> fouram-tag-no-clean-xrq2m-1-pulsar-broker-0 1/1 Running 0 42h 10.104.4.48 4am-node11 <none> <none> fouram-tag-no-clean-xrq2m-1-pulsar-proxy-0 1/1 Running 0 42h 10.104.1.95 4am-node10 <none> <none> fouram-tag-no-clean-xrq2m-1-pulsar-recovery-0 1/1 Running 0 42h 10.104.6.113 4am-node13 <none> <none> fouram-tag-no-clean-xrq2m-1-pulsar-zookeeper-0 1/1 Running 0 42h 10.104.1.98 4am-node10 <none> <none> fouram-tag-no-clean-xrq2m-1-pulsar-zookeeper-1 1/1 Running 0 42h 10.104.6.123 4am-node13 <none> <none> fouram-tag-no-clean-xrq2m-1-pulsar-zookeeper-2 1/1 Running 0 42h 10.104.4.58 4am-node11 <none> <none> ``` search latency: <img width="1391" alt="截屏2022-07-25 16 33 33" src="https://user-images.githubusercontent.com/34296482/180734104-89e84e90-c44f-40ea-9ca6-6657abd41eba.png"> query: <img width="1382" alt="截屏2022-07-25 16 33 48" src="https://user-images.githubusercontent.com/34296482/180734152-e6a7a1f6-c773-4d46-bd80-de832ad1adb7.png"> load: <img width="1345" alt="截屏2022-07-25 16 34 35" src="https://user-images.githubusercontent.com/34296482/180734247-cbdc4f75-14ef-4971-8297-e9b48e08bdc3.png"> server-instance fouram-tag-no-clean-xrq2m-1 server-configmap server-cluster-8c64m-querynode2 client-configmap client-random-locust-search-filter-100m-ddl-r8-w2-replica2-con uses 2 clients search: <img width="1373" alt="截屏2022-07-25 16 35 53" src="https://user-images.githubusercontent.com/34296482/180734639-31bd292f-b0d3-41e8-bfab-9d5e1c88a582.png"> query: <img width="1382" 
alt="截屏2022-07-25 16 36 01" src="https://user-images.githubusercontent.com/34296482/180734680-3ddad331-aa4a-4240-9913-0880942a1e61.png"> load: <img width="1346" alt="截屏2022-07-25 16 36 10" src="https://user-images.githubusercontent.com/34296482/180734714-4915813d-60f4-4466-8dae-00beb14df1ec.png"> ### Expected Behavior _No response_ ### Steps To Reproduce _No response_ ### Milvus Log _No response_ ### Anything else? _No response_
1.0
[Bug]: [benchmark][cluster]Milvus 2 replicas concurrently with 2 clients, search, query, load latency becomes higher - ### Is there an existing issue for this? - [X] I have searched the existing issues ### Environment ```markdown - Milvus version:2.1.0-20220723-1e038a75 - Deployment mode(standalone or cluster): cluster - SDK version(e.g. pymilvus v2.0.0rc2):pymilvus 2.1.0dev103 - OS(Ubuntu or CentOS): - CPU/Memory: - GPU: - Others: ``` ### Current Behavior server-instance fouram-tag-no-clean-xrq2m-1 server-configmap server-cluster-8c64m-querynode2 client-configmap client-random-locust-search-filter-100m-ddl-r8-w2-rep2-36h ``` fouram-tag-no-clean-xrq2m-1-etcd-0 1/1 Running 0 72m 10.104.6.42 4am-node13 <none> <none> fouram-tag-no-clean-xrq2m-1-etcd-1 1/1 Running 0 73m 10.104.5.6 4am-node12 <none> <none> fouram-tag-no-clean-xrq2m-1-etcd-2 1/1 Running 0 74m 10.104.4.17 4am-node11 <none> <none> fouram-tag-no-clean-xrq2m-1-milvus-datacoord-586955f8cd-9gvlc 1/1 Running 0 42h 10.104.6.112 4am-node13 <none> <none> fouram-tag-no-clean-xrq2m-1-milvus-datanode-796458bcf5-r9m2x 1/1 Running 0 42h 10.104.6.111 4am-node13 <none> <none> fouram-tag-no-clean-xrq2m-1-milvus-indexcoord-5cf55b44c9-whgp2 1/1 Running 0 42h 10.104.1.91 4am-node10 <none> <none> fouram-tag-no-clean-xrq2m-1-milvus-indexnode-9d6cfc9c8-h47x6 1/1 Running 0 42h 10.104.4.47 4am-node11 <none> <none> fouram-tag-no-clean-xrq2m-1-milvus-proxy-dc78686c-rjknq 1/1 Running 0 42h 10.104.1.96 4am-node10 <none> <none> fouram-tag-no-clean-xrq2m-1-milvus-querycoord-865599ff4d-nlxl8 1/1 Running 0 42h 10.104.6.114 4am-node13 <none> <none> fouram-tag-no-clean-xrq2m-1-milvus-querynode-668fc4776-bh5gl 1/1 Running 0 42h 10.104.1.94 4am-node10 <none> <none> fouram-tag-no-clean-xrq2m-1-milvus-querynode-668fc4776-llbxp 1/1 Running 0 42h 10.104.9.225 4am-node14 <none> <none> fouram-tag-no-clean-xrq2m-1-milvus-rootcoord-5f8cfc9dbb-l7rcd 1/1 Running 0 42h 10.104.1.90 4am-node10 <none> <none> fouram-tag-no-clean-xrq2m-1-minio-0 1/1 Running 
0 42h 10.104.6.118 4am-node13 <none> <none> fouram-tag-no-clean-xrq2m-1-minio-1 1/1 Running 0 42h 10.104.5.200 4am-node12 <none> <none> fouram-tag-no-clean-xrq2m-1-minio-2 1/1 Running 0 42h 10.104.4.52 4am-node11 <none> <none> fouram-tag-no-clean-xrq2m-1-minio-3 1/1 Running 0 42h 10.104.9.227 4am-node14 <none> <none> fouram-tag-no-clean-xrq2m-1-pulsar-bookie-0 1/1 Running 0 42h 10.104.6.120 4am-node13 <none> <none> fouram-tag-no-clean-xrq2m-1-pulsar-bookie-1 1/1 Running 0 42h 10.104.4.53 4am-node11 <none> <none> fouram-tag-no-clean-xrq2m-1-pulsar-bookie-2 1/1 Running 0 42h 10.104.5.205 4am-node12 <none> <none> fouram-tag-no-clean-xrq2m-1-pulsar-broker-0 1/1 Running 0 42h 10.104.4.48 4am-node11 <none> <none> fouram-tag-no-clean-xrq2m-1-pulsar-proxy-0 1/1 Running 0 42h 10.104.1.95 4am-node10 <none> <none> fouram-tag-no-clean-xrq2m-1-pulsar-recovery-0 1/1 Running 0 42h 10.104.6.113 4am-node13 <none> <none> fouram-tag-no-clean-xrq2m-1-pulsar-zookeeper-0 1/1 Running 0 42h 10.104.1.98 4am-node10 <none> <none> fouram-tag-no-clean-xrq2m-1-pulsar-zookeeper-1 1/1 Running 0 42h 10.104.6.123 4am-node13 <none> <none> fouram-tag-no-clean-xrq2m-1-pulsar-zookeeper-2 1/1 Running 0 42h 10.104.4.58 4am-node11 <none> <none> ``` search latency: <img width="1391" alt="截屏2022-07-25 16 33 33" src="https://user-images.githubusercontent.com/34296482/180734104-89e84e90-c44f-40ea-9ca6-6657abd41eba.png"> query: <img width="1382" alt="截屏2022-07-25 16 33 48" src="https://user-images.githubusercontent.com/34296482/180734152-e6a7a1f6-c773-4d46-bd80-de832ad1adb7.png"> load: <img width="1345" alt="截屏2022-07-25 16 34 35" src="https://user-images.githubusercontent.com/34296482/180734247-cbdc4f75-14ef-4971-8297-e9b48e08bdc3.png"> server-instance fouram-tag-no-clean-xrq2m-1 server-configmap server-cluster-8c64m-querynode2 client-configmap client-random-locust-search-filter-100m-ddl-r8-w2-replica2-con uses 2 clients search: <img width="1373" alt="截屏2022-07-25 16 35 53" 
src="https://user-images.githubusercontent.com/34296482/180734639-31bd292f-b0d3-41e8-bfab-9d5e1c88a582.png"> query: <img width="1382" alt="截屏2022-07-25 16 36 01" src="https://user-images.githubusercontent.com/34296482/180734680-3ddad331-aa4a-4240-9913-0880942a1e61.png"> load: <img width="1346" alt="截屏2022-07-25 16 36 10" src="https://user-images.githubusercontent.com/34296482/180734714-4915813d-60f4-4466-8dae-00beb14df1ec.png"> ### Expected Behavior _No response_ ### Steps To Reproduce _No response_ ### Milvus Log _No response_ ### Anything else? _No response_
non_process
milvus replicas concurrently with clients search query load latency becomes higher is there an existing issue for this i have searched the existing issues environment markdown milvus version deployment mode standalone or cluster cluster sdk version e g pymilvus pymilvus os ubuntu or centos cpu memory gpu others current behavior server instance fouram tag no clean server configmap server cluster client configmap client random locust search filter ddl fouram tag no clean etcd running fouram tag no clean etcd running fouram tag no clean etcd running fouram tag no clean milvus datacoord running fouram tag no clean milvus datanode running fouram tag no clean milvus indexcoord running fouram tag no clean milvus indexnode running fouram tag no clean milvus proxy rjknq running fouram tag no clean milvus querycoord running fouram tag no clean milvus querynode running fouram tag no clean milvus querynode llbxp running fouram tag no clean milvus rootcoord running fouram tag no clean minio running fouram tag no clean minio running fouram tag no clean minio running fouram tag no clean minio running fouram tag no clean pulsar bookie running fouram tag no clean pulsar bookie running fouram tag no clean pulsar bookie running fouram tag no clean pulsar broker running fouram tag no clean pulsar proxy running fouram tag no clean pulsar recovery running fouram tag no clean pulsar zookeeper running fouram tag no clean pulsar zookeeper running fouram tag no clean pulsar zookeeper running search latency: img width alt src query: img width alt src load: img width alt src server instance fouram tag no clean server configmap server cluster client configmap client random locust search filter ddl con uses clients search: img width alt src query: img width alt src load: img width alt src expected behavior no response steps to reproduce no response milvus log no response anything else no response
0
20,903
27,742,619,298
IssuesEvent
2023-03-15 15:10:35
toggl/track-windows-feedback
https://api.github.com/repos/toggl/track-windows-feedback
closed
Memory leak related to the UI (300-850MB+ RAM usage)
bug processed
**Describe the bug** RAM isn't being flushed. `webFrame.clearCache()` on every screen change might solve it. **Steps to reproduce** Moving around the UI fast and repeatedly makes the RAM usage increase and never goes down to its base level of about 130MB. - About 5 GB of free system RAM when I was running, so, probably any automatic garbage collection would not kick in. - If you switch between the `List` and `Calendar` tab, vigorously repeatedly, by the human hand, it does the most damage. - Also, it got stuck eventually saying "Syncing" on top when I reached 850MB and just gave up on clicking anymore; after that, it became unresponsive. I still had enough free system RAM, and I had to force close it. Perhaps, unrelated to RAM usage? - Just closing the UI into the system tray has no effect to recover the RAM usage. **Expected behavior** The older `(Version 7.5.454)` app doesn't seem to use more than about 100MB under any circumstances (from its base level of about 75MB when it starts up fresh); it tends to recover fast after the usage increases a bit. **Possible Solution:** Running the open source software (available on github) [`Mem Reduct`](https://github.com/henrypp/memreduct) recovered the RAM, of both the old application (down to 30MB from 70MB from base level even), and also of every single Chrome/Electron instance running on my system (Google Chrome & VScode). I believe your application uses Electron, if I'm not mistaken, so it explains the behaviour of the RAM usage. Perhaps there is a way to force an Electron instance to clean up, like how that `Mem Reduct` app does system-wide. From what I found online, using `webFrame.clearCache()` on every screen change on single page apps flushes it out. So, I believe there is hope to easily fix it, since it did somehow flush with `Mem Reduct`. 
Maybe even running `webFrame.clearCache()` every few seconds as well might be a good hack to mitigate RAM usage in an indiscriminate fashion throughout the whole application, but I don't know if it is common practice or what its side-effects are. Here is a comprehensive article on it: [DEBUGGING ELECTRON MEMORY USAGE](https://seenaburns.com/debugging-electron-memory-usage/) **Environment (please complete the following information):** - Application Version: `8.0.4 (64-bit)` - Windows 10 Enterprise, 10.0.19043 Build 19043 - RAM 12GB (5GB free when running the test) - Processor Intel Xeon E3-1220 v3 @ 3.10GHz (4x) - GPU: GTX 650 1GB (200Mb free when running the test) - Display: 4k, DPI 150%
1.0
Memory leak related to the UI (300-850MB+ RAM usage) - **Describe the bug** RAM isn't being flushed. `webFrame.clearCache()` on every screen change might solve it. **Steps to reproduce** Moving around the UI fast and repeatedly makes the RAM usage increase and never goes down to its base level of about 130MB. - About 5 GB of free system RAM when I was running, so, probably any automatic garbage collection would not kick in. - If you switch between the `List` and `Calendar` tab, vigorously repeatedly, by the human hand, it does the most damage. - Also, it got stuck eventually saying "Syncing" on top when I reached 850MB and just gave up on clicking anymore; after that, it became unresponsive. I still had enough free system RAM, and I had to force close it. Perhaps, unrelated to RAM usage? - Just closing the UI into the system tray has no effect to recover the RAM usage. **Expected behavior** The older `(Version 7.5.454)` app doesn't seem to use more than about 100MB under any circumstances (from its base level of about 75MB when it starts up fresh); it tends to recover fast after the usage increases a bit. **Possible Solution:** Running the open source software (available on github) [`Mem Reduct`](https://github.com/henrypp/memreduct) recovered the RAM, of both the old application (down to 30MB from 70MB from base level even), and also of every single Chrome/Electron instance running on my system (Google Chrome & VScode). I believe your application uses Electron, if I'm not mistaken, so it explains the behaviour of the RAM usage. Perhaps there is a way to force an Electron instance to clean up, like how that `Mem Reduct` app does system-wide. From what I found online, using `webFrame.clearCache()` on every screen change on single page apps flushes it out. So, I believe there is hope to easily fix it, since it did somehow flush with `Mem Reduct`. 
Maybe even running `webFrame.clearCache()` every few seconds as well might be a good hack to mitigate RAM usage in an indiscriminate fashion throughout the whole application, but I don't know if it is common practice or what its side-effects are. Here is a comprehensive article on it: [DEBUGGING ELECTRON MEMORY USAGE](https://seenaburns.com/debugging-electron-memory-usage/) **Environment (please complete the following information):** - Application Version: `8.0.4 (64-bit)` - Windows 10 Enterprise, 10.0.19043 Build 19043 - RAM 12GB (5GB free when running the test) - Processor Intel Xeon E3-1220 v3 @ 3.10GHz (4x) - GPU: GTX 650 1GB (200Mb free when running the test) - Display: 4k, DPI 150%
process
memory leak related to the ui ram usage describe the bug ram isn t being flushed webframe clearcache every screen change might solve steps to reproduce moving around the ui fast and repeatedly makes the ram usage increase and never goes down to its base level of about about gb of free system ram when i was running so probably any automatic garbage collection would not kick in if you switch between the list and calendar tab vigorously repeatedly by the human hand does the most damage also it got stuck eventually saying syncing on top when i reached and just gave up on clicking anymore after that it became unresponsive i still had enough free system ram and i had to force close it perhaps unrelated to ram usage just closing the ui into the system tray has no effect to recover the ram usage expected behavior the older version app doesn t seem to use more than about under any circumstances from its base level of about when it starts up fresh it tend to recover fast after the usage a bit increases possible solution running the open source software avaliable on github recovered the ram of both the old application down to from from base level even and also of every single chrome electron instance running on my system google chrome vscode i believe your application uses electron if i m not mistaken so it explains the behaviour of the ram usage perhaps if there is a way to force an electron instance to clean up like how that mem reduct app does system wide from what i found online using webframe clearcache on every screen change on single page apps flushes it out so i believe there is hope to easily fix it since it did somehow flush with mem reduct maybe even running webframe clearcache every few seconds as well might be a good hack to mitigate ram usage in a indiscriminate fashion throughout the whole application but i don t know if it is common practice and its side effects here is a comprehensive article on it environment please complete the following information 
application version bit windows enterprise build ram free when running the test processor intel xeon gpu gtx free when running the test display dpi
1
36,347
7,894,198,096
IssuesEvent
2018-06-28 20:38:24
pyroscope/rtorrent-ps
https://api.github.com/repos/pyroscope/rtorrent-ps
closed
Possible defect in xmlrpc logging or log.xmlrpc re-open
P1 defect in progress
Recently, had fragments of XMLRPC payloads in a file on disk, which hints at memory overwrites ‘somewhere’. ``` ………………ponse> <params> <param><value><i4>0</i4></value></param> </params> </methodResponse> --- ``` The problem is new since a few days. A *recent* related change was https://github.com/pyroscope/pimp-my-box/commit/d7a13a7f257e7cfd437f09ee8ade1d5651865df2. A functional difference is the new code closes (on startup) the still closed xmlrpc log, then opens it later on (in `_rtlocal.rc`).. Still needs further watching and investigation.
1.0
Possible defect in xmlrpc logging or log.xmlrpc re-open - Recently, had fragments of XMLRPC payloads in a file on disk, which hints at memory overwrites ‘somewhere’. ``` ………………ponse> <params> <param><value><i4>0</i4></value></param> </params> </methodResponse> --- ``` The problem is new since a few days. A *recent* related change was https://github.com/pyroscope/pimp-my-box/commit/d7a13a7f257e7cfd437f09ee8ade1d5651865df2. A functional difference is the new code closes (on startup) the still closed xmlrpc log, then opens it later on (in `_rtlocal.rc`).. Still needs further watching and investigation.
non_process
possible defect in xmlrpc logging or log xmlrpc re open recently had fragments of xmlrpc payloads in a file on disk which hints at memory overwrites ‘somewhere’ ………………ponse the problem is new since a few days a recent related change was a functional difference is the new code closes on startup the still closed xmlrpc log then opens it later on in rtlocal rc still needs further watching and investigation
0
7,694
10,780,331,740
IssuesEvent
2019-11-04 12:45:17
prisma/photonjs
https://api.github.com/repos/prisma/photonjs
opened
Implement batching protocol
process/candidate
As described in https://github.com/prisma/specs/issues/242#issuecomment-548382431 This issue only tracks the implementation of the batching protocol both on the query-engine and Photon.js side. The underlying optimization of "merging multiple findOne" queries is a separate issue.
1.0
Implement batching protocol - As described in https://github.com/prisma/specs/issues/242#issuecomment-548382431 This issue only tracks the implementation of the batching protocol both on the query-engine and Photon.js side. The underlying optimization of "merging multiple findOne" queries is a separate issue.
process
implement batching protocol as described in this issue only tracks the implementation of the batching protocol both on the query engine and photon js side the underlying optimization of merging multiple findone queries is a separate issue
1
166,883
20,725,611,305
IssuesEvent
2022-03-14 01:14:25
jgeraigery/FHIR
https://api.github.com/repos/jgeraigery/FHIR
opened
CVE-2021-37713 (High) detected in tar-2.2.2.tgz
security vulnerability
## CVE-2021-37713 - High Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>tar-2.2.2.tgz</b></p></summary> <p>tar for node</p> <p>Library home page: <a href="https://registry.npmjs.org/tar/-/tar-2.2.2.tgz">https://registry.npmjs.org/tar/-/tar-2.2.2.tgz</a></p> <p>Path to dependency file: /docs/package.json</p> <p>Path to vulnerable library: /docs/node_modules/tar/package.json</p> <p> Dependency Hierarchy: - gatsby-theme-carbon-1.24.3.tgz (Root Library) - node-sass-4.14.1.tgz - node-gyp-3.8.0.tgz - :x: **tar-2.2.2.tgz** (Vulnerable Library) <p>Found in base branch: <b>main</b></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary> <p> The npm package "tar" (aka node-tar) before versions 4.4.18, 5.0.10, and 6.1.9 has an arbitrary file creation/overwrite and arbitrary code execution vulnerability. node-tar aims to guarantee that any file whose location would be outside of the extraction target directory is not extracted. This is, in part, accomplished by sanitizing absolute paths of entries within the archive, skipping archive entries that contain `..` path portions, and resolving the sanitized paths against the extraction target directory. This logic was insufficient on Windows systems when extracting tar files that contained a path that was not an absolute path, but specified a drive letter different from the extraction target, such as `C:some\path`. If the drive letter does not match the extraction target, for example `D:\extraction\dir`, then the result of `path.resolve(extractionDirectory, entryPath)` would resolve against the current working directory on the `C:` drive, rather than the extraction target directory. 
Additionally, a `..` portion of the path could occur immediately after the drive letter, such as `C:../foo`, and was not properly sanitized by the logic that checked for `..` within the normalized and split portions of the path. This only affects users of `node-tar` on Windows systems. These issues were addressed in releases 4.4.18, 5.0.10 and 6.1.9. The v3 branch of node-tar has been deprecated and did not receive patches for these issues. If you are still using a v3 release we recommend you update to a more recent version of node-tar. There is no reasonable way to work around this issue without performing the same path normalization procedures that node-tar now does. Users are encouraged to upgrade to the latest patched versions of node-tar, rather than attempt to sanitize paths themselves. <p>Publish Date: 2021-08-31 <p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2021-37713>CVE-2021-37713</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>8.6</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Local - Attack Complexity: Low - Privileges Required: None - User Interaction: Required - Scope: Changed - Impact Metrics: - Confidentiality Impact: High - Integrity Impact: High - Availability Impact: High </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. 
</p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://github.com/npm/node-tar/security/advisories/GHSA-5955-9wpr-37jh">https://github.com/npm/node-tar/security/advisories/GHSA-5955-9wpr-37jh</a></p> <p>Release Date: 2021-08-31</p> <p>Fix Resolution (tar): 4.4.18</p> <p>Direct dependency fix Resolution (gatsby-theme-carbon): 2.0.0-next.3</p> </p> </details> <p></p> *** :rescue_worker_helmet: Automatic Remediation is available for this issue <!-- <REMEDIATE>{"isOpenPROnVulnerability":true,"isPackageBased":true,"isDefaultBranch":true,"packages":[{"packageType":"javascript/Node.js","packageName":"gatsby-theme-carbon","packageVersion":"1.24.3","packageFilePaths":["/docs/package.json"],"isTransitiveDependency":false,"dependencyTree":"gatsby-theme-carbon:1.24.3","isMinimumFixVersionAvailable":true,"minimumFixVersion":"2.0.0-next.3","isBinary":false}],"baseBranches":["main"],"vulnerabilityIdentifier":"CVE-2021-37713","vulnerabilityDetails":"The npm package \"tar\" (aka node-tar) before versions 4.4.18, 5.0.10, and 6.1.9 has an arbitrary file creation/overwrite and arbitrary code execution vulnerability. node-tar aims to guarantee that any file whose location would be outside of the extraction target directory is not extracted. This is, in part, accomplished by sanitizing absolute paths of entries within the archive, skipping archive entries that contain `..` path portions, and resolving the sanitized paths against the extraction target directory. This logic was insufficient on Windows systems when extracting tar files that contained a path that was not an absolute path, but specified a drive letter different from the extraction target, such as `C:some\\path`. 
If the drive letter does not match the extraction target, for example `D:\\extraction\\dir`, then the result of `path.resolve(extractionDirectory, entryPath)` would resolve against the current working directory on the `C:` drive, rather than the extraction target directory. Additionally, a `..` portion of the path could occur immediately after the drive letter, such as `C:../foo`, and was not properly sanitized by the logic that checked for `..` within the normalized and split portions of the path. This only affects users of `node-tar` on Windows systems. These issues were addressed in releases 4.4.18, 5.0.10 and 6.1.9. The v3 branch of node-tar has been deprecated and did not receive patches for these issues. If you are still using a v3 release we recommend you update to a more recent version of node-tar. There is no reasonable way to work around this issue without performing the same path normalization procedures that node-tar now does. Users are encouraged to upgrade to the latest patched versions of node-tar, rather than attempt to sanitize paths themselves.","vulnerabilityUrl":"https://vuln.whitesourcesoftware.com/vulnerability/CVE-2021-37713","cvss3Severity":"high","cvss3Score":"8.6","cvss3Metrics":{"A":"High","AC":"Low","PR":"None","S":"Changed","C":"High","UI":"Required","AV":"Local","I":"High"},"extraData":{}}</REMEDIATE> -->
True
CVE-2021-37713 (High) detected in tar-2.2.2.tgz - ## CVE-2021-37713 - High Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>tar-2.2.2.tgz</b></p></summary> <p>tar for node</p> <p>Library home page: <a href="https://registry.npmjs.org/tar/-/tar-2.2.2.tgz">https://registry.npmjs.org/tar/-/tar-2.2.2.tgz</a></p> <p>Path to dependency file: /docs/package.json</p> <p>Path to vulnerable library: /docs/node_modules/tar/package.json</p> <p> Dependency Hierarchy: - gatsby-theme-carbon-1.24.3.tgz (Root Library) - node-sass-4.14.1.tgz - node-gyp-3.8.0.tgz - :x: **tar-2.2.2.tgz** (Vulnerable Library) <p>Found in base branch: <b>main</b></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary> <p> The npm package "tar" (aka node-tar) before versions 4.4.18, 5.0.10, and 6.1.9 has an arbitrary file creation/overwrite and arbitrary code execution vulnerability. node-tar aims to guarantee that any file whose location would be outside of the extraction target directory is not extracted. This is, in part, accomplished by sanitizing absolute paths of entries within the archive, skipping archive entries that contain `..` path portions, and resolving the sanitized paths against the extraction target directory. This logic was insufficient on Windows systems when extracting tar files that contained a path that was not an absolute path, but specified a drive letter different from the extraction target, such as `C:some\path`. If the drive letter does not match the extraction target, for example `D:\extraction\dir`, then the result of `path.resolve(extractionDirectory, entryPath)` would resolve against the current working directory on the `C:` drive, rather than the extraction target directory. 
Additionally, a `..` portion of the path could occur immediately after the drive letter, such as `C:../foo`, and was not properly sanitized by the logic that checked for `..` within the normalized and split portions of the path. This only affects users of `node-tar` on Windows systems. These issues were addressed in releases 4.4.18, 5.0.10 and 6.1.9. The v3 branch of node-tar has been deprecated and did not receive patches for these issues. If you are still using a v3 release we recommend you update to a more recent version of node-tar. There is no reasonable way to work around this issue without performing the same path normalization procedures that node-tar now does. Users are encouraged to upgrade to the latest patched versions of node-tar, rather than attempt to sanitize paths themselves. <p>Publish Date: 2021-08-31 <p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2021-37713>CVE-2021-37713</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>8.6</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Local - Attack Complexity: Low - Privileges Required: None - User Interaction: Required - Scope: Changed - Impact Metrics: - Confidentiality Impact: High - Integrity Impact: High - Availability Impact: High </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. 
</p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://github.com/npm/node-tar/security/advisories/GHSA-5955-9wpr-37jh">https://github.com/npm/node-tar/security/advisories/GHSA-5955-9wpr-37jh</a></p> <p>Release Date: 2021-08-31</p> <p>Fix Resolution (tar): 4.4.18</p> <p>Direct dependency fix Resolution (gatsby-theme-carbon): 2.0.0-next.3</p> </p> </details> <p></p> *** :rescue_worker_helmet: Automatic Remediation is available for this issue <!-- <REMEDIATE>{"isOpenPROnVulnerability":true,"isPackageBased":true,"isDefaultBranch":true,"packages":[{"packageType":"javascript/Node.js","packageName":"gatsby-theme-carbon","packageVersion":"1.24.3","packageFilePaths":["/docs/package.json"],"isTransitiveDependency":false,"dependencyTree":"gatsby-theme-carbon:1.24.3","isMinimumFixVersionAvailable":true,"minimumFixVersion":"2.0.0-next.3","isBinary":false}],"baseBranches":["main"],"vulnerabilityIdentifier":"CVE-2021-37713","vulnerabilityDetails":"The npm package \"tar\" (aka node-tar) before versions 4.4.18, 5.0.10, and 6.1.9 has an arbitrary file creation/overwrite and arbitrary code execution vulnerability. node-tar aims to guarantee that any file whose location would be outside of the extraction target directory is not extracted. This is, in part, accomplished by sanitizing absolute paths of entries within the archive, skipping archive entries that contain `..` path portions, and resolving the sanitized paths against the extraction target directory. This logic was insufficient on Windows systems when extracting tar files that contained a path that was not an absolute path, but specified a drive letter different from the extraction target, such as `C:some\\path`. 
If the drive letter does not match the extraction target, for example `D:\\extraction\\dir`, then the result of `path.resolve(extractionDirectory, entryPath)` would resolve against the current working directory on the `C:` drive, rather than the extraction target directory. Additionally, a `..` portion of the path could occur immediately after the drive letter, such as `C:../foo`, and was not properly sanitized by the logic that checked for `..` within the normalized and split portions of the path. This only affects users of `node-tar` on Windows systems. These issues were addressed in releases 4.4.18, 5.0.10 and 6.1.9. The v3 branch of node-tar has been deprecated and did not receive patches for these issues. If you are still using a v3 release we recommend you update to a more recent version of node-tar. There is no reasonable way to work around this issue without performing the same path normalization procedures that node-tar now does. Users are encouraged to upgrade to the latest patched versions of node-tar, rather than attempt to sanitize paths themselves.","vulnerabilityUrl":"https://vuln.whitesourcesoftware.com/vulnerability/CVE-2021-37713","cvss3Severity":"high","cvss3Score":"8.6","cvss3Metrics":{"A":"High","AC":"Low","PR":"None","S":"Changed","C":"High","UI":"Required","AV":"Local","I":"High"},"extraData":{}}</REMEDIATE> -->
non_process
cve high detected in tar tgz cve high severity vulnerability vulnerable library tar tgz tar for node library home page a href path to dependency file docs package json path to vulnerable library docs node modules tar package json dependency hierarchy gatsby theme carbon tgz root library node sass tgz node gyp tgz x tar tgz vulnerable library found in base branch main vulnerability details the npm package tar aka node tar before versions and has an arbitrary file creation overwrite and arbitrary code execution vulnerability node tar aims to guarantee that any file whose location would be outside of the extraction target directory is not extracted this is in part accomplished by sanitizing absolute paths of entries within the archive skipping archive entries that contain path portions and resolving the sanitized paths against the extraction target directory this logic was insufficient on windows systems when extracting tar files that contained a path that was not an absolute path but specified a drive letter different from the extraction target such as c some path if the drive letter does not match the extraction target for example d extraction dir then the result of path resolve extractiondirectory entrypath would resolve against the current working directory on the c drive rather than the extraction target directory additionally a portion of the path could occur immediately after the drive letter such as c foo and was not properly sanitized by the logic that checked for within the normalized and split portions of the path this only affects users of node tar on windows systems these issues were addressed in releases and the branch of node tar has been deprecated and did not receive patches for these issues if you are still using a release we recommend you update to a more recent version of node tar there is no reasonable way to work around this issue without performing the same path normalization procedures that node tar now does users are encouraged to upgrade to 
the latest patched versions of node tar rather than attempt to sanitize paths themselves publish date url a href cvss score details base score metrics exploitability metrics attack vector local attack complexity low privileges required none user interaction required scope changed impact metrics confidentiality impact high integrity impact high availability impact high for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution tar direct dependency fix resolution gatsby theme carbon next rescue worker helmet automatic remediation is available for this issue isopenpronvulnerability true ispackagebased true isdefaultbranch true packages istransitivedependency false dependencytree gatsby theme carbon isminimumfixversionavailable true minimumfixversion next isbinary false basebranches vulnerabilityidentifier cve vulnerabilitydetails the npm package tar aka node tar before versions and has an arbitrary file creation overwrite and arbitrary code execution vulnerability node tar aims to guarantee that any file whose location would be outside of the extraction target directory is not extracted this is in part accomplished by sanitizing absolute paths of entries within the archive skipping archive entries that contain path portions and resolving the sanitized paths against the extraction target directory this logic was insufficient on windows systems when extracting tar files that contained a path that was not an absolute path but specified a drive letter different from the extraction target such as c some path if the drive letter does not match the extraction target for example d extraction dir then the result of path resolve extractiondirectory entrypath would resolve against the current working directory on the c drive rather than the extraction target directory additionally a portion of the path could occur immediately after the drive letter such as c foo and was not properly sanitized by the logic that 
checked for within the normalized and split portions of the path this only affects users of node tar on windows systems these issues were addressed in releases and the branch of node tar has been deprecated and did not receive patches for these issues if you are still using a release we recommend you update to a more recent version of node tar there is no reasonable way to work around this issue without performing the same path normalization procedures that node tar now does users are encouraged to upgrade to the latest patched versions of node tar rather than attempt to sanitize paths themselves vulnerabilityurl
0
49,959
13,187,300,122
IssuesEvent
2020-08-13 02:58:36
icecube-trac/tix3
https://api.github.com/repos/icecube-trac/tix3
opened
[docs] Missing `cd ../build` on "Parasitic metaprojects" documentation (Trac #2372)
Incomplete Migration Migrated from Trac cmake defect
<details> <summary><em>Migrated from <a href="https://code.icecube.wisc.edu/ticket/2372">https://code.icecube.wisc.edu/ticket/2372</a>, reported by icecube and owned by dgillcrist</em></summary> <p> ```json { "status": "closed", "changetime": "2020-06-24T12:31:42", "description": "The documentation located at: http://software.icecube.wisc.edu/documentation/projects/cmake/parasite.html, has a missing command in the tutorial. There should be a line `../build` right before `svn co http://code.icecube.wisc.edu/svn/projects/simclasses/trunk ../src/simclasses` and right after `make rebuild_cache`", "reporter": "icecube", "cc": "", "resolution": "fixed", "_ts": "1593001902142004", "component": "cmake", "summary": "[docs] Missing `cd ../build` on \"Parasitic metaprojects\" documentation", "priority": "minor", "keywords": "parasitic, metaproject", "time": "2019-11-06T20:55:29", "milestone": "Autumnal Equinox 2020", "owner": "dgillcrist", "type": "defect" } ``` </p> </details>
1.0
[docs] Missing `cd ../build` on "Parasitic metaprojects" documentation (Trac #2372) - <details> <summary><em>Migrated from <a href="https://code.icecube.wisc.edu/ticket/2372">https://code.icecube.wisc.edu/ticket/2372</a>, reported by icecube and owned by dgillcrist</em></summary> <p> ```json { "status": "closed", "changetime": "2020-06-24T12:31:42", "description": "The documentation located at: http://software.icecube.wisc.edu/documentation/projects/cmake/parasite.html, has a missing command in the tutorial. There should be a line `../build` right before `svn co http://code.icecube.wisc.edu/svn/projects/simclasses/trunk ../src/simclasses` and right after `make rebuild_cache`", "reporter": "icecube", "cc": "", "resolution": "fixed", "_ts": "1593001902142004", "component": "cmake", "summary": "[docs] Missing `cd ../build` on \"Parasitic metaprojects\" documentation", "priority": "minor", "keywords": "parasitic, metaproject", "time": "2019-11-06T20:55:29", "milestone": "Autumnal Equinox 2020", "owner": "dgillcrist", "type": "defect" } ``` </p> </details>
non_process
missing cd build on parasitic metaprojects documentation trac migrated from json status closed changetime description the documentation located at has a missing command in the tutorial there should be a line build right before svn co src simclasses and right after make rebuild cache reporter icecube cc resolution fixed ts component cmake summary missing cd build on parasitic metaprojects documentation priority minor keywords parasitic metaproject time milestone autumnal equinox owner dgillcrist type defect
0
569,246
17,009,313,495
IssuesEvent
2021-07-02 00:07:49
waldronlab/BugSigDB
https://api.github.com/repos/waldronlab/BugSigDB
closed
Reproducible installation of bugsigdb.org
major priority
Want a containerized installation or instructions for otherwise easily reproducible installation of the full bugsigdb.org.
1.0
Reproducible installation of bugsigdb.org - Want a containerized installation or instructions for otherwise easily reproducible installation of the full bugsigdb.org.
non_process
reproducible installation of bugsigdb org want a containerized installation or instructions for otherwise easily reproducible installation of the full bugsigdb org
0
442,779
30,856,216,876
IssuesEvent
2023-08-02 20:51:25
gravitational/teleport
https://api.github.com/repos/gravitational/teleport
opened
Add a prominent link to the Teleport labs within the docs
documentation
## Applies To The `/docs/` landing page, probably. ## Details This is a request from Developer Relations. Add a link to the [Teleport labs](https://goteleport.com/labs/) from the docs so readers can try Teleport without spinning up their own environment. ## How will we know this is resolved? There is a link to the Teleport Labs page from the docs. ## Related Issues <!-- Please make an effort to find related issues on GitHub and list them here. This makes a big difference in how we prioritize and schedule work. -->
1.0
Add a prominent link to the Teleport labs within the docs - ## Applies To The `/docs/` landing page, probably. ## Details This is a request from Developer Relations. Add a link to the [Teleport labs](https://goteleport.com/labs/) from the docs so readers can try Teleport without spinning up their own environment. ## How will we know this is resolved? There is a link to the Teleport Labs page from the docs. ## Related Issues <!-- Please make an effort to find related issues on GitHub and list them here. This makes a big difference in how we prioritize and schedule work. -->
non_process
add a prominent link to the teleport labs within the docs applies to the docs landing page probably details this is a request from developer relations add a link to the from the docs so readers can try teleport without spinning up their own environment how will we know this is resolved there is a link to the teleport labs page from the docs related issues
0
145,018
5,557,157,151
IssuesEvent
2017-03-24 11:11:30
kubernetes-incubator/kompose
https://api.github.com/repos/kubernetes-incubator/kompose
closed
Improving `down` to handle Volumes
priority/P1
In 7349dc9 I'm proposing adding --emptyvols to the up/down operations. This raises a bigger question on whether we should enable passing the other **convert** options so that they can be used with up/down - this can be useful, but maybe not something we want to do.
1.0
Improving `down` to handle Volumes - In 7349dc9 I'm proposing adding --emptyvols to the up/down operations. This raises a bigger question on whether we should enable passing the other **convert** options so that they can be used with up/down - this can be useful, but maybe not something we want to do.
non_process
improving down to handle volumes in i m proposing adding emptyvols to the up down operations this raises a bigger question on whether we should enable passing the other convert options so that they can be used with up down this can be useful but maybe not something we want to do
0
15,485
19,694,693,566
IssuesEvent
2022-01-12 10:52:01
googleapis/python-bigtable
https://api.github.com/repos/googleapis/python-bigtable
closed
Fix samples CI runs under Python 3.9 / 3.10.
api: bigtable type: process samples
From PR #476, see [3.9](https://source.cloud.google.com/results/invocations/09b02d64-075e-4dd6-a81f-794b27bbc967) and [3.10](https://source.cloud.google.com/results/invocations/04364ec0-9ef4-42d7-affc-745cf7d7481b) samples run failures. The generated `noxfile.py` files in `samples/` don't have those sessions. I believe the correct fix is to add `noxfile_config.py` to each sample directory, overriding the generated set of Python versions to run.
1.0
Fix samples CI runs under Python 3.9 / 3.10. - From PR #476, see [3.9](https://source.cloud.google.com/results/invocations/09b02d64-075e-4dd6-a81f-794b27bbc967) and [3.10](https://source.cloud.google.com/results/invocations/04364ec0-9ef4-42d7-affc-745cf7d7481b) samples run failures. The generated `noxfile.py` files in `samples/` don't have those sessions. I believe the correct fix is to add `noxfile_config.py` to each sample directory, overriding the generated set of Python versions to run.
process
fix samples ci runs under python from pr see and samples run failures the generated noxfile py files in samples don t have those sessions i believe the correct fix is to add noxfile config py to each sample directory overriding the generated set of python versions to run
1
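The `noxfile_config.py` override the Bigtable samples issue above suggests typically looks like the sketch below. The key names follow the shared googleapis samples tooling, but the specific values here are illustrative assumptions, not the actual fix that landed.

```python
# Hypothetical samples/<dir>/noxfile_config.py: the generated noxfile in
# googleapis sample folders merges this dict over its defaults, which is the
# override mechanism the issue proposes for controlling Python versions.
TEST_CONFIG_OVERRIDE = {
    # Interpreters to skip; adding or removing a version here changes which
    # nox sessions the generated noxfile creates for this sample directory.
    "ignored_versions": ["2.7"],
    # Use the environment variable already configured for sample tests.
    "gcloud_project_env": "GOOGLE_CLOUD_PROJECT",
    # Extra environment variables to inject into the test session, if any.
    "envs": {},
}
```

Dropping a file like this into each sample directory lets the per-directory config win over the generated `noxfile.py` without hand-editing generated code.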
379,803
11,235,843,623
IssuesEvent
2020-01-09 09:15:28
webcompat/web-bugs
https://api.github.com/repos/webcompat/web-bugs
closed
thetechnicaltraders.us6.list-manage.com - site is not usable
browser-firefox engine-gecko priority-important
<!-- @browser: Firefox 72.0.1 --> <!-- @ua_header: Mozilla/5.0 (Macintosh; Intel Mac OS X 10.13; rv:71.0) Gecko/20100101 Firefox/71.0 --> <!-- @reported_with: --> **URL**: https://thetechnicaltraders.us6.list-manage.com/track/click?u=333a7c6a2dc511d85228db7b6&id=25afa799ed&e=390f881618 **Browser / Version**: Firefox 72.0.1 **Operating System**: Mac OS X 10.13 **Tested Another Browser**: Yes **Problem type**: Site is not usable **Description**: The video does not load. After I login, firefox opens a blank page and displays waiting for technical trader. **Steps to Reproduce**: Login as normal to Technical Trader daily page. It is a new url each day. Checked with Technical Trader and their server was not overloaded at the time I logged in. Safari can easily load the page. The page thought I was using Firefox 71 but I am on 72.0.1 <details> <summary>Browser Configuration</summary> <ul> <li>None</li> </ul> </details> _From [webcompat.com](https://webcompat.com/) with ❤️_
1.0
thetechnicaltraders.us6.list-manage.com - site is not usable - <!-- @browser: Firefox 72.0.1 --> <!-- @ua_header: Mozilla/5.0 (Macintosh; Intel Mac OS X 10.13; rv:71.0) Gecko/20100101 Firefox/71.0 --> <!-- @reported_with: --> **URL**: https://thetechnicaltraders.us6.list-manage.com/track/click?u=333a7c6a2dc511d85228db7b6&id=25afa799ed&e=390f881618 **Browser / Version**: Firefox 72.0.1 **Operating System**: Mac OS X 10.13 **Tested Another Browser**: Yes **Problem type**: Site is not usable **Description**: The video does not load. After I login, firefox opens a blank page and displays waiting for technical trader. **Steps to Reproduce**: Login as normal to Technical Trader daily page. It is a new url each day. Checked with Technical Trader and their server was not overloaded at the time I logged in. Safari can easily load the page. The page thought I was using Firefox 71 but I am on 72.0.1 <details> <summary>Browser Configuration</summary> <ul> <li>None</li> </ul> </details> _From [webcompat.com](https://webcompat.com/) with ❤️_
non_process
thetechnicaltraders list manage com site is not usable url browser version firefox operating system mac os x tested another browser yes problem type site is not usable description the video does not load after i login firefox opens a blank page and displays waiting for technical trader steps to reproduce login as normal to technical trader daily page it is a new url each day checked with technical trader and their server was not overloaded at the time i logged in safari can easily load the page the page thought i was using firefox but i am on browser configuration none from with ❤️
0
8,889
11,985,070,741
IssuesEvent
2020-04-07 16:51:54
kubeflow/code-intelligence
https://api.github.com/repos/kubeflow/code-intelligence
opened
[label bot] code duplication among notebooks
area/labelbot kind/feature kind/process priority/p2
I'm noticing a lot of code duplication between the various notebooks, which makes it hard to identify which notebook to use. This is probably tech debt as a result of us creating new copies of code rather than refactoring and reusing. We should try to clean this up. 1. Code shared between notebooks should be moved into reusable functions, classes, or modules in the py directory 1. notebooks should call the reusable functions 1. notebooks should clearly explain what they are doing so it's obvious how different notebooks compare. As an example the following two notebooks both seem to be fetching GitHub issues and computing embeddings * [issue_loader.ipynb](https://github.com/kubeflow/code-intelligence/blob/master/Label_Microservice/notebooks/issues_loader.ipynb) * [Get-GitHub-Issues.ipynb](https://github.com/kubeflow/code-intelligence/blob/master/Issue_Embeddings/notebooks/Get-GitHub-Issues.ipynb) * The former appears to be using the functions in [embeddings.py](https://github.com/kubeflow/code-intelligence/blob/master/py/code_intelligence/embeddings.py) * It looks like the latter is still defining those same functions inside the notebook
1.0
[label bot] code duplication among notebooks - I'm noticing a lot of code duplication between the various notebooks, which makes it hard to identify which notebook to use. This is probably tech debt as a result of us creating new copies of code rather than refactoring and reusing. We should try to clean this up. 1. Code shared between notebooks should be moved into reusable functions, classes, or modules in the py directory 1. notebooks should call the reusable functions 1. notebooks should clearly explain what they are doing so it's obvious how different notebooks compare. As an example the following two notebooks both seem to be fetching GitHub issues and computing embeddings * [issue_loader.ipynb](https://github.com/kubeflow/code-intelligence/blob/master/Label_Microservice/notebooks/issues_loader.ipynb) * [Get-GitHub-Issues.ipynb](https://github.com/kubeflow/code-intelligence/blob/master/Issue_Embeddings/notebooks/Get-GitHub-Issues.ipynb) * The former appears to be using the functions in [embeddings.py](https://github.com/kubeflow/code-intelligence/blob/master/py/code_intelligence/embeddings.py) * It looks like the latter is still defining those same functions inside the notebook
process
code duplication among notebooks i m noticing a lot of code duplication between the various notebooks which makes it hard to identify which notebook to use this is probably tech debt as a result of us creating new copies of code rather than refactoring and reusing we should try to clean this up code shared between notebooks should be moved into reusable functions classes or modules in the py directory notebooks should call the reusable functions notebooks should clearly explain what they are doing so its obvious how different notebooks compare as an example the following two notebooks both seem to be fetching github issues and computing embeddings the former appears to be using the functions in it looks like the latter is still defining those same functions inside the notebook
1
30,580
24,940,236,660
IssuesEvent
2022-10-31 18:17:45
dotnet/aspnetcore
https://api.github.com/repos/dotnet/aspnetcore
closed
Microsoft Dev Box integration with dotnet/aspnetcore
area-infrastructure feature-infrastructure-improvements
The [Microsoft Dev Box](https://techcommunity.microsoft.com/t5/azure-developer-community-blog/introducing-microsoft-dev-box/ba-p/3412063) was recently announced. > Dev teams preconfigure Dev Boxes for specific projects and tasks, enabling devs to get started quickly with an environment that’s ready to build and run their app in minutes. This issue tracks setting up the `dotnet/aspnetcore` repo within a [devbox environment](https://devbox.microsoft.com/), potentially with a rolling nightly build.
2.0
Microsoft Dev Box integration with dotnet/aspnetcore - The [Microsoft Dev Box](https://techcommunity.microsoft.com/t5/azure-developer-community-blog/introducing-microsoft-dev-box/ba-p/3412063) was recently announced. > Dev teams preconfigure Dev Boxes for specific projects and tasks, enabling devs to get started quickly with an environment that’s ready to build and run their app in minutes. This issue tracks setting up the `dotnet/aspnetcore` repo within a [devbox environment](https://devbox.microsoft.com/), potentially with a rolling nightly build.
non_process
microsoft dev box integration with dotnet aspnetcore the was recently announced dev teams preconfigure dev boxes for specific projects and tasks enabling devs to get started quickly with an environment that’s ready to build and run their app in minutes this issue tracks setting up the dotnet aspnetcore repo within a potentially with a rolling nightly build
0
14,671
17,788,802,478
IssuesEvent
2021-08-31 14:05:58
Leviatan-Analytics/LA-data-processing
https://api.github.com/repos/Leviatan-Analytics/LA-data-processing
closed
Document new ways to add value with the project [3]
Data Processing Week 4 Sprint 3
Add to the research document all the different feature ideas that can be useful to the end users.
1.0
Document new ways to add value with the project [3] - Add to the research document all the different feature ideas that can be useful to the end users.
process
document new ways to add value with the project add to research document all the different feature ideas that can be useful to the end users
1
34,988
4,958,787,131
IssuesEvent
2016-12-02 10:58:22
cockroachdb/cockroach
https://api.github.com/repos/cockroachdb/cockroach
closed
github.com/cockroachdb/cockroach/pkg/security: TestUseCerts failed under stress
Robot test-failure
SHA: https://github.com/cockroachdb/cockroach/commits/318358c6b5af6ef5769fbcf5ed7903b8c35eea13 Parameters: ``` COCKROACH_PROPOSER_EVALUATED_KV=true TAGS=stress GOFLAGS= ``` Stress build found a failed test: https://teamcity.cockroachdb.com/viewLog.html?buildId=73959&tab=buildLog ``` I161202 07:40:48.552599 21 gossip/gossip.go:248 [n?] initial resolvers: [] W161202 07:40:48.552627 21 gossip/gossip.go:1124 [n?] no resolvers found; use --join to specify a connected node W161202 07:40:48.555622 21 server/status/runtime.go:116 Could not parse build timestamp: parsing time "" as "2006/01/02 15:04:05": cannot parse "" as "2006" I161202 07:40:48.555835 21 storage/engine/rocksdb.go:340 opening in memory rocksdb instance I161202 07:40:48.563527 21 server/config.go:443 1 storage engine initialized I161202 07:40:48.563857 21 server/node.go:419 [n?] store [n0,s0] not bootstrapped W161202 07:40:48.352088 66 util/hlc/hlc.go:145 backward time jump detected (-0.218985 seconds) I161202 07:40:48.352196 66 storage/replica_proposal.go:348 [s1,r1/1:/M{in-ax},@c4206ae900] new range lease replica {1 1 1} 1970-01-01 00:00:00 +0000 UTC 411295h40m57.570748182s following replica {0 0 0} 1970-01-01 00:00:00 +0000 UTC 0s [physicalTime=2016-12-02 07:40:48.352149731 +0000 UTC] I161202 07:40:48.352856 21 util/stop/stopper.go:396 stop has been called, stopping or quiescing all running tasks test_server_shim.go:120: expected to initialize store id allocator to 1, got 0: unable to allocate 1 store IDs for node 1: storage/store.go:2329: rejecting command with timestamp in the future: 1480664448571060059 (218.289474ms ahead) ```
1.0
github.com/cockroachdb/cockroach/pkg/security: TestUseCerts failed under stress - SHA: https://github.com/cockroachdb/cockroach/commits/318358c6b5af6ef5769fbcf5ed7903b8c35eea13 Parameters: ``` COCKROACH_PROPOSER_EVALUATED_KV=true TAGS=stress GOFLAGS= ``` Stress build found a failed test: https://teamcity.cockroachdb.com/viewLog.html?buildId=73959&tab=buildLog ``` I161202 07:40:48.552599 21 gossip/gossip.go:248 [n?] initial resolvers: [] W161202 07:40:48.552627 21 gossip/gossip.go:1124 [n?] no resolvers found; use --join to specify a connected node W161202 07:40:48.555622 21 server/status/runtime.go:116 Could not parse build timestamp: parsing time "" as "2006/01/02 15:04:05": cannot parse "" as "2006" I161202 07:40:48.555835 21 storage/engine/rocksdb.go:340 opening in memory rocksdb instance I161202 07:40:48.563527 21 server/config.go:443 1 storage engine initialized I161202 07:40:48.563857 21 server/node.go:419 [n?] store [n0,s0] not bootstrapped W161202 07:40:48.352088 66 util/hlc/hlc.go:145 backward time jump detected (-0.218985 seconds) I161202 07:40:48.352196 66 storage/replica_proposal.go:348 [s1,r1/1:/M{in-ax},@c4206ae900] new range lease replica {1 1 1} 1970-01-01 00:00:00 +0000 UTC 411295h40m57.570748182s following replica {0 0 0} 1970-01-01 00:00:00 +0000 UTC 0s [physicalTime=2016-12-02 07:40:48.352149731 +0000 UTC] I161202 07:40:48.352856 21 util/stop/stopper.go:396 stop has been called, stopping or quiescing all running tasks test_server_shim.go:120: expected to initialize store id allocator to 1, got 0: unable to allocate 1 store IDs for node 1: storage/store.go:2329: rejecting command with timestamp in the future: 1480664448571060059 (218.289474ms ahead) ```
non_process
github com cockroachdb cockroach pkg security testusecerts failed under stress sha parameters cockroach proposer evaluated kv true tags stress goflags stress build found a failed test gossip gossip go initial resolvers gossip gossip go no resolvers found use join to specify a connected node server status runtime go could not parse build timestamp parsing time as cannot parse as storage engine rocksdb go opening in memory rocksdb instance server config go storage engine initialized server node go store not bootstrapped util hlc hlc go backward time jump detected seconds storage replica proposal go new range lease replica utc following replica utc util stop stopper go stop has been called stopping or quiescing all running tasks test server shim go expected to initialize store id allocator to got unable to allocate store ids for node storage store go rejecting command with timestamp in the future ahead
0
10,726
9,081,625,256
IssuesEvent
2019-02-17 03:28:50
AlwaysBeCrafting/Mysticell
https://api.github.com/repos/AlwaysBeCrafting/Mysticell
opened
Migrate redux to data hooks
enhancement service-web
## Prerequisites - #10 Migrate to React v16.8 / React Hooks ## Summary Mysticell is a good place to demo and iterate on the data hooks pattern (see [`useStore()` and Data Hooks](https://gist.github.com/Alexander-Prime/7b7f2ef07fb3fb3587672bb775ac3140#usestore-and-data-hooks)). Currently we use `redux`, along with `react-redux`'s cumbersome `connect()` function to inject store data into component props. Data hooks represent a cleaner way to subscribe to this data without the need for a higher-order component. ## Objectives - [ ] Implement a `Store` component to manage application state and reducers and accept actions (`React.useReducer`) - [ ] Remove Redux dependencies (if any) in the existing reducers, action creators, etc (This will probably disrupt our HTTP client functionality, as it depends on `redux-observable`. May need to reimplement this over data hooks) - Implement data hooks for all existing entities - [ ] `Cell` - [ ] `Directory` - [ ] `Document` - [ ] `Node` - [ ] `Primitive` - [ ] `Sheet` - [ ] `Source` - [ ] `Wire` - [ ] Replace connected component props with data hooks - [ ] Remove `Connected...` components
1.0
Migrate redux to data hooks - ## Prerequisites - #10 Migrate to React v16.8 / React Hooks ## Summary Mysticell is a good place to demo and iterate on the data hooks pattern (see [`useStore()` and Data Hooks](https://gist.github.com/Alexander-Prime/7b7f2ef07fb3fb3587672bb775ac3140#usestore-and-data-hooks)). Currently we use `redux`, along with `react-redux`'s cumbersome `connect()` function to inject store data into component props. Data hooks represent a cleaner way to subscribe to this data without the need for a higher-order component. ## Objectives - [ ] Implement a `Store` component to manage application state and reducers and accept actions (`React.useReducer`) - [ ] Remove Redux dependencies (if any) in the existing reducers, action creators, etc (This will probably disrupt our HTTP client functionality, as it depends on `redux-observable`. May need to reimplement this over data hooks) - Implement data hooks for all existing entities - [ ] `Cell` - [ ] `Directory` - [ ] `Document` - [ ] `Node` - [ ] `Primitive` - [ ] `Sheet` - [ ] `Source` - [ ] `Wire` - [ ] Replace connected component props with data hooks - [ ] Remove `Connected...` components
non_process
migrate redux to data hooks prerequisites migrate to react react hooks summary mysticell is a good place to demo and iterate on the data hooks pattern see currently we use redux along with react redux s cumbersome connect function to inject store data into component props data hooks represent a cleaner way to subscribe to this data without the need for a higher order component objectives implement a store component to manage application state and reducers and accept actions react usereducer remove redux dependencies if any in the existing reducers action creators etc this will probably disrupt our http client functionality as it depends on redux observable may need to reimplement this over data hooks implement data hooks for all existing entities cell directory document node primitive sheet source wire replace connected component props with data hooks remove connected components
0
338,822
10,237,920,661
IssuesEvent
2019-08-19 14:50:21
adamcaudill/yawast
https://api.github.com/repos/adamcaudill/yawast
opened
Connection error in check_local_ip_disclosure
bug high-priority
User reported issue: ``` Completed (Elapsed: 6:20:51.327802 - Peak Memory: 3.14GB) Traceback (most recent call last): File "/usr/local/bin/yawast", line 42, in <module> main.main() File "/usr/local/lib/python3.7/site-packages/yawast/main.py", line 76, in main args.func(args, urls) File "/usr/local/lib/python3.7/site-packages/yawast/command_line.py", line 156, in command_scan scan.start(session) File "/usr/local/lib/python3.7/site-packages/yawast/commands/scan.py", line 62, in start http.scan(session) File "/usr/local/lib/python3.7/site-packages/yawast/scanner/cli/http.py", line 99, in scan res = http_basic.check_local_ip_disclosure(session) File "/usr/local/lib/python3.7/site-packages/yawast/scanner/plugins/http/http_basic.py", line 343, in check_local_ip_disclosure conn.connect() File "/usr/local/lib/python3.7/site-packages/sslyze/utils/ssl_connection.py", line 96, in connect self.do_pre_handshake(network_timeout) File "/usr/local/lib/python3.7/site-packages/sslyze/utils/ssl_connection.py", line 80, in do_pre_handshake sock = self._socket_helper.create_connection(final_timeout) File "/usr/local/lib/python3.7/site-packages/sslyze/utils/connection_helpers.py", line 36, in create_connection sock = socket.create_connection((self._server_ip_addr, self._server_port), timeout=timeout) File "/usr/local/Cellar/python/3.7.4/Frameworks/Python.framework/Versions/3.7/lib/python3.7/socket.py", line 727, in create_connection raise err File "/usr/local/Cellar/python/3.7.4/Frameworks/Python.framework/Versions/3.7/lib/python3.7/socket.py", line 716, in create_connection sock.connect(sa) ConnectionRefusedError: [Errno 61] Connection refused ```
1.0
Connection error in check_local_ip_disclosure - User reported issue: ``` Completed (Elapsed: 6:20:51.327802 - Peak Memory: 3.14GB) Traceback (most recent call last): File "/usr/local/bin/yawast", line 42, in <module> main.main() File "/usr/local/lib/python3.7/site-packages/yawast/main.py", line 76, in main args.func(args, urls) File "/usr/local/lib/python3.7/site-packages/yawast/command_line.py", line 156, in command_scan scan.start(session) File "/usr/local/lib/python3.7/site-packages/yawast/commands/scan.py", line 62, in start http.scan(session) File "/usr/local/lib/python3.7/site-packages/yawast/scanner/cli/http.py", line 99, in scan res = http_basic.check_local_ip_disclosure(session) File "/usr/local/lib/python3.7/site-packages/yawast/scanner/plugins/http/http_basic.py", line 343, in check_local_ip_disclosure conn.connect() File "/usr/local/lib/python3.7/site-packages/sslyze/utils/ssl_connection.py", line 96, in connect self.do_pre_handshake(network_timeout) File "/usr/local/lib/python3.7/site-packages/sslyze/utils/ssl_connection.py", line 80, in do_pre_handshake sock = self._socket_helper.create_connection(final_timeout) File "/usr/local/lib/python3.7/site-packages/sslyze/utils/connection_helpers.py", line 36, in create_connection sock = socket.create_connection((self._server_ip_addr, self._server_port), timeout=timeout) File "/usr/local/Cellar/python/3.7.4/Frameworks/Python.framework/Versions/3.7/lib/python3.7/socket.py", line 727, in create_connection raise err File "/usr/local/Cellar/python/3.7.4/Frameworks/Python.framework/Versions/3.7/lib/python3.7/socket.py", line 716, in create_connection sock.connect(sa) ConnectionRefusedError: [Errno 61] Connection refused ```
non_process
connection error in check local ip disclosure user reported issue completed elapsed peak memory traceback most recent call last file usr local bin yawast line in main main file usr local lib site packages yawast main py line in main args func args urls file usr local lib site packages yawast command line py line in command scan scan start session file usr local lib site packages yawast commands scan py line in start http scan session file usr local lib site packages yawast scanner cli http py line in scan res http basic check local ip disclosure session file usr local lib site packages yawast scanner plugins http http basic py line in check local ip disclosure conn connect file usr local lib site packages sslyze utils ssl connection py line in connect self do pre handshake network timeout file usr local lib site packages sslyze utils ssl connection py line in do pre handshake sock self socket helper create connection final timeout file usr local lib site packages sslyze utils connection helpers py line in create connection sock socket create connection self server ip addr self server port timeout timeout file usr local cellar python frameworks python framework versions lib socket py line in create connection raise err file usr local cellar python frameworks python framework versions lib socket py line in create connection sock connect sa connectionrefusederror connection refused
0
21,427
29,359,594,056
IssuesEvent
2023-05-28 00:37:18
devssa/onde-codar-em-salvador
https://api.github.com/repos/devssa/onde-codar-em-salvador
closed
[QA Automation] [NODE.JS] QA Automation na @YellowIpe
TESTE PJ JAVASCRIPT SCRUM NODE.JS REQUISITOS SELENIUM REMOTO PROCESSOS GITHUB KANBAN UMA C QUALIDADE ESPRESSO PADRÕ QA TESTES AUTOMATIZADOS METODOLOGIAS ÁGEIS HELP WANTED ESPECIALISTA UI XCTEST AUTOMAÇÃO DE TESTES Stale
<!-- IMPORTANTE: 1. VOCÊ PODE SER BLOQUEADO de divulgar vaga se realizar ações que geram spam como: 1.1. Abrir diversas issues para a mesma vaga em um curto período de tempo. 1.2. Fechar qualquer issue. Se alguma issue aberta por você é referente a uma vaga que não está mais aberta, mude o título da issue para [VAGA FECHADA]. NÃO feche a issue ou será bloqueado. Para saber mais leia: https://github.com/qa-brasil/vagas#%EF%B8%8F-importante---leia-para-n%C3%A3o-ser-bloqueado. 2. Só poste vaga de QA. 3. Utilize o template abaixo como guia, sem apagar as seções e apenas alterando as informações de dentro delas. 4. Não faça distinção de gênero. 5. Respeite o código de conduta --> ## Descrição da vaga A YellowIpe está recrutando Engenheiro de Automação de Testes. você garantirá que os altos padrões sejam mantidos durante todo o processo de desenvolvimento e nas entregas do produto. Você também terá um papel importante no planejamento e desenvolvimento de conjuntos de testes automatizados. ## Local de trabalho <!-- escolha apenas 1 das linhas abaixo, ajuste para o seu contexto e exclua as outras --> - 100% Remoto - (Necessário Mudança Posterior para Portugal) Cuidamos na Emissão do visto e ajudamos com o processo de imigração ✅ ## Requisitos da vaga **Obrigatórios** - Metodologias ágeis como Scrum e Kanban - Liderando processos de garantia de qualidade e testes funcionais - Trabalhar com engenheiros de software para garantir que os níveis certos de automação de teste sejam criados de acordo com seus padrões - Testando aplicativos móveis e da web - Usando sistemas de controle de origem - Focado em Automação. 
- Muito importante ter experiência em Javascript ou Node.js (mínimo 3 anos de experiência) - Ser autodirigido e trabalhar com pouca supervisão em direção a uma equipe ou objetivo comum da empresa - Estar motivado para continuar aprendendo sobre automação de teste **Desejáveis** - Criação de conjuntos de testes automatizados resilientes para exercitar aplicativos Web e móveis com ferramentas como Selenium, Espresso, UI Automator, KIF ou XCTest ## Forma de contratação PJ (Recibos Verdes) ou CLT (CTI) ## Benefícios da empresa - Seguro de vida e de saúde; - Plano de Carreira crescente assente em meritocracia; - Formação certificada. ## Range Salarial A Definir mediante Senioridadel ## Sobre a empresa A YellowIpe é uma consultoria da área de tecnologia presente no Brasil e em Portugal que pretende reforçar a sua equipe com QA Automations. Trabalhamos com empresas nacionais e multinacionais em diversos setores de atividade que confiam em nós e no diferencial dos nossos profissionais. ## Como se candidatar - Por favor envie um email para recrutamento@yellowipe.io com seu CV anexado - Enviar no assunto: Vaga do QA-Automation ## Tempo médio de feedbacks Costumamos enviar feedbacks em até 7 dias após cada processo. E-mail para contato em caso de não haver resposta: recrutamento@yellowipe.io ## Labels <!-- Label é uma forma dos QAs conseguirem pesquisar as vagas. Após cadastrar a vaga, ela receberá label automaticamente em até 2 minutos. Apague os itens abaixo que não corresponderem à vaga. --> - CLT - PJ - Sênior - Especialista <!-- Se a vaga for mista (X dias presencial e Y dias remoto) deixe as 2 linhas abaixo --> - Remoto <!--- Informar abaixo o nome do estado por extenso se for do Brasil. ---> <!--- Se sua vaga não for do Brasil, apague a linha de estado acima e deixe a de baixo ---> - Exterior ## Recrutador: NÃO feche essa issue caso a vaga não exista mais ou poderá ser bloqueado. 
Leia https://github.com/qa-brasil/vagas#%EF%B8%8F-importante---leia-para-n%C3%A3o-ser-bloqueado para saber o que fazer.
1.0
[QA Automation] [NODE.JS] QA Automation na @YellowIpe - <!-- IMPORTANTE: 1. VOCÊ PODE SER BLOQUEADO de divulgar vaga se realizar ações que geram spam como: 1.1. Abrir diversas issues para a mesma vaga em um curto período de tempo. 1.2. Fechar qualquer issue. Se alguma issue aberta por você é referente a uma vaga que não está mais aberta, mude o título da issue para [VAGA FECHADA]. NÃO feche a issue ou será bloqueado. Para saber mais leia: https://github.com/qa-brasil/vagas#%EF%B8%8F-importante---leia-para-n%C3%A3o-ser-bloqueado. 2. Só poste vaga de QA. 3. Utilize o template abaixo como guia, sem apagar as seções e apenas alterando as informações de dentro delas. 4. Não faça distinção de gênero. 5. Respeite o código de conduta --> ## Descrição da vaga A YellowIpe está recrutando Engenheiro de Automação de Testes. você garantirá que os altos padrões sejam mantidos durante todo o processo de desenvolvimento e nas entregas do produto. Você também terá um papel importante no planejamento e desenvolvimento de conjuntos de testes automatizados. ## Local de trabalho <!-- escolha apenas 1 das linhas abaixo, ajuste para o seu contexto e exclua as outras --> - 100% Remoto - (Necessário Mudança Posterior para Portugal) Cuidamos na Emissão do visto e ajudamos com o processo de imigração ✅ ## Requisitos da vaga **Obrigatórios** - Metodologias ágeis como Scrum e Kanban - Liderando processos de garantia de qualidade e testes funcionais - Trabalhar com engenheiros de software para garantir que os níveis certos de automação de teste sejam criados de acordo com seus padrões - Testando aplicativos móveis e da web - Usando sistemas de controle de origem - Focado em Automação. 
- Muito importante ter experiência em Javascript ou Node.js (mínimo 3 anos de experiência) - Ser autodirigido e trabalhar com pouca supervisão em direção a uma equipe ou objetivo comum da empresa - Estar motivado para continuar aprendendo sobre automação de teste **Desejáveis** - Criação de conjuntos de testes automatizados resilientes para exercitar aplicativos Web e móveis com ferramentas como Selenium, Espresso, UI Automator, KIF ou XCTest ## Forma de contratação PJ (Recibos Verdes) ou CLT (CTI) ## Benefícios da empresa - Seguro de vida e de saúde; - Plano de Carreira crescente assente em meritocracia; - Formação certificada. ## Range Salarial A Definir mediante Senioridadel ## Sobre a empresa A YellowIpe é uma consultoria da área de tecnologia presente no Brasil e em Portugal que pretende reforçar a sua equipe com QA Automations. Trabalhamos com empresas nacionais e multinacionais em diversos setores de atividade que confiam em nós e no diferencial dos nossos profissionais. ## Como se candidatar - Por favor envie um email para recrutamento@yellowipe.io com seu CV anexado - Enviar no assunto: Vaga do QA-Automation ## Tempo médio de feedbacks Costumamos enviar feedbacks em até 7 dias após cada processo. E-mail para contato em caso de não haver resposta: recrutamento@yellowipe.io ## Labels <!-- Label é uma forma dos QAs conseguirem pesquisar as vagas. Após cadastrar a vaga, ela receberá label automaticamente em até 2 minutos. Apague os itens abaixo que não corresponderem à vaga. --> - CLT - PJ - Sênior - Especialista <!-- Se a vaga for mista (X dias presencial e Y dias remoto) deixe as 2 linhas abaixo --> - Remoto <!--- Informar abaixo o nome do estado por extenso se for do Brasil. ---> <!--- Se sua vaga não for do Brasil, apague a linha de estado acima e deixe a de baixo ---> - Exterior ## Recrutador: NÃO feche essa issue caso a vaga não exista mais ou poderá ser bloqueado. 
Leia https://github.com/qa-brasil/vagas#%EF%B8%8F-importante---leia-para-n%C3%A3o-ser-bloqueado para saber o que fazer.
process
qa automation na yellowipe importante você pode ser bloqueado de divulgar vaga se realizar ações que geram spam como abrir diversas issues para a mesma vaga em um curto período de tempo fechar qualquer issue se alguma issue aberta por você é referente a uma vaga que não está mais aberta mude o título da issue para não feche a issue ou será bloqueado para saber mais leia só poste vaga de qa utilize o template abaixo como guia sem apagar as seções e apenas alterando as informações de dentro delas não faça distinção de gênero respeite o código de conduta descrição da vaga a yellowipe está recrutando engenheiro de automação de testes você garantirá que os altos padrões sejam mantidos durante todo o processo de desenvolvimento e nas entregas do produto você também terá um papel importante no planejamento e desenvolvimento de conjuntos de testes automatizados local de trabalho remoto necessário mudança posterior para portugal cuidamos na emissão do visto e ajudamos com o processo de imigração ✅ requisitos da vaga obrigatórios metodologias ágeis como scrum e kanban liderando processos de garantia de qualidade e testes funcionais trabalhar com engenheiros de software para garantir que os níveis certos de automação de teste sejam criados de acordo com seus padrões testando aplicativos móveis e da web usando sistemas de controle de origem focado em automação muito importante ter experiência em javascript ou node js mínimo anos de experiência ser autodirigido e trabalhar com pouca supervisão em direção a uma equipe ou objetivo comum da empresa estar motivado para continuar aprendendo sobre automação de teste desejáveis criação de conjuntos de testes automatizados resilientes para exercitar aplicativos web e móveis com ferramentas como selenium espresso ui automator kif ou xctest forma de contratação pj recibos verdes ou clt cti benefícios da empresa seguro de vida e de saúde plano de carreira crescente assente em meritocracia formação certificada range salarial a definir 
mediante senioridadel sobre a empresa a yellowipe é uma consultoria da área de tecnologia presente no brasil e em portugal que pretende reforçar a sua equipe com qa automations trabalhamos com empresas nacionais e multinacionais em diversos setores de atividade que confiam em nós e no diferencial dos nossos profissionais como se candidatar por favor envie um email para recrutamento yellowipe io com seu cv anexado enviar no assunto vaga do qa automation tempo médio de feedbacks costumamos enviar feedbacks em até dias após cada processo e mail para contato em caso de não haver resposta recrutamento yellowipe io labels label é uma forma dos qas conseguirem pesquisar as vagas após cadastrar a vaga ela receberá label automaticamente em até minutos apague os itens abaixo que não corresponderem à vaga clt pj sênior especialista remoto exterior recrutador não feche essa issue caso a vaga não exista mais ou poderá ser bloqueado leia para saber o que fazer
1
289,470
21,784,775,072
IssuesEvent
2022-05-14 01:16:28
Equipment-and-Tool-Institute/j1939-84
https://api.github.com/repos/Equipment-and-Tool-Institute/j1939-84
closed
Support DEF Used This Operating Cycle in Table A-1
documentation Mtask8 DR
Support DEF Used This Operating Cycle in Table A-1 DEF Used This Operating Cycle in Table A-1 is needed for 2024 MY engines
1.0
Support DEF Used This Operating Cycle in Table A-1 - Support DEF Used This Operating Cycle in Table A-1 DEF Used This Operating Cycle in Table A-1 is needed for 2024 MY engines
non_process
support def used this operating cycle in table a support def used this operating cycle in table a def used this operating cycle in table a is needed for my engines
0
16,401
21,184,277,099
IssuesEvent
2022-04-08 11:02:55
paul-buerkner/brms
https://api.github.com/repos/paul-buerkner/brms
closed
error-adding-loo-criterion-to-brms-ordinal-model-with-link-cloglog
bug post-processing
https://discourse.mc-stan.org/t/error-adding-loo-criterion-to-brms-ordinal-model-with-link-cloglog/22600 The model looks fine but when I try to add loo or waic criterion I get the error `validate_ll(x) : All input values must be finite` I'm not convinced that the `cloglog` link is appropriate. I get fairly good results using the 'logit' link. The data are very skewed so I was hoping to check. I'm fairly confident that the `sratio` model is a good model of the data generating process. ``` df <- read.csv("test_error_validate_ll_ordinal.csv") df[,'wp10'] <- factor(df[,'wp10'],levels=c("stub",'start','c','b','ga','fa'),ordered=T) fam.cloglog <- sratio(link='cloglog', threshold='flexible') formula <- brmsformula(wp10 | weights(weight) ~ Stub + Start + B + FA + GA, decomp='QR',center=TRUE) mod <- brm(formula=formula.main.noC, data=df, family=fam.cloglog) mod <- add_criterion(model.main.noC.cloglog,'loo') ``` I have uploaded my dataset (n=2000, 258kb) here: https://teblunthuis.cc/outgoing/test_error_validate_ll_ordinal.csv * Operating System: centos 7.6.1810 * brms Version: 2.15.0 Thank you so much for the help!
1.0
error-adding-loo-criterion-to-brms-ordinal-model-with-link-cloglog - https://discourse.mc-stan.org/t/error-adding-loo-criterion-to-brms-ordinal-model-with-link-cloglog/22600 The model looks fine but when I try to add loo or waic criterion I get the error `validate_ll(x) : All input values must be finite` I'm not convinced that the `cloglog` link is appropriate. I get fairly good results using the 'logit' link. The data are very skewed so I was hoping to check. I'm fairly confident that the `sratio` model is a good model of the data generating process. ``` df <- read.csv("test_error_validate_ll_ordinal.csv") df[,'wp10'] <- factor(df[,'wp10'],levels=c("stub",'start','c','b','ga','fa'),ordered=T) fam.cloglog <- sratio(link='cloglog', threshold='flexible') formula <- brmsformula(wp10 | weights(weight) ~ Stub + Start + B + FA + GA, decomp='QR',center=TRUE) mod <- brm(formula=formula.main.noC, data=df, family=fam.cloglog) mod <- add_criterion(model.main.noC.cloglog,'loo') ``` I have uploaded my dataset (n=2000, 258kb) here: https://teblunthuis.cc/outgoing/test_error_validate_ll_ordinal.csv * Operating System: centos 7.6.1810 * brms Version: 2.15.0 Thank you so much for the help!
process
error adding loo criterion to brms ordinal model with link cloglog the model looks fine but when i try to add loo or waic criterion i get the error validate ll x all input values must be finite i m not convinced that the cloglog link is appropriate i get fairly good results using the logit link the data are very skewed so i was hoping to check i m fairly confident that the sratio model is a good model of the data generating process df read csv test error validate ll ordinal csv df factor df levels c stub start c b ga fa ordered t fam cloglog sratio link cloglog threshold flexible formula brmsformula weights weight stub start b fa ga decomp qr center true mod brm formula formula main noc data df family fam cloglog mod add criterion model main noc cloglog loo i have uploaded my dataset n here operating system centos brms version thank you so much for the help
1
679,040
23,219,572,438
IssuesEvent
2022-08-02 16:51:16
KinsonDigital/Infrastructure
https://api.github.com/repos/KinsonDigital/Infrastructure
closed
🚧Create re-usable workflow to resolve project files
high priority workflow
### I have done the items below . . . - [X] I have updated the title without removing the 🚧 emoji. ### Description Create a workflow with the name `resolve-charp-proj-file.yml` that can be used to resolve the path to a C# project file. ### Acceptance Criteria **This issue is finished when:** - [x] Workflow created ### ToDo Items - [X] Priority label added to this issue. Refer to the _**Priority Type Labels**_ section below. - [X] Change type labels added to this issue. Refer to the _**Change Type Labels**_ section below. - [X] Issue linked to the correct project. - [X] Issue linked to the correct milestone. - [x] Draft pull request created and linked to this issue. ### Issue Dependencies _No response_ ### Related Work _No response_ ### Additional Information: **_<details open><summary>Change Type Labels</summary>_** | Change Type | Label | |---------------------|--------------------------------------------------------------------------------------| | Bug Fixes | https://github.com/KinsonDigital/Infrastructure/labels/%F0%9F%90%9Bbug | | Breaking Changes | https://github.com/KinsonDigital/Infrastructure/labels/%F0%9F%A7%A8breaking%20changes | | Enhancement | https://github.com/KinsonDigital/Infrastructure/labels/enhancement | | Workflow Changes | https://github.com/KinsonDigital/Infrastructure/labels/workflow | | Code Doc Changes | https://github.com/KinsonDigital/Infrastructure/labels/%F0%9F%93%91documentation%2Fcode | | Product Doc Changes | https://github.com/KinsonDigital/Infrastructure/labels/%F0%9F%93%9Ddocumentation%2Fproduct | </details> **_<details open><summary>Priority Type Labels</summary>_** | Priority Type | Label | |---------------------|--------------------------------------------------------------------| | Low Priority | https://github.com/KinsonDigital/Infrastructure/labels/low%20priority | | Medium Priority | https://github.com/KinsonDigital/Infrastructure/labels/medium%20priority | | High Priority | 
https://github.com/KinsonDigital/Infrastructure/labels/high%20priority | </details> ### Code of Conduct - [X] I agree to follow this project's Code of Conduct.
1.0
🚧Create re-usable workflow to resolve project files - ### I have done the items below . . . - [X] I have updated the title without removing the 🚧 emoji. ### Description Create a workflow with the name `resolve-charp-proj-file.yml` that can be used to resolve the path to a C# project file. ### Acceptance Criteria **This issue is finished when:** - [x] Workflow created ### ToDo Items - [X] Priority label added to this issue. Refer to the _**Priority Type Labels**_ section below. - [X] Change type labels added to this issue. Refer to the _**Change Type Labels**_ section below. - [X] Issue linked to the correct project. - [X] Issue linked to the correct milestone. - [x] Draft pull request created and linked to this issue. ### Issue Dependencies _No response_ ### Related Work _No response_ ### Additional Information: **_<details open><summary>Change Type Labels</summary>_** | Change Type | Label | |---------------------|--------------------------------------------------------------------------------------| | Bug Fixes | https://github.com/KinsonDigital/Infrastructure/labels/%F0%9F%90%9Bbug | | Breaking Changes | https://github.com/KinsonDigital/Infrastructure/labels/%F0%9F%A7%A8breaking%20changes | | Enhancement | https://github.com/KinsonDigital/Infrastructure/labels/enhancement | | Workflow Changes | https://github.com/KinsonDigital/Infrastructure/labels/workflow | | Code Doc Changes | https://github.com/KinsonDigital/Infrastructure/labels/%F0%9F%93%91documentation%2Fcode | | Product Doc Changes | https://github.com/KinsonDigital/Infrastructure/labels/%F0%9F%93%9Ddocumentation%2Fproduct | </details> **_<details open><summary>Priority Type Labels</summary>_** | Priority Type | Label | |---------------------|--------------------------------------------------------------------| | Low Priority | https://github.com/KinsonDigital/Infrastructure/labels/low%20priority | | Medium Priority | https://github.com/KinsonDigital/Infrastructure/labels/medium%20priority | | High 
Priority | https://github.com/KinsonDigital/Infrastructure/labels/high%20priority | </details> ### Code of Conduct - [X] I agree to follow this project's Code of Conduct.
non_process
🚧create re usable workflow to resolve project files i have done the items below i have updated the title without removing the 🚧 emoji description create a workflow with the name resolve charp proj file yml that can be used to resolve the path to a c project file acceptance criteria this issue is finished when workflow created todo items priority label added to this issue refer to the priority type labels section below change type labels added to this issue refer to the change type labels section below issue linked to the correct project issue linked to the correct milestone draft pull request created and linked to this issue issue dependencies no response related work no response additional information change type labels change type label bug fixes breaking changes enhancement workflow changes code doc changes product doc changes priority type labels priority type label low priority medium priority high priority code of conduct i agree to follow this project s code of conduct
0
7,063
10,219,207,715
IssuesEvent
2019-08-15 17:58:37
toggl/mobileapp
https://api.github.com/repos/toggl/mobileapp
closed
Write localization framework doc
process
This doc should explain how the whole translation workflow works. It should: - explain the idea of the translations being crowdsourced - explain how we'll use a project(s) for the translations - mention the translation guide from #4834 - how we organize our `Resource.<language-code>.resx` files (and comments inside) - explain when and how a translation PR can be merged back into develop
1.0
Write localization framework doc - This doc should explain how the whole translation workflow works. It should: - explain the idea of the translations being crowdsourced - explain how we'll use a project(s) for the translations - mention the translation guide from #4834 - how we organize our `Resource.<language-code>.resx` files (and comments inside) - explain when and how a translation PR can be merged back into develop
process
write localization framework doc this doc should explain how the whole translation workflow works it should explain the idea of the translations being crowdsourced explain how we ll use a project s for the translations mention the translation guide from how we organize our resource resx files and comments inside explain when and how a translation pr can be merged back into develop
1
13,321
15,786,571,084
IssuesEvent
2021-04-01 17:56:36
hasura/ask-me-anything
https://api.github.com/repos/hasura/ask-me-anything
closed
What's the history behind the devil looking logo of Hasura?
processing-for-shortvid question
The company name Hasura is a portmanteu of Haskell and Asura. * [Haskell](https://en.wikipedia.org/wiki/Haskell_%28programming_language%29) being the language stack that Hasura is built in. * [Asura](https://en.wikipedia.org/wiki/Asura) in Hindu mythology, is a class of beings defined by their opposition to the devas or suras.
1.0
What's the history behind the devil looking logo of Hasura? - The company name Hasura is a portmanteu of Haskell and Asura. * [Haskell](https://en.wikipedia.org/wiki/Haskell_%28programming_language%29) being the language stack that Hasura is built in. * [Asura](https://en.wikipedia.org/wiki/Asura) in Hindu mythology, is a class of beings defined by their opposition to the devas or suras.
process
what s the history behind the devil looking logo of hasura the company name hasura is a portmanteu of haskell and asura being the language stack that hasura is built in in hindu mythology is a class of beings defined by their opposition to the devas or suras
1
333,373
29,577,979,663
IssuesEvent
2023-06-07 01:39:54
Joystream/pioneer
https://api.github.com/repos/Joystream/pioneer
closed
Verified status: no visual representation in Pioneer UI
enhancement high-prio community-dev qa-tested-ready-for-prod release:1.5.0
Currently there is no visual representation of a user having the `verified: true` flag on their membership on Pioneer, this includes: - Member directory - Forums - Proposals I wouldn't say this is a high-priority thing to fix, but given the coming new Staking UI/UX and the fact that `jsgenesis` and `Gleev` memberships are now verified (https://pioneerapp.xyz/#/proposals/preview/219) this should ideally be implemented.
1.0
Verified status: no visual representation in Pioneer UI - Currently there is no visual representation of a user having the `verified: true` flag on their membership on Pioneer, this includes: - Member directory - Forums - Proposals I wouldn't say this is a high-priority thing to fix, but given the coming new Staking UI/UX and the fact that `jsgenesis` and `Gleev` memberships are now verified (https://pioneerapp.xyz/#/proposals/preview/219) this should ideally be implemented.
non_process
verified status no visual representation in pioneer ui currently there is no visual representation of a user having the verified true flag on their membership on pioneer this includes member directory forums proposals i wouldn t say this is a high priority thing to fix but given the coming new staking ui ux and the fact that jsgenesis and gleev memberships are now verified this should ideally be implemented
0
29,868
13,179,720,779
IssuesEvent
2020-08-12 11:29:13
terraform-providers/terraform-provider-aws
https://api.github.com/repos/terraform-providers/terraform-provider-aws
closed
Terraform does not support event_bus_name in aws_cloudwatch_event_rule
enhancement service/cloudwatch service/cloudwatchevents service/eventbridge
<!--- Please keep this note for the community ---> ### Community Note * Please vote on this issue by adding a 👍 [reaction](https://blog.github.com/2016-03-10-add-reactions-to-pull-requests-issues-and-comments/) to the original issue to help the community and maintainers prioritize this request * Please do not leave "+1" or other comments that do not add relevant new information or questions, they generate extra noise for issue followers and do not help prioritize the request * If you are interested in working on this issue or have submitted a pull request, please leave a comment <!--- Thank you for keeping this note for the community ---> ### Description `aws_cloudwatch_event_rule` does not support `event_bus_name` parameter. Also, terraform does not support creating a custom event bus. ### New or Affected Resource(s) <!--- Please list the new or affected resources and data sources. ---> * aws_cloudwatch_event_rule * aws_event_bus ### Potential Terraform Configuration <!--- Information about code formatting: https://help.github.com/articles/basic-writing-and-formatting-syntax/#quoting-code ---> ```hcl resource "aws_event_bus" "custom_bus" { name = "custom-bus" tags = {} } resource "aws_cloudwatch_event_rule" "console" { name = "capture-aws-sign-in" description = "Capture each AWS Console Sign In" event_pattern = <<EOF { "detail-type": [ "AWS Console Sign In via CloudTrail" ] } EOF event_bus_name = aws_event_bus.custom_bus.name } ``` ### References <!--- Information about referencing Github Issues: https://help.github.com/articles/basic-writing-and-formatting-syntax/#referencing-issues-and-pull-requests Are there any other GitHub issues (open or closed) or pull requests that should be linked here? Vendor blog posts or documentation? 
For example: * https://aws.amazon.com/about-aws/whats-new/2018/04/introducing-amazon-ec2-fleet/ ---> * https://docs.aws.amazon.com/cli/latest/reference/events/create-event-bus.html * https://docs.aws.amazon.com/cli/latest/reference/events/put-rule.html
3.0
Terraform does not support event_bus_name in aws_cloudwatch_event_rule - <!--- Please keep this note for the community ---> ### Community Note * Please vote on this issue by adding a 👍 [reaction](https://blog.github.com/2016-03-10-add-reactions-to-pull-requests-issues-and-comments/) to the original issue to help the community and maintainers prioritize this request * Please do not leave "+1" or other comments that do not add relevant new information or questions, they generate extra noise for issue followers and do not help prioritize the request * If you are interested in working on this issue or have submitted a pull request, please leave a comment <!--- Thank you for keeping this note for the community ---> ### Description `aws_cloudwatch_event_rule` does not support `event_bus_name` parameter. Also, terraform does not support creating a custom event bus. ### New or Affected Resource(s) <!--- Please list the new or affected resources and data sources. ---> * aws_cloudwatch_event_rule * aws_event_bus ### Potential Terraform Configuration <!--- Information about code formatting: https://help.github.com/articles/basic-writing-and-formatting-syntax/#quoting-code ---> ```hcl resource "aws_event_bus" "custom_bus" { name = "custom-bus" tags = {} } resource "aws_cloudwatch_event_rule" "console" { name = "capture-aws-sign-in" description = "Capture each AWS Console Sign In" event_pattern = <<EOF { "detail-type": [ "AWS Console Sign In via CloudTrail" ] } EOF event_bus_name = aws_event_bus.custom_bus.name } ``` ### References <!--- Information about referencing Github Issues: https://help.github.com/articles/basic-writing-and-formatting-syntax/#referencing-issues-and-pull-requests Are there any other GitHub issues (open or closed) or pull requests that should be linked here? Vendor blog posts or documentation? 
For example: * https://aws.amazon.com/about-aws/whats-new/2018/04/introducing-amazon-ec2-fleet/ ---> * https://docs.aws.amazon.com/cli/latest/reference/events/create-event-bus.html * https://docs.aws.amazon.com/cli/latest/reference/events/put-rule.html
non_process
terraform does not support event bus name in aws cloudwatch event rule community note please vote on this issue by adding a 👍 to the original issue to help the community and maintainers prioritize this request please do not leave or other comments that do not add relevant new information or questions they generate extra noise for issue followers and do not help prioritize the request if you are interested in working on this issue or have submitted a pull request please leave a comment description aws cloudwatch event rule does not support event bus name parameter also terraform does not support creating a custom event bus new or affected resource s aws cloudwatch event rule aws event bus potential terraform configuration hcl resource aws event bus custom bus name custom bus tags resource aws cloudwatch event rule console name capture aws sign in description capture each aws console sign in event pattern eof detail type aws console sign in via cloudtrail eof event bus name aws event bus custom bus name references information about referencing github issues are there any other github issues open or closed or pull requests that should be linked here vendor blog posts or documentation for example
0
135,434
10,987,084,746
IssuesEvent
2019-12-02 08:26:34
dzhw/SLC-IntEr
https://api.github.com/repos/dzhw/SLC-IntEr
closed
Befragungseinstieg Mobile Endgeräte
Hohe Priorität bug testing
Auf mobilen Endgeräten wird man zur Zeit nach Eingabe des Tokens direkt auf die Seite st0001.html geleitet. Der komplette erste Befragungsteil (Einstieg/beg) sowie das Kalendarium werden übersprungen.
1.0
Befragungseinstieg Mobile Endgeräte - Auf mobilen Endgeräten wird man zur Zeit nach Eingabe des Tokens direkt auf die Seite st0001.html geleitet. Der komplette erste Befragungsteil (Einstieg/beg) sowie das Kalendarium werden übersprungen.
non_process
befragungseinstieg mobile endgeräte auf mobilen endgeräten wird man zur zeit nach eingabe des tokens direkt auf die seite html geleitet der komplette erste befragungsteil einstieg beg sowie das kalendarium werden übersprungen
0
5,927
8,751,241,809
IssuesEvent
2018-12-13 21:41:43
de-ai/designengine.ai
https://api.github.com/repos/de-ai/designengine.ai
closed
[Processing] Processing is processing the wrong file IF it was not closed correctly
Processing
example... http://earlyaccess.designengine.ai/proj/2/design-engine-v11 This file isn't the file I uploaded
1.0
[Processing] Processing is processing the wrong file IF it was not closed correctly - example... http://earlyaccess.designengine.ai/proj/2/design-engine-v11 This file isn't the file I uploaded
process
processing is processing the wrong file if it was not closed correctly example this file isn t the file i uploaded
1
12,191
14,742,299,477
IssuesEvent
2021-01-07 12:03:12
kdjstudios/SABillingGitlab
https://api.github.com/repos/kdjstudios/SABillingGitlab
closed
Orlando 045- Missing Payment
anc-process anp-0.5 ant-enhancement
In GitLab by @kdjstudios on Apr 3, 2019, 11:50 **Submitted by:** "David Difo" <david.difo@answernet.com> **Helpdesk:** http://www.servicedesk.answernet.com/profiles/ticket/2019-04-03-25939/conversation **Server:** Internal **Client/Site:** Orlando **Account:** **Issue:** We have client Ficurma 045-S8762 who has reached out stating E-check payment processed on 2/17/19 does not show under bank statement. When looking at SAB I noticed the payment did go through and it was applied to their account. Client wants to make sure the payment did go through, can we confirm the payment was taken from the same account used for E-check payment on 3/16/19? The customer just wants to double check we are paid up as well, I let them know the payment did go through but they still wanted me to reach out and make sure the account was the same and to double check. Your assistance would be appreciated.
1.0
Orlando 045- Missing Payment - In GitLab by @kdjstudios on Apr 3, 2019, 11:50 **Submitted by:** "David Difo" <david.difo@answernet.com> **Helpdesk:** http://www.servicedesk.answernet.com/profiles/ticket/2019-04-03-25939/conversation **Server:** Internal **Client/Site:** Orlando **Account:** **Issue:** We have client Ficurma 045-S8762 who has reached out stating E-check payment processed on 2/17/19 does not show under bank statement. When looking at SAB I noticed the payment did go through and it was applied to their account. Client wants to make sure the payment did go through, can we confirm the payment was taken from the same account used for E-check payment on 3/16/19? The customer just wants to double check we are paid up as well, I let them know the payment did go through but they still wanted me to reach out and make sure the account was the same and to double check. Your assistance would be appreciated.
process
orlando missing payment in gitlab by kdjstudios on apr submitted by david difo helpdesk server internal client site orlando account issue we have client ficurma who has reached out stating e check payment processed on does not show under bank statement when looking at sab i noticed the payment did go through and it was applied to their account client wants to make sure the payment did go through can we confirm the payment was taken from the same account used for e check payment on the customer just wants to double check we are paid up as well i let them know the payment did go through but they still wanted me to reach out and make sure the account was the same and to double check your assistance would be appreciated
1
12,571
14,986,015,927
IssuesEvent
2021-01-28 20:41:35
qgis/QGIS
https://api.github.com/repos/qgis/QGIS
closed
summary staistics not found in "join atributes by location" function into QGIS 3.10.5 version
Bug Feedback Processing
![Error qgis 2](https://user-images.githubusercontent.com/78165726/106174508-df466d00-6173-11eb-9b75-5dbcafbbc1f7.png) ![Error qgis 1](https://user-images.githubusercontent.com/78165726/106174533-e66d7b00-6173-11eb-9a6a-9189ea040c07.png)
1.0
summary staistics not found in "join atributes by location" function into QGIS 3.10.5 version - ![Error qgis 2](https://user-images.githubusercontent.com/78165726/106174508-df466d00-6173-11eb-9b75-5dbcafbbc1f7.png) ![Error qgis 1](https://user-images.githubusercontent.com/78165726/106174533-e66d7b00-6173-11eb-9a6a-9189ea040c07.png)
process
summary staistics not found in join atributes by location function into qgis version
1
286,243
31,468,093,685
IssuesEvent
2023-08-30 04:52:06
UpendoVentures/generator-upendodnn
https://api.github.com/repos/UpendoVentures/generator-upendodnn
closed
CVE-2018-14042 (Medium) detected in bootstrap-3.3.5.min.js, bootstrap-3.3.5.js - autoclosed
Mend: dependency security vulnerability
## CVE-2018-14042 - Medium Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Libraries - <b>bootstrap-3.3.5.min.js</b>, <b>bootstrap-3.3.5.js</b></p></summary> <p> <details><summary><b>bootstrap-3.3.5.min.js</b></p></summary> <p>The most popular front-end framework for developing responsive, mobile first projects on the web.</p> <p>Library home page: <a href="https://cdnjs.cloudflare.com/ajax/libs/twitter-bootstrap/3.3.5/js/bootstrap.min.js">https://cdnjs.cloudflare.com/ajax/libs/twitter-bootstrap/3.3.5/js/bootstrap.min.js</a></p> <p>Path to vulnerable library: /generators/mvc-spa/templates/Scripts/bootstrap/3.3.5/js/bootstrap.min.js</p> <p> Dependency Hierarchy: - :x: **bootstrap-3.3.5.min.js** (Vulnerable Library) </details> <details><summary><b>bootstrap-3.3.5.js</b></p></summary> <p>The most popular front-end framework for developing responsive, mobile first projects on the web.</p> <p>Library home page: <a href="https://cdnjs.cloudflare.com/ajax/libs/twitter-bootstrap/3.3.5/js/bootstrap.js">https://cdnjs.cloudflare.com/ajax/libs/twitter-bootstrap/3.3.5/js/bootstrap.js</a></p> <p>Path to vulnerable library: /generators/mvc-spa/templates/Scripts/bootstrap/3.3.5/js/bootstrap.js</p> <p> Dependency Hierarchy: - :x: **bootstrap-3.3.5.js** (Vulnerable Library) </details> <p>Found in HEAD commit: <a href="https://github.com/UpendoVentures/generator-upendodnn/commit/1c68c9a9ea9734a0a208d80999c86ff1564255dc">1c68c9a9ea9734a0a208d80999c86ff1564255dc</a></p> <p>Found in base branch: <b>master</b></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png?' width=19 height=20> Vulnerability Details</summary> <p> In Bootstrap before 4.1.2, XSS is possible in the data-container property of tooltip. 
<p>Publish Date: 2018-07-13 <p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2018-14042>CVE-2018-14042</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>6.1</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: Low - Privileges Required: None - User Interaction: Required - Scope: Changed - Impact Metrics: - Confidentiality Impact: Low - Integrity Impact: Low - Availability Impact: None </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Release Date: 2018-07-13</p> <p>Fix Resolution: org.webjars.npm:bootstrap:4.1.2.org.webjars:bootstrap:3.4.0</p> </p> </details> <p></p> *** Step up your Open Source Security Game with Mend [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
True
CVE-2018-14042 (Medium) detected in bootstrap-3.3.5.min.js, bootstrap-3.3.5.js - autoclosed - ## CVE-2018-14042 - Medium Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Libraries - <b>bootstrap-3.3.5.min.js</b>, <b>bootstrap-3.3.5.js</b></p></summary> <p> <details><summary><b>bootstrap-3.3.5.min.js</b></p></summary> <p>The most popular front-end framework for developing responsive, mobile first projects on the web.</p> <p>Library home page: <a href="https://cdnjs.cloudflare.com/ajax/libs/twitter-bootstrap/3.3.5/js/bootstrap.min.js">https://cdnjs.cloudflare.com/ajax/libs/twitter-bootstrap/3.3.5/js/bootstrap.min.js</a></p> <p>Path to vulnerable library: /generators/mvc-spa/templates/Scripts/bootstrap/3.3.5/js/bootstrap.min.js</p> <p> Dependency Hierarchy: - :x: **bootstrap-3.3.5.min.js** (Vulnerable Library) </details> <details><summary><b>bootstrap-3.3.5.js</b></p></summary> <p>The most popular front-end framework for developing responsive, mobile first projects on the web.</p> <p>Library home page: <a href="https://cdnjs.cloudflare.com/ajax/libs/twitter-bootstrap/3.3.5/js/bootstrap.js">https://cdnjs.cloudflare.com/ajax/libs/twitter-bootstrap/3.3.5/js/bootstrap.js</a></p> <p>Path to vulnerable library: /generators/mvc-spa/templates/Scripts/bootstrap/3.3.5/js/bootstrap.js</p> <p> Dependency Hierarchy: - :x: **bootstrap-3.3.5.js** (Vulnerable Library) </details> <p>Found in HEAD commit: <a href="https://github.com/UpendoVentures/generator-upendodnn/commit/1c68c9a9ea9734a0a208d80999c86ff1564255dc">1c68c9a9ea9734a0a208d80999c86ff1564255dc</a></p> <p>Found in base branch: <b>master</b></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png?' width=19 height=20> Vulnerability Details</summary> <p> In Bootstrap before 4.1.2, XSS is possible in the data-container property of tooltip. 
<p>Publish Date: 2018-07-13 <p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2018-14042>CVE-2018-14042</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>6.1</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: Low - Privileges Required: None - User Interaction: Required - Scope: Changed - Impact Metrics: - Confidentiality Impact: Low - Integrity Impact: Low - Availability Impact: None </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Release Date: 2018-07-13</p> <p>Fix Resolution: org.webjars.npm:bootstrap:4.1.2.org.webjars:bootstrap:3.4.0</p> </p> </details> <p></p> *** Step up your Open Source Security Game with Mend [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
non_process
cve medium detected in bootstrap min js bootstrap js autoclosed cve medium severity vulnerability vulnerable libraries bootstrap min js bootstrap js bootstrap min js the most popular front end framework for developing responsive mobile first projects on the web library home page a href path to vulnerable library generators mvc spa templates scripts bootstrap js bootstrap min js dependency hierarchy x bootstrap min js vulnerable library bootstrap js the most popular front end framework for developing responsive mobile first projects on the web library home page a href path to vulnerable library generators mvc spa templates scripts bootstrap js bootstrap js dependency hierarchy x bootstrap js vulnerable library found in head commit a href found in base branch master vulnerability details in bootstrap before xss is possible in the data container property of tooltip publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction required scope changed impact metrics confidentiality impact low integrity impact low availability impact none for more information on scores click a href suggested fix type upgrade version release date fix resolution org webjars npm bootstrap org webjars bootstrap step up your open source security game with mend
0
474,965
13,685,163,147
IssuesEvent
2020-09-30 06:38:21
shahednasser/sbuttons
https://api.github.com/repos/shahednasser/sbuttons
opened
Change "btn-disabled" to "disabled-btn"
Hacktoberfest Priority: Medium buttons enhancement good first issue help wanted up-for-grabs
Now we have a new button type "Disabled" but its class name does not follow the convention we are using for the website. We just need to change the class name from "btn-disabled" to "disabled-btn"
1.0
Change "btn-disabled" to "disabled-btn" - Now we have a new button type "Disabled" but its class name does not follow the convention we are using for the website. We just need to change the class name from "btn-disabled" to "disabled-btn"
non_process
change btn disabled to disabled btn now we have a new button type disabled but its class name does not follow the convention we are using for the website we just need to change the class name from btn disabled to disabled btn
0
20,209
26,796,273,489
IssuesEvent
2023-02-01 12:08:18
threefoldtech/tfchain
https://api.github.com/repos/threefoldtech/tfchain
closed
Add disks field to nodes
process_wontfix type_feature
To accommodate https://github.com/threefoldtech/zos/issues/1830, TF Chain's node object will need a data field that stores a list of disks. Something like: ``` node { disks = [disk, ...] } disk { type = "SSD" or "HDD" size = capacity } ```
1.0
Add disks field to nodes - To accommodate https://github.com/threefoldtech/zos/issues/1830, TF Chain's node object will need a data field that stores a list of disks. Something like: ``` node { disks = [disk, ...] } disk { type = "SSD" or "HDD" size = capacity } ```
process
add disks field to nodes to accommodate tf chain s node object will need a data field that stores a list of disks something like node disks disk type ssd or hdd size capacity
1
21,937
30,446,798,339
IssuesEvent
2023-07-15 19:28:29
h4sh5/pypi-auto-scanner
https://api.github.com/repos/h4sh5/pypi-auto-scanner
opened
pyutils 0.0.1a1 has 2 GuardDog issues
guarddog typosquatting silent-process-execution
https://pypi.org/project/pyutils https://inspector.pypi.io/project/pyutils ```{ "dependency": "pyutils", "version": "0.0.1a1", "result": { "issues": 2, "errors": {}, "results": { "typosquatting": "This package closely ressembles the following package names, and might be a typosquatting attempt: python-utils, pytils", "silent-process-execution": [ { "location": "pyutils-0.0.1a1/src/pyutils/exec_utils.py:200", "code": " subproc = subprocess.Popen(\n args,\n stdin=subprocess.DEVNULL,\n stdout=subprocess.DEVNULL,\n stderr=subprocess.DEVNULL,\n )", "message": "This package is silently executing an external binary, redirecting stdout, stderr and stdin to /dev/null" } ] }, "path": "/tmp/tmpezxidgn_/pyutils" } }```
1.0
pyutils 0.0.1a1 has 2 GuardDog issues - https://pypi.org/project/pyutils https://inspector.pypi.io/project/pyutils ```{ "dependency": "pyutils", "version": "0.0.1a1", "result": { "issues": 2, "errors": {}, "results": { "typosquatting": "This package closely ressembles the following package names, and might be a typosquatting attempt: python-utils, pytils", "silent-process-execution": [ { "location": "pyutils-0.0.1a1/src/pyutils/exec_utils.py:200", "code": " subproc = subprocess.Popen(\n args,\n stdin=subprocess.DEVNULL,\n stdout=subprocess.DEVNULL,\n stderr=subprocess.DEVNULL,\n )", "message": "This package is silently executing an external binary, redirecting stdout, stderr and stdin to /dev/null" } ] }, "path": "/tmp/tmpezxidgn_/pyutils" } }```
process
pyutils has guarddog issues dependency pyutils version result issues errors results typosquatting this package closely ressembles the following package names and might be a typosquatting attempt python utils pytils silent process execution location pyutils src pyutils exec utils py code subproc subprocess popen n args n stdin subprocess devnull n stdout subprocess devnull n stderr subprocess devnull n message this package is silently executing an external binary redirecting stdout stderr and stdin to dev null path tmp tmpezxidgn pyutils
1
14,002
16,773,267,617
IssuesEvent
2021-06-14 17:22:04
googleapis/python-cloud-core
https://api.github.com/repos/googleapis/python-cloud-core
closed
Add 'black' to nox for consistency
type: process
The codebase is not in line with our current standards: it does not use `black` to ensure consistent code style.
1.0
Add 'black' to nox for consistency - The codebase is not in line with our current standards: it does not use `black` to ensure consistent code style.
process
add black to nox for consistency the codebase is not in line with our current standards it does not use black to ensure consistent code style
1
216,360
24,278,525,331
IssuesEvent
2022-09-28 15:28:30
billmcchesney1/davinci
https://api.github.com/repos/billmcchesney1/davinci
reopened
CVE-2020-7656 (Medium) detected in jquery-1.7.1.min.js
security vulnerability
## CVE-2020-7656 - Medium Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>jquery-1.7.1.min.js</b></p></summary> <p>JavaScript library for DOM operations</p> <p>Library home page: <a href="https://cdnjs.cloudflare.com/ajax/libs/jquery/1.7.1/jquery.min.js">https://cdnjs.cloudflare.com/ajax/libs/jquery/1.7.1/jquery.min.js</a></p> <p>Path to dependency file: /node_modules/sockjs/examples/express-3.x/index.html</p> <p>Path to vulnerable library: /node_modules/sockjs/examples/express-3.x/index.html,/node_modules/sockjs/examples/echo/index.html,/node_modules/sockjs/examples/multiplex/index.html,/node_modules/sockjs/examples/hapi/html/index.html</p> <p> Dependency Hierarchy: - :x: **jquery-1.7.1.min.js** (Vulnerable Library) <p>Found in HEAD commit: <a href="https://github.com/billmcchesney1/davinci/commit/40c2bf435e91fe54575f47a7d13aef79ce92ae42">40c2bf435e91fe54575f47a7d13aef79ce92ae42</a></p> <p>Found in base branch: <b>master</b></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary> <p> jquery prior to 1.9.0 allows Cross-site Scripting attacks via the load method. The load method fails to recognize and remove "<script>" HTML tags that contain a whitespace character, i.e: "</script >", which results in the enclosed script logic to be executed. 
<p>Publish Date: 2020-05-19 <p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-7656>CVE-2020-7656</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>6.1</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: Low - Privileges Required: None - User Interaction: Required - Scope: Changed - Impact Metrics: - Confidentiality Impact: Low - Integrity Impact: Low - Availability Impact: None </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://github.com/advisories/GHSA-q4m3-2j7h-f7xw">https://github.com/advisories/GHSA-q4m3-2j7h-f7xw</a></p> <p>Release Date: 2020-05-19</p> <p>Fix Resolution: jquery - 1.9.0</p> </p> </details> <p></p>
True
CVE-2020-7656 (Medium) detected in jquery-1.7.1.min.js - ## CVE-2020-7656 - Medium Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>jquery-1.7.1.min.js</b></p></summary> <p>JavaScript library for DOM operations</p> <p>Library home page: <a href="https://cdnjs.cloudflare.com/ajax/libs/jquery/1.7.1/jquery.min.js">https://cdnjs.cloudflare.com/ajax/libs/jquery/1.7.1/jquery.min.js</a></p> <p>Path to dependency file: /node_modules/sockjs/examples/express-3.x/index.html</p> <p>Path to vulnerable library: /node_modules/sockjs/examples/express-3.x/index.html,/node_modules/sockjs/examples/echo/index.html,/node_modules/sockjs/examples/multiplex/index.html,/node_modules/sockjs/examples/hapi/html/index.html</p> <p> Dependency Hierarchy: - :x: **jquery-1.7.1.min.js** (Vulnerable Library) <p>Found in HEAD commit: <a href="https://github.com/billmcchesney1/davinci/commit/40c2bf435e91fe54575f47a7d13aef79ce92ae42">40c2bf435e91fe54575f47a7d13aef79ce92ae42</a></p> <p>Found in base branch: <b>master</b></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary> <p> jquery prior to 1.9.0 allows Cross-site Scripting attacks via the load method. The load method fails to recognize and remove "<script>" HTML tags that contain a whitespace character, i.e: "</script >", which results in the enclosed script logic to be executed. 
<p>Publish Date: 2020-05-19 <p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-7656>CVE-2020-7656</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>6.1</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: Low - Privileges Required: None - User Interaction: Required - Scope: Changed - Impact Metrics: - Confidentiality Impact: Low - Integrity Impact: Low - Availability Impact: None </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://github.com/advisories/GHSA-q4m3-2j7h-f7xw">https://github.com/advisories/GHSA-q4m3-2j7h-f7xw</a></p> <p>Release Date: 2020-05-19</p> <p>Fix Resolution: jquery - 1.9.0</p> </p> </details> <p></p>
non_process
cve medium detected in jquery min js cve medium severity vulnerability vulnerable library jquery min js javascript library for dom operations library home page a href path to dependency file node modules sockjs examples express x index html path to vulnerable library node modules sockjs examples express x index html node modules sockjs examples echo index html node modules sockjs examples multiplex index html node modules sockjs examples hapi html index html dependency hierarchy x jquery min js vulnerable library found in head commit a href found in base branch master vulnerability details jquery prior to allows cross site scripting attacks via the load method the load method fails to recognize and remove html tags that contain a whitespace character i e which results in the enclosed script logic to be executed publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction required scope changed impact metrics confidentiality impact low integrity impact low availability impact none for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution jquery
0
411,397
27,821,180,296
IssuesEvent
2023-03-19 08:54:40
jbloomAus/DecisionTransformerInterpretability
https://api.github.com/repos/jbloomAus/DecisionTransformerInterpretability
opened
Write much better tutorials for the repo.
documentation
I've explained how to use the repo in the docs, but this could be way better. Some examples of pages what might be useful to add to the sphinx docs. 1. A page explaining what Minigrid is, and where to look at each MiniGrid environment (maybe include a table of environments by row with columns for different models (dt, bc, and ppo-traj, ppo standard). 2. A page explaining the difference between the online method (PPO) and offline methods, explaining why we need the trajectories for the latter. 3. A page/diagram explaining how trajectories are generated with the ppo method used to train offline agents. 4. MOST IMPORTANT: pages for stuff you think I won't have thought to explain since I wrote the repo.
1.0
Write much better tutorials for the repo. - I've explained how to use the repo in the docs, but this could be way better. Some examples of pages what might be useful to add to the sphinx docs. 1. A page explaining what Minigrid is, and where to look at each MiniGrid environment (maybe include a table of environments by row with columns for different models (dt, bc, and ppo-traj, ppo standard). 2. A page explaining the difference between the online method (PPO) and offline methods, explaining why we need the trajectories for the latter. 3. A page/diagram explaining how trajectories are generated with the ppo method used to train offline agents. 4. MOST IMPORTANT: pages for stuff you think I won't have thought to explain since I wrote the repo.
non_process
write much better tutorials for the repo i ve explained how to use the repo in the docs but this could be way better some examples of pages what might be useful to add to the sphinx docs a page explaining what minigrid is and where to look at each minigrid environment maybe include a table of environments by row with columns for different models dt bc and ppo traj ppo standard a page explaining the difference between the online method ppo and offline methods explaining why we need the trajectories for the latter a page diagram explaining how trajectories are generated with the ppo method used to train offline agents most important pages for stuff you think i won t have thought to explain since i wrote the repo
0
747,806
26,099,228,408
IssuesEvent
2022-12-27 03:33:36
simonbaird/tiddlyhost
https://api.github.com/repos/simonbaird/tiddlyhost
closed
Site backups and "restore previous version" functionality
priority-high
As I'm now beginning to use TH more seriously to develop stuff, I need a backup feature like in TS. Maybe I'm just not seeing something in front of my nose? IMO the most suitable place for it would be Your sites page or inside the Settings.
1.0
Site backups and "restore previous version" functionality - As I'm now beginning to use TH more seriously to develop stuff, I need a backup feature like in TS. Maybe I'm just not seeing something in front of my nose? IMO the most suitable place for it would be Your sites page or inside the Settings.
non_process
site backups and restore previous version functionality as i m now beginning to use th more seriously to develop stuff i need a backup feature like in ts maybe i m just not seeing something in front of my nose imo the most suitable place for it would be your sites page or inside the settings
0
9,807
12,819,943,821
IssuesEvent
2020-07-06 04:06:54
asyml/forte
https://api.github.com/repos/asyml/forte
closed
StanfordNLP sentence segmenter bug
bug priority: medium topic: processors
While trying to find sentence boundaries, the technique to find the sentence ending can fail. We are using `find` which gives the first occurrence of a word in a sentence. This will definitely fail when there are 2 duplicate words in a sentence. https://github.com/asyml/forte/blob/master/forte/processors/stanfordnlp_processor.py#L72
1.0
StanfordNLP sentence segmenter bug - While trying to find sentence boundaries, the technique to find the sentence ending can fail. We are using `find` which gives the first occurrence of a word in a sentence. This will definitely fail when there are 2 duplicate words in a sentence. https://github.com/asyml/forte/blob/master/forte/processors/stanfordnlp_processor.py#L72
process
stanfordnlp sentence segmenter bug while trying to find sentence boundaries the technique to find the sentence ending can fail we are using find which gives the first occurrence of a word in a sentence this will definitely fail when there are duplicate words in a sentence
1
143,542
11,568,861,940
IssuesEvent
2020-02-20 16:33:42
RPTools/maptool
https://api.github.com/repos/RPTools/maptool
closed
input() PROPS option no longer works.
bug feature macro changes tested
**Describe the bug** I'm getting an error in alpha 6 that I don't get in 1.5.12. Apparently the PROPS option no longer works. Gives null pointer exception error. **To Reproduce** [input("test|one=1;two=2;|Test|props")] **Expected behavior** Should give me fields "one" and "two" with their values. **MapTool Info** - Version: 1.6.0-alpha-6 **Other** I thought alpha 6 and 1.5.12 was the same build, but with html5 stuff. I do recall mention a change to input() to include json options. Maybe that change was implemented in alpha 6 and has a bug in it.
1.0
input() PROPS option no longer works. - **Describe the bug** I'm getting an error in alpha 6 that I don't get in 1.5.12. Apparently the PROPS option no longer works. Gives null pointer exception error. **To Reproduce** [input("test|one=1;two=2;|Test|props")] **Expected behavior** Should give me fields "one" and "two" with their values. **MapTool Info** - Version: 1.6.0-alpha-6 **Other** I thought alpha 6 and 1.5.12 was the same build, but with html5 stuff. I do recall mention a change to input() to include json options. Maybe that change was implemented in alpha 6 and has a bug in it.
non_process
input props option no longer works describe the bug i m getting an error in alpha that i don t get in apparently the props option no longer works gives null pointer exception error to reproduce expected behavior should give me fields one and two with their values maptool info version alpha other i thought alpha and was the same build but with stuff i do recall mention a change to input to include json options maybe that change was implemented in alpha and has a bug in it
0
6,437
9,539,163,714
IssuesEvent
2019-04-30 16:15:45
material-components/material-components-ios
https://api.github.com/repos/material-components/material-components-ios
closed
Internal issue:: b/130723794
type:Process
This was filed as an internal issue. If you are a Googler, please visit [b/130723794](http://b/130723794) for more details. <!-- Auto-generated content below, do not modify --> --- #### Internal data - Associated internal bug: [b/130723794](http://b/130723794)
1.0
Internal issue:: b/130723794 - This was filed as an internal issue. If you are a Googler, please visit [b/130723794](http://b/130723794) for more details. <!-- Auto-generated content below, do not modify --> --- #### Internal data - Associated internal bug: [b/130723794](http://b/130723794)
process
internal issue b this was filed as an internal issue if you are a googler please visit for more details internal data associated internal bug
1
276,857
30,557,551,707
IssuesEvent
2023-07-20 12:42:53
nagyesta/abort-mission-maven-plugin
https://api.github.com/repos/nagyesta/abort-mission-maven-plugin
closed
jszip-3.7.1.js: 1 vulnerabilities (highest severity is: 7.3)
Mend: dependency security vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>jszip-3.7.1.js</b></p></summary> <p>Create, read and edit .zip files with Javascript http://stuartk.com/jszip</p> <p>Library home page: <a href="https://cdnjs.cloudflare.com/ajax/libs/jszip/3.7.1/jszip.js">https://cdnjs.cloudflare.com/ajax/libs/jszip/3.7.1/jszip.js</a></p> <p>Path to vulnerable library: /target/apidocs/jquery/jszip/dist/jszip.js</p> <p> <p>Found in HEAD commit: <a href="https://github.com/nagyesta/abort-mission-maven-plugin/commit/5c53435c31066f354f194a10a308b5afc3f48883">5c53435c31066f354f194a10a308b5afc3f48883</a></p></details> ## Vulnerabilities | CVE | Severity | <img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS | Dependency | Type | Fixed in (jszip version) | Remediation Available | | ------------- | ------------- | ----- | ----- | ----- | ------------- | --- | | [CVE-2022-48285](https://www.mend.io/vulnerability-database/CVE-2022-48285) | <img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png?' width=19 height=20> High | 7.3 | jszip-3.7.1.js | Direct | jszip - 3.8.0 | &#10060; | ## Details <details> <summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png?' 
width=19 height=20> CVE-2022-48285</summary> ### Vulnerable Library - <b>jszip-3.7.1.js</b></p> <p>Create, read and edit .zip files with Javascript http://stuartk.com/jszip</p> <p>Library home page: <a href="https://cdnjs.cloudflare.com/ajax/libs/jszip/3.7.1/jszip.js">https://cdnjs.cloudflare.com/ajax/libs/jszip/3.7.1/jszip.js</a></p> <p>Path to vulnerable library: /target/apidocs/jquery/jszip/dist/jszip.js</p> <p> Dependency Hierarchy: - :x: **jszip-3.7.1.js** (Vulnerable Library) <p>Found in HEAD commit: <a href="https://github.com/nagyesta/abort-mission-maven-plugin/commit/5c53435c31066f354f194a10a308b5afc3f48883">5c53435c31066f354f194a10a308b5afc3f48883</a></p> <p>Found in base branch: <b>main</b></p> </p> <p></p> ### Vulnerability Details <p> loadAsync in JSZip before 3.8.0 allows Directory Traversal via a crafted ZIP archive. Mend Note: Converted from WS-2023-0004, on 2023-02-01. <p>Publish Date: 2023-01-29 <p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2022-48285>CVE-2022-48285</a></p> </p> <p></p> ### CVSS 3 Score Details (<b>7.3</b>) <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: Low - Privileges Required: None - User Interaction: None - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: Low - Integrity Impact: Low - Availability Impact: Low </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. </p> <p></p> ### Suggested Fix <p> <p>Type: Upgrade version</p> <p>Release Date: 2023-01-29</p> <p>Fix Resolution: jszip - 3.8.0</p> </p> <p></p> Step up your Open Source Security Game with Mend [here](https://www.whitesourcesoftware.com/full_solution_bolt_github) </details>
True
jszip-3.7.1.js: 1 vulnerabilities (highest severity is: 7.3) - <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>jszip-3.7.1.js</b></p></summary> <p>Create, read and edit .zip files with Javascript http://stuartk.com/jszip</p> <p>Library home page: <a href="https://cdnjs.cloudflare.com/ajax/libs/jszip/3.7.1/jszip.js">https://cdnjs.cloudflare.com/ajax/libs/jszip/3.7.1/jszip.js</a></p> <p>Path to vulnerable library: /target/apidocs/jquery/jszip/dist/jszip.js</p> <p> <p>Found in HEAD commit: <a href="https://github.com/nagyesta/abort-mission-maven-plugin/commit/5c53435c31066f354f194a10a308b5afc3f48883">5c53435c31066f354f194a10a308b5afc3f48883</a></p></details> ## Vulnerabilities | CVE | Severity | <img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS | Dependency | Type | Fixed in (jszip version) | Remediation Available | | ------------- | ------------- | ----- | ----- | ----- | ------------- | --- | | [CVE-2022-48285](https://www.mend.io/vulnerability-database/CVE-2022-48285) | <img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png?' width=19 height=20> High | 7.3 | jszip-3.7.1.js | Direct | jszip - 3.8.0 | &#10060; | ## Details <details> <summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png?' 
width=19 height=20> CVE-2022-48285</summary> ### Vulnerable Library - <b>jszip-3.7.1.js</b></p> <p>Create, read and edit .zip files with Javascript http://stuartk.com/jszip</p> <p>Library home page: <a href="https://cdnjs.cloudflare.com/ajax/libs/jszip/3.7.1/jszip.js">https://cdnjs.cloudflare.com/ajax/libs/jszip/3.7.1/jszip.js</a></p> <p>Path to vulnerable library: /target/apidocs/jquery/jszip/dist/jszip.js</p> <p> Dependency Hierarchy: - :x: **jszip-3.7.1.js** (Vulnerable Library) <p>Found in HEAD commit: <a href="https://github.com/nagyesta/abort-mission-maven-plugin/commit/5c53435c31066f354f194a10a308b5afc3f48883">5c53435c31066f354f194a10a308b5afc3f48883</a></p> <p>Found in base branch: <b>main</b></p> </p> <p></p> ### Vulnerability Details <p> loadAsync in JSZip before 3.8.0 allows Directory Traversal via a crafted ZIP archive. Mend Note: Converted from WS-2023-0004, on 2023-02-01. <p>Publish Date: 2023-01-29 <p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2022-48285>CVE-2022-48285</a></p> </p> <p></p> ### CVSS 3 Score Details (<b>7.3</b>) <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: Low - Privileges Required: None - User Interaction: None - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: Low - Integrity Impact: Low - Availability Impact: Low </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. </p> <p></p> ### Suggested Fix <p> <p>Type: Upgrade version</p> <p>Release Date: 2023-01-29</p> <p>Fix Resolution: jszip - 3.8.0</p> </p> <p></p> Step up your Open Source Security Game with Mend [here](https://www.whitesourcesoftware.com/full_solution_bolt_github) </details>
non_process
jszip js vulnerabilities highest severity is vulnerable library jszip js create read and edit zip files with javascript library home page a href path to vulnerable library target apidocs jquery jszip dist jszip js found in head commit a href vulnerabilities cve severity cvss dependency type fixed in jszip version remediation available high jszip js direct jszip details cve vulnerable library jszip js create read and edit zip files with javascript library home page a href path to vulnerable library target apidocs jquery jszip dist jszip js dependency hierarchy x jszip js vulnerable library found in head commit a href found in base branch main vulnerability details loadasync in jszip before allows directory traversal via a crafted zip archive mend note converted from ws on publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction none scope unchanged impact metrics confidentiality impact low integrity impact low availability impact low for more information on scores click a href suggested fix type upgrade version release date fix resolution jszip step up your open source security game with mend
0
69,818
13,347,081,462
IssuesEvent
2020-08-29 11:37:34
codeisscience/code-is-science
https://api.github.com/repos/codeisscience/code-is-science
closed
journal page - make it include the header include file [interesting mix of R and hugo templating]
code-task help wanted
So, the site is built largely using [hugo](https://gohugo.io/) with the exception of the journals page, which is rendered using [knitr](https://yihui.name/knitr/) R-related wizardry. @alokpant wisely suggested that the journals page should use the same header, which is entirely sane. Offhand I'm not sure how to mate the hugo template with the R template, or even if it's possible, but it would be nice if we could do so. Relevant files: - R journal page template: https://github.com/yochannah/code-is-science/blob/master/R/template.html - R script that does the rendering: https://github.com/yochannah/code-is-science/edit/master/R/journal_table.Rmd - header file for hugo: https://github.com/yochannah/code-is-science/blob/master/themes/codeisscience/layouts/partials/header.html And you can check out the [wiki](https://github.com/yochannah/code-is-science/wiki) for instructions on how to build the site.
1.0
journal page - make it include the header include file [interesting mix of R and hugo templating] - So, the site is built largely using [hugo](https://gohugo.io/) with the exception of the journals page, which is rendered using [knitr](https://yihui.name/knitr/) R-related wizardry. @alokpant wisely suggested that the journals page should use the same header, which is entirely sane. Offhand I'm not sure how to mate the hugo template with the R template, or even if it's possible, but it would be nice if we could do so. Relevant files: - R journal page template: https://github.com/yochannah/code-is-science/blob/master/R/template.html - R script that does the rendering: https://github.com/yochannah/code-is-science/edit/master/R/journal_table.Rmd - header file for hugo: https://github.com/yochannah/code-is-science/blob/master/themes/codeisscience/layouts/partials/header.html And you can check out the [wiki](https://github.com/yochannah/code-is-science/wiki) for instructions on how to build the site.
non_process
journal page make it include the header include file so the site is built largely using with the exception of the journals page which is rendered using r related wizardry alokpant wisely suggested that the journals page should use the same header which is entirely sane offhand i m not sure how to mate the hugo template with the r template or even if it s possible but it would be nice if we could do so relevant files r journal page template r script that does the rendering header file for hugo and you can check out the for instructions on how to build the site
0
200,840
22,916,001,210
IssuesEvent
2022-07-17 01:05:54
nanopathi/frameworks_av_AOSP10_r33_CVE-2020-0241
https://api.github.com/repos/nanopathi/frameworks_av_AOSP10_r33_CVE-2020-0241
closed
CVE-2020-0160 (High) detected in avandroid-10.0.0_r44 - autoclosed
security vulnerability
## CVE-2020-0160 - High Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>avandroid-10.0.0_r44</b></p></summary> <p> <p>Library home page: <a href=https://android.googlesource.com/platform/frameworks/av>https://android.googlesource.com/platform/frameworks/av</a></p> <p>Found in HEAD commit: <a href="https://github.com/nanopathi/frameworks_av_AOSP10_r33_CVE-2020-0241/commit/8b5c964757bfe5b3769d24e8697b07fa2770d979">8b5c964757bfe5b3769d24e8697b07fa2770d979</a></p> <p>Found in base branch: <b>master</b></p></p> </details> </p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Source Files (1)</summary> <p></p> <p> <img src='https://s3.amazonaws.com/wss-public/bitbucketImages/xRedImage.png' width=19 height=20> <b>/media/extractors/mp4/SampleTable.cpp</b> </p> </details> <p></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary> <p> In setSyncSampleParams of SampleTable.cpp, there is possible resource exhaustion due to a missing bounds check. This could lead to remote denial of service with no additional execution privileges needed. 
User interaction is needed for exploitation.Product: AndroidVersions: Android-10Android ID: A-124771364 <p>Publish Date: 2020-06-11 <p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-0160>CVE-2020-0160</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>8.8</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: Low - Privileges Required: None - User Interaction: Required - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: High - Integrity Impact: High - Availability Impact: High </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://android.googlesource.com/platform/frameworks/av/+/refs/tags/android-10.0.0_r37">https://android.googlesource.com/platform/frameworks/av/+/refs/tags/android-10.0.0_r37</a></p> <p>Release Date: 2020-06-11</p> <p>Fix Resolution: android-10.0.0_r37</p> </p> </details> <p></p> *** Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
True
CVE-2020-0160 (High) detected in avandroid-10.0.0_r44 - autoclosed - ## CVE-2020-0160 - High Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>avandroid-10.0.0_r44</b></p></summary> <p> <p>Library home page: <a href=https://android.googlesource.com/platform/frameworks/av>https://android.googlesource.com/platform/frameworks/av</a></p> <p>Found in HEAD commit: <a href="https://github.com/nanopathi/frameworks_av_AOSP10_r33_CVE-2020-0241/commit/8b5c964757bfe5b3769d24e8697b07fa2770d979">8b5c964757bfe5b3769d24e8697b07fa2770d979</a></p> <p>Found in base branch: <b>master</b></p></p> </details> </p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Source Files (1)</summary> <p></p> <p> <img src='https://s3.amazonaws.com/wss-public/bitbucketImages/xRedImage.png' width=19 height=20> <b>/media/extractors/mp4/SampleTable.cpp</b> </p> </details> <p></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary> <p> In setSyncSampleParams of SampleTable.cpp, there is possible resource exhaustion due to a missing bounds check. This could lead to remote denial of service with no additional execution privileges needed. 
User interaction is needed for exploitation.Product: AndroidVersions: Android-10Android ID: A-124771364 <p>Publish Date: 2020-06-11 <p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-0160>CVE-2020-0160</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>8.8</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: Low - Privileges Required: None - User Interaction: Required - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: High - Integrity Impact: High - Availability Impact: High </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://android.googlesource.com/platform/frameworks/av/+/refs/tags/android-10.0.0_r37">https://android.googlesource.com/platform/frameworks/av/+/refs/tags/android-10.0.0_r37</a></p> <p>Release Date: 2020-06-11</p> <p>Fix Resolution: android-10.0.0_r37</p> </p> </details> <p></p> *** Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
non_process
cve high detected in avandroid autoclosed cve high severity vulnerability vulnerable library avandroid library home page a href found in head commit a href found in base branch master vulnerable source files media extractors sampletable cpp vulnerability details in setsyncsampleparams of sampletable cpp there is possible resource exhaustion due to a missing bounds check this could lead to remote denial of service with no additional execution privileges needed user interaction is needed for exploitation product androidversions android id a publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction required scope unchanged impact metrics confidentiality impact high integrity impact high availability impact high for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution android step up your open source security game with whitesource
0
725,756
24,974,793,086
IssuesEvent
2022-11-02 06:35:51
matrixorigin/matrixone
https://api.github.com/repos/matrixorigin/matrixone
opened
[Enhancement]: error handling on reading and writing io enties
kind/enhancement priority/p0.5
### Is there an existing issue for enhancement? - [X] I have checked the existing issues. ### What would you like to be added ? ```Markdown error handling on reading and writing io entries ``` ### Why is this needed ? 1. Timeout 2. Duplicated 3. Not found 4. Unexpected ### Additional information _No response_
1.0
[Enhancement]: error handling on reading and writing io enties - ### Is there an existing issue for enhancement? - [X] I have checked the existing issues. ### What would you like to be added ? ```Markdown error handling on reading and writing io entries ``` ### Why is this needed ? 1. Timeout 2. Duplicated 3. Not found 4. Unexpected ### Additional information _No response_
non_process
error handling on reading and writing io enties is there an existing issue for enhancement i have checked the existing issues what would you like to be added markdown error handling on reading and writing io entries why is this needed timeout duplicated not found unexpected additional information no response
0
239,352
19,848,909,853
IssuesEvent
2022-01-21 10:03:13
zephyrproject-rtos/test_results
https://api.github.com/repos/zephyrproject-rtos/test_results
closed
tests-ci : can: isotp: conformance test failed
bug area: Tests
**Describe the bug** conformance test is failed on v2.7.99-3293-g93b0ea978293 on mimxrt1060_evk see logs for details **To Reproduce** 1. ``` scripts/twister --device-testing --device-serial /dev/ttyACM0 -p mimxrt1060_evk --sub-test can.isotp ``` 2. See error **Expected behavior** test pass **Impact** **Logs and console output** ``` *** Booting Zephyr OS build v2.7.99-3293-g93b0ea978293 *** Running test suite isotp_conformance =================================================================== START - test_send_sf PASS - test_send_sf in 0.202 seconds =================================================================== START - test_receive_sf PASS - test_receive_sf in 0.3 seconds =================================================================== START - test_send_sf_ext W: Unhandled error/status (status 0x00000001, result = 0x00000001 Assertion failed at WEST_TOPDIR/zephyr/tests/subsys/canbus/isotp/conformance/src/main.c:261: check_frame_series: (ret not equal to 0) Timeout waiting for msg nr 0. 
ret: -11 FAIL - test_send_sf_ext in 0.522 seconds =================================================================== START - test_receive_sf_ext PASS - test_receive_sf_ext in 0.3 seconds =================================================================== START - test_send_sf_fixed PASS - test_send_sf_fixed in 0.202 seconds =================================================================== START - test_receive_sf_fixed PASS - test_receive_sf_fixed in 0.205 seconds =================================================================== START - test_send_data PASS - test_send_data in 0.437 seconds =================================================================== START - test_send_data_blocks PASS - test_send_data_blocks in 0.990 seconds =================================================================== START - test_receive_data PASS - test_receive_data in 0.288 seconds =================================================================== START - test_receive_data_blocks PASS - test_receive_data_blocks in 1.142 seconds =================================================================== START - test_send_timeouts E: Reception of next FC has timed out E: Reception of next FC has timed out E: Reception of next FC has timed out PASS - test_send_timeouts in 3.812 seconds =================================================================== START - test_receive_timeouts E: Timeout while waiting for CF PASS - test_receive_timeouts in 1.5 seconds =================================================================== START - test_stmin PASS - test_stmin in 0.262 seconds =================================================================== START - test_receiver_fc_errors E: Sequence number missmatch I: Got unexpected frame. Ignore I: Got unexpected frame. Ignore I: Got unexpected frame. Ignore I: Got unexpected frame. Ignore I: Got unexpected frame. Ignore I: Got unexpected frame. 
Ignore PASS - test_receiver_fc_errors in 0.429 seconds =================================================================== START - test_sender_fc_errors E: Pkt length is 5118 but buffer has only 768 bytes E: Got overflow FC frame E: Got overflow FC frame I: Got to many wait frames PASS - test_sender_fc_errors in 0.625 seconds =================================================================== Test suite isotp_conformance failed. =================================================================== PROJECT EXECUTION FAILED ``` **Environment (please complete the following information):** - OS: (e.g. Linux ) - Toolchain (e.g Zephyr SDK) - Commit SHA or Version used: v2.7.99-3293-g93b0ea978293
1.0
tests-ci : can: isotp: conformance test failed - **Describe the bug** conformance test is failed on v2.7.99-3293-g93b0ea978293 on mimxrt1060_evk see logs for details **To Reproduce** 1. ``` scripts/twister --device-testing --device-serial /dev/ttyACM0 -p mimxrt1060_evk --sub-test can.isotp ``` 2. See error **Expected behavior** test pass **Impact** **Logs and console output** ``` *** Booting Zephyr OS build v2.7.99-3293-g93b0ea978293 *** Running test suite isotp_conformance =================================================================== START - test_send_sf PASS - test_send_sf in 0.202 seconds =================================================================== START - test_receive_sf PASS - test_receive_sf in 0.3 seconds =================================================================== START - test_send_sf_ext W: Unhandled error/status (status 0x00000001, result = 0x00000001 Assertion failed at WEST_TOPDIR/zephyr/tests/subsys/canbus/isotp/conformance/src/main.c:261: check_frame_series: (ret not equal to 0) Timeout waiting for msg nr 0. 
ret: -11 FAIL - test_send_sf_ext in 0.522 seconds =================================================================== START - test_receive_sf_ext PASS - test_receive_sf_ext in 0.3 seconds =================================================================== START - test_send_sf_fixed PASS - test_send_sf_fixed in 0.202 seconds =================================================================== START - test_receive_sf_fixed PASS - test_receive_sf_fixed in 0.205 seconds =================================================================== START - test_send_data PASS - test_send_data in 0.437 seconds =================================================================== START - test_send_data_blocks PASS - test_send_data_blocks in 0.990 seconds =================================================================== START - test_receive_data PASS - test_receive_data in 0.288 seconds =================================================================== START - test_receive_data_blocks PASS - test_receive_data_blocks in 1.142 seconds =================================================================== START - test_send_timeouts E: Reception of next FC has timed out E: Reception of next FC has timed out E: Reception of next FC has timed out PASS - test_send_timeouts in 3.812 seconds =================================================================== START - test_receive_timeouts E: Timeout while waiting for CF PASS - test_receive_timeouts in 1.5 seconds =================================================================== START - test_stmin PASS - test_stmin in 0.262 seconds =================================================================== START - test_receiver_fc_errors E: Sequence number missmatch I: Got unexpected frame. Ignore I: Got unexpected frame. Ignore I: Got unexpected frame. Ignore I: Got unexpected frame. Ignore I: Got unexpected frame. Ignore I: Got unexpected frame. 
Ignore PASS - test_receiver_fc_errors in 0.429 seconds =================================================================== START - test_sender_fc_errors E: Pkt length is 5118 but buffer has only 768 bytes E: Got overflow FC frame E: Got overflow FC frame I: Got to many wait frames PASS - test_sender_fc_errors in 0.625 seconds =================================================================== Test suite isotp_conformance failed. =================================================================== PROJECT EXECUTION FAILED ``` **Environment (please complete the following information):** - OS: (e.g. Linux ) - Toolchain (e.g Zephyr SDK) - Commit SHA or Version used: v2.7.99-3293-g93b0ea978293
non_process
tests ci can isotp conformance test failed describe the bug conformance test is failed on on evk see logs for details to reproduce scripts twister device testing device serial dev p evk sub test can isotp see error expected behavior test pass impact logs and console output booting zephyr os build running test suite isotp conformance start test send sf pass test send sf in seconds start test receive sf pass test receive sf in seconds start test send sf ext w unhandled error status status result assertion failed at west topdir zephyr tests subsys canbus isotp conformance src main c check frame series ret not equal to timeout waiting for msg nr ret fail test send sf ext in seconds start test receive sf ext pass test receive sf ext in seconds start test send sf fixed pass test send sf fixed in seconds start test receive sf fixed pass test receive sf fixed in seconds start test send data pass test send data in seconds start test send data blocks pass test send data blocks in seconds start test receive data pass test receive data in seconds start test receive data blocks pass test receive data blocks in seconds start test send timeouts e reception of next fc has timed out e reception of next fc has timed out e reception of next fc has timed out pass test send timeouts in seconds start test receive timeouts e timeout while waiting for cf pass test receive timeouts in seconds start test stmin pass test stmin in seconds start test receiver fc errors e sequence number missmatch i got unexpected frame ignore i got unexpected frame ignore i got unexpected frame ignore i got unexpected frame ignore i got unexpected frame ignore i got unexpected frame ignore pass test receiver fc errors in seconds start test sender fc errors e pkt length is but buffer has only bytes e got overflow fc frame e got overflow fc frame i got to many wait frames pass test sender fc errors in seconds test suite isotp conformance failed project execution failed environment please complete the following 
information os e g linux toolchain e g zephyr sdk commit sha or version used
0
9,032
12,129,836,723
IssuesEvent
2020-04-22 23:42:22
GoogleCloudPlatform/python-docs-samples
https://api.github.com/repos/GoogleCloudPlatform/python-docs-samples
closed
remove gcp-devrel-py-tools from datastore/cloud-client/requirements-test.txt
priority: p2 remove-gcp-devrel-py-tools type: process
remove gcp-devrel-py-tools from datastore/cloud-client/requirements-test.txt
1.0
remove gcp-devrel-py-tools from datastore/cloud-client/requirements-test.txt - remove gcp-devrel-py-tools from datastore/cloud-client/requirements-test.txt
process
remove gcp devrel py tools from datastore cloud client requirements test txt remove gcp devrel py tools from datastore cloud client requirements test txt
1
10,138
13,044,162,437
IssuesEvent
2020-07-29 03:47:32
tikv/tikv
https://api.github.com/repos/tikv/tikv
closed
UCP: Migrate scalar function `JsonMergePreserveSig` from TiDB
challenge-program-2 component/coprocessor difficulty/easy sig/coprocessor
## Description Port the scalar function `JsonMergePreserveSig` from TiDB to coprocessor. ## Score * 50 ## Mentor(s) * @mapleFU ## Recommended Skills * Rust programming ## Learning Materials Already implemented expressions ported from TiDB - https://github.com/tikv/tikv/tree/master/components/tidb_query/src/rpn_expr) - https://github.com/tikv/tikv/tree/master/components/tidb_query/src/expr)
2.0
UCP: Migrate scalar function `JsonMergePreserveSig` from TiDB - ## Description Port the scalar function `JsonMergePreserveSig` from TiDB to coprocessor. ## Score * 50 ## Mentor(s) * @mapleFU ## Recommended Skills * Rust programming ## Learning Materials Already implemented expressions ported from TiDB - https://github.com/tikv/tikv/tree/master/components/tidb_query/src/rpn_expr) - https://github.com/tikv/tikv/tree/master/components/tidb_query/src/expr)
process
ucp migrate scalar function jsonmergepreservesig from tidb description port the scalar function jsonmergepreservesig from tidb to coprocessor score mentor s maplefu recommended skills rust programming learning materials already implemented expressions ported from tidb
1
260,827
19,686,807,026
IssuesEvent
2022-01-11 23:25:18
corretto/amazon-corretto-crypto-provider
https://api.github.com/repos/corretto/amazon-corretto-crypto-provider
closed
Document all system properties
Documentation
All system properties defined by ACCP need to be documented. This must include include the target audience for each property and how long we commit to supporting it.
1.0
Document all system properties - All system properties defined by ACCP need to be documented. This must include include the target audience for each property and how long we commit to supporting it.
non_process
document all system properties all system properties defined by accp need to be documented this must include include the target audience for each property and how long we commit to supporting it
0
337,887
30,269,726,165
IssuesEvent
2023-07-07 14:28:30
cockroachdb/cockroach
https://api.github.com/repos/cockroachdb/cockroach
closed
ccl/changefeedccl: TestChangefeedNemeses failed
C-test-failure O-robot A-cdc branch-master T-cdc
ccl/changefeedccl.TestChangefeedNemeses [failed](https://teamcity.cockroachdb.com/buildConfiguration/Cockroach_Ci_TestsAwsLinuxArm64_UnitTests/9120744?buildTab=log) with [artifacts](https://teamcity.cockroachdb.com/buildConfiguration/Cockroach_Ci_TestsAwsLinuxArm64_UnitTests/9120744?buildTab=artifacts#/) on master @ [8833248e0cb5d93d3ea30936e86702802b91dc7c](https://github.com/cockroachdb/cockroach/commits/8833248e0cb5d93d3ea30936e86702802b91dc7c): ``` === RUN TestChangefeedNemeses test_log_scope.go:161: test logs captured to: /artifacts/tmp/_tmp/d2b64858287e5594aed9da7514006ac9/logTestChangefeedNemeses896304131 test_log_scope.go:79: use -show-logs to present logs inline === CONT TestChangefeedNemeses nemeses_test.go:57: Found violation of CDC's guarantees: [{ERROR 1679101066345913000 116311 ccl/changefeedccl/event_processing.go 409 cdc ux violation: detected timestamp 1679101064.636007454,0 that is less than or equal to the local frontier 1679101064.715575368,0. n1,job=848791680717258753 514 false DEV 0 0 0 1}] panic.go:522: -- test log scope end -- test logs left over in: /artifacts/tmp/_tmp/d2b64858287e5594aed9da7514006ac9/logTestChangefeedNemeses896304131 --- FAIL: TestChangefeedNemeses (8.25s) ``` <details><summary>Help</summary> <p> See also: [How To Investigate a Go Test Failure \(internal\)](https://cockroachlabs.atlassian.net/l/c/HgfXfJgM) </p> </details> /cc @cockroachdb/cdc <sub> [This test on roachdash](https://roachdash.crdb.dev/?filter=status:open%20t:.*TestChangefeedNemeses.*&sort=title+created&display=lastcommented+project) | [Improve this report!](https://github.com/cockroachdb/cockroach/tree/master/pkg/cmd/internal/issues) </sub> Jira issue: CRDB-25583
1.0
ccl/changefeedccl: TestChangefeedNemeses failed - ccl/changefeedccl.TestChangefeedNemeses [failed](https://teamcity.cockroachdb.com/buildConfiguration/Cockroach_Ci_TestsAwsLinuxArm64_UnitTests/9120744?buildTab=log) with [artifacts](https://teamcity.cockroachdb.com/buildConfiguration/Cockroach_Ci_TestsAwsLinuxArm64_UnitTests/9120744?buildTab=artifacts#/) on master @ [8833248e0cb5d93d3ea30936e86702802b91dc7c](https://github.com/cockroachdb/cockroach/commits/8833248e0cb5d93d3ea30936e86702802b91dc7c): ``` === RUN TestChangefeedNemeses test_log_scope.go:161: test logs captured to: /artifacts/tmp/_tmp/d2b64858287e5594aed9da7514006ac9/logTestChangefeedNemeses896304131 test_log_scope.go:79: use -show-logs to present logs inline === CONT TestChangefeedNemeses nemeses_test.go:57: Found violation of CDC's guarantees: [{ERROR 1679101066345913000 116311 ccl/changefeedccl/event_processing.go 409 cdc ux violation: detected timestamp 1679101064.636007454,0 that is less than or equal to the local frontier 1679101064.715575368,0. n1,job=848791680717258753 514 false DEV 0 0 0 1}] panic.go:522: -- test log scope end -- test logs left over in: /artifacts/tmp/_tmp/d2b64858287e5594aed9da7514006ac9/logTestChangefeedNemeses896304131 --- FAIL: TestChangefeedNemeses (8.25s) ``` <details><summary>Help</summary> <p> See also: [How To Investigate a Go Test Failure \(internal\)](https://cockroachlabs.atlassian.net/l/c/HgfXfJgM) </p> </details> /cc @cockroachdb/cdc <sub> [This test on roachdash](https://roachdash.crdb.dev/?filter=status:open%20t:.*TestChangefeedNemeses.*&sort=title+created&display=lastcommented+project) | [Improve this report!](https://github.com/cockroachdb/cockroach/tree/master/pkg/cmd/internal/issues) </sub> Jira issue: CRDB-25583
non_process
ccl changefeedccl testchangefeednemeses failed ccl changefeedccl testchangefeednemeses with on master run testchangefeednemeses test log scope go test logs captured to artifacts tmp tmp test log scope go use show logs to present logs inline cont testchangefeednemeses nemeses test go found violation of cdc s guarantees panic go test log scope end test logs left over in artifacts tmp tmp fail testchangefeednemeses help see also cc cockroachdb cdc jira issue crdb
0
69,334
8,393,888,115
IssuesEvent
2018-10-09 22:00:08
LetsEatCo/Restaurants
https://api.github.com/repos/LetsEatCo/Restaurants
opened
✨ Add Kiosk
priority: high 🔥 scope: back-office scope: design scope: functional type: feature ✨
**Is your feature request related to a problem ? Please describe.** A logged in store should be able to add a kiosk **Describe the solution you'd like** CS **Describe alternatives you've considered** None **Additional context** None
1.0
✨ Add Kiosk - **Is your feature request related to a problem ? Please describe.** A logged in store should be able to add a kiosk **Describe the solution you'd like** CS **Describe alternatives you've considered** None **Additional context** None
non_process
✨ add kiosk is your feature request related to a problem please describe a logged in store should be able to add a kiosk describe the solution you d like cs describe alternatives you ve considered none additional context none
0
190,625
22,106,875,277
IssuesEvent
2022-06-01 17:40:52
BCDevOps/developer-experience
https://api.github.com/repos/BCDevOps/developer-experience
opened
KeyCloak CSS STRA Review and Updates
security
- [x] Review STRA completed by Bruce and - [x] fix-up/update - [ ] Complete SoAR - [ ] Review w/ KeyCloak team
True
KeyCloak CSS STRA Review and Updates - - [x] Review STRA completed by Bruce and - [x] fix-up/update - [ ] Complete SoAR - [ ] Review w/ KeyCloak team
non_process
keycloak css stra review and updates review stra completed by bruce and fix up update complete soar review w keycloak team
0
283,787
24,562,799,560
IssuesEvent
2022-10-12 22:11:07
yugabyte/yugabyte-db
https://api.github.com/repos/yugabyte/yugabyte-db
opened
[YSQL] flaky test: undefined behavior in PgLibPqTest.DBCatalogVersion
area/ysql kind/failing-test priority/high status/awaiting-triage
### Description https://detective-gcp.dev.yugabyte.com/stability/test?branch=master&build_type=all&class=PgLibPqTest&fail_tag=undefined_behavior&name=DBCatalogVersion&platform=linux ``` [ts-1] ../../../../../../../src/postgres/src/backend/catalog/yb_catalog/yb_catalog_version.c:472:5: runtime error: null pointer passed as argument 2, which is declared to never be null [ts-1] /usr/include/stdlib.h:819:6: note: nonnull attribute specified here [ts-1] #0 0xa61a7a in YbGetTserverCatalogVersion ${YB_SRC_ROOT}/src/postgres/src/backend/catalog/yb_catalog/../../../../../../../src/postgres/src/backend/catalog/yb_catalog/yb_catalog_version.c:470:36 [ts-1] #1 0x1795030 in YbResolveDBTserverCatalogVersion ${YB_SRC_ROOT}/src/postgres/src/backend/utils/init/../../../../../../../src/postgres/src/backend/utils/init/postinit.c:1170:33 [ts-1] #2 0x1791a34 in InitPostgresImpl ${YB_SRC_ROOT}/src/postgres/src/backend/utils/init/../../../../../../../src/postgres/src/backend/utils/init/postinit.c [ts-1] #3 0x17910c5 in InitPostgres ${YB_SRC_ROOT}/src/postgres/src/backend/utils/init/../../../../../../../src/postgres/src/backend/utils/init/postinit.c:1201:3 [ts-1] #4 0x1351e36 in PostgresMain ${YB_SRC_ROOT}/src/postgres/src/backend/tcop/../../../../../../src/postgres/src/backend/tcop/postgres.c:4848:2 [ts-1] #5 0x1197860 in BackendRun ${YB_SRC_ROOT}/src/postgres/src/backend/postmaster/../../../../../../src/postgres/src/backend/postmaster/postmaster.c:4576:2 [ts-1] #6 0x1195a09 in BackendStartup ${YB_SRC_ROOT}/src/postgres/src/backend/postmaster/../../../../../../src/postgres/src/backend/postmaster/postmaster.c:4214:3 [ts-1] #7 0x1195a09 in ServerLoop ${YB_SRC_ROOT}/src/postgres/src/backend/postmaster/../../../../../../src/postgres/src/backend/postmaster/postmaster.c:1768:7 [ts-1] #8 0x118dd7e in PostmasterMain ${YB_SRC_ROOT}/src/postgres/src/backend/postmaster/../../../../../../src/postgres/src/backend/postmaster/postmaster.c:1424:11 [ts-1] #9 0xf604c0 in PostgresServerProcessMain 
${YB_SRC_ROOT}/src/postgres/src/backend/main/../../../../../../src/postgres/src/backend/main/main.c:234:3 [ts-1] #10 0xf60901 in main (${BUILD_ROOT}/postgres/bin/postgres+0xf60901) [ts-1] #11 0x7f0d67824cf2 in __libc_start_main (/lib64/libc.so.6+0x3acf2) [ts-1] #12 0x77d6cd in _start (${BUILD_ROOT}/postgres/bin/postgres+0x77d6cd) ```
1.0
[YSQL] flaky test: undefined behavior in PgLibPqTest.DBCatalogVersion - ### Description https://detective-gcp.dev.yugabyte.com/stability/test?branch=master&build_type=all&class=PgLibPqTest&fail_tag=undefined_behavior&name=DBCatalogVersion&platform=linux ``` [ts-1] ../../../../../../../src/postgres/src/backend/catalog/yb_catalog/yb_catalog_version.c:472:5: runtime error: null pointer passed as argument 2, which is declared to never be null [ts-1] /usr/include/stdlib.h:819:6: note: nonnull attribute specified here [ts-1] #0 0xa61a7a in YbGetTserverCatalogVersion ${YB_SRC_ROOT}/src/postgres/src/backend/catalog/yb_catalog/../../../../../../../src/postgres/src/backend/catalog/yb_catalog/yb_catalog_version.c:470:36 [ts-1] #1 0x1795030 in YbResolveDBTserverCatalogVersion ${YB_SRC_ROOT}/src/postgres/src/backend/utils/init/../../../../../../../src/postgres/src/backend/utils/init/postinit.c:1170:33 [ts-1] #2 0x1791a34 in InitPostgresImpl ${YB_SRC_ROOT}/src/postgres/src/backend/utils/init/../../../../../../../src/postgres/src/backend/utils/init/postinit.c [ts-1] #3 0x17910c5 in InitPostgres ${YB_SRC_ROOT}/src/postgres/src/backend/utils/init/../../../../../../../src/postgres/src/backend/utils/init/postinit.c:1201:3 [ts-1] #4 0x1351e36 in PostgresMain ${YB_SRC_ROOT}/src/postgres/src/backend/tcop/../../../../../../src/postgres/src/backend/tcop/postgres.c:4848:2 [ts-1] #5 0x1197860 in BackendRun ${YB_SRC_ROOT}/src/postgres/src/backend/postmaster/../../../../../../src/postgres/src/backend/postmaster/postmaster.c:4576:2 [ts-1] #6 0x1195a09 in BackendStartup ${YB_SRC_ROOT}/src/postgres/src/backend/postmaster/../../../../../../src/postgres/src/backend/postmaster/postmaster.c:4214:3 [ts-1] #7 0x1195a09 in ServerLoop ${YB_SRC_ROOT}/src/postgres/src/backend/postmaster/../../../../../../src/postgres/src/backend/postmaster/postmaster.c:1768:7 [ts-1] #8 0x118dd7e in PostmasterMain 
${YB_SRC_ROOT}/src/postgres/src/backend/postmaster/../../../../../../src/postgres/src/backend/postmaster/postmaster.c:1424:11 [ts-1] #9 0xf604c0 in PostgresServerProcessMain ${YB_SRC_ROOT}/src/postgres/src/backend/main/../../../../../../src/postgres/src/backend/main/main.c:234:3 [ts-1] #10 0xf60901 in main (${BUILD_ROOT}/postgres/bin/postgres+0xf60901) [ts-1] #11 0x7f0d67824cf2 in __libc_start_main (/lib64/libc.so.6+0x3acf2) [ts-1] #12 0x77d6cd in _start (${BUILD_ROOT}/postgres/bin/postgres+0x77d6cd) ```
non_process
flaky test undefined behavior in pglibpqtest dbcatalogversion description src postgres src backend catalog yb catalog yb catalog version c runtime error null pointer passed as argument which is declared to never be null usr include stdlib h note nonnull attribute specified here in ybgettservercatalogversion yb src root src postgres src backend catalog yb catalog src postgres src backend catalog yb catalog yb catalog version c in ybresolvedbtservercatalogversion yb src root src postgres src backend utils init src postgres src backend utils init postinit c in initpostgresimpl yb src root src postgres src backend utils init src postgres src backend utils init postinit c in initpostgres yb src root src postgres src backend utils init src postgres src backend utils init postinit c in postgresmain yb src root src postgres src backend tcop src postgres src backend tcop postgres c in backendrun yb src root src postgres src backend postmaster src postgres src backend postmaster postmaster c in backendstartup yb src root src postgres src backend postmaster src postgres src backend postmaster postmaster c in serverloop yb src root src postgres src backend postmaster src postgres src backend postmaster postmaster c in postmastermain yb src root src postgres src backend postmaster src postgres src backend postmaster postmaster c in postgresserverprocessmain yb src root src postgres src backend main src postgres src backend main main c in main build root postgres bin postgres in libc start main libc so in start build root postgres bin postgres
0
22,564
7,190,669,633
IssuesEvent
2018-02-02 18:05:43
Great-Hill-Corporation/quickBlocks
https://api.github.com/repos/Great-Hill-Corporation/quickBlocks
closed
Useful information for geth support.
build-testing status-inprocess type-enhancement
h t t p s : / / g i t h u . b.com/ethereum/go-ethereum/wiki/Management-APIs
1.0
Useful information for geth support. - h t t p s : / / g i t h u . b.com/ethereum/go-ethereum/wiki/Management-APIs
non_process
useful information for geth support h t t p s g i t h u b com ethereum go ethereum wiki management apis
0
38,230
2,842,441,988
IssuesEvent
2015-05-28 09:21:03
hydrosolutions/imomo-hydromet-client
https://api.github.com/repos/hydrosolutions/imomo-hydromet-client
closed
Make it clear what a station being ready for processing means
enhancement Priority low question
When using a station that has not all the required initial data, the user gets an error message but it is not clear what is needed from the user.
1.0
Make it clear what a station being ready for processing means - When using a station that has not all the required initial data, the user gets an error message but it is not clear what is needed from the user.
non_process
make it clear what a station being ready for processing means when using a station that has not all the required initial data the user gets an error message but it is not clear what is needed from the user
0
365,428
25,535,396,607
IssuesEvent
2022-11-29 11:34:51
Quozul/template
https://api.github.com/repos/Quozul/template
closed
Add security policy
documentation
Using the following guide, please add a security policy to the repository. https://docs.github.com/en/code-security/getting-started/adding-a-security-policy-to-your-repository
1.0
Add security policy - Using the following guide, please add a security policy to the repository. https://docs.github.com/en/code-security/getting-started/adding-a-security-policy-to-your-repository
non_process
add security policy using the following guide please add a security policy to the repository
0
16,847
11,424,215,624
IssuesEvent
2020-02-03 17:17:19
openstreetmap/iD
https://api.github.com/repos/openstreetmap/iD
closed
Enable Issues section for multiple selected features
usability validation
Continuing our multiselection editing work in #7276, we should also try to enable the Issues section. We'll have to decide if it makes more sense to show every issue with every selected feature (which could be loads), or just those issues shared between features in the selection. There's also the problem of quick fixes that depend on the selected feature, such as "add a bridge".
True
Enable Issues section for multiple selected features - Continuing our multiselection editing work in #7276, we should also try to enable the Issues section. We'll have to decide if it makes more sense to show every issue with every selected feature (which could be loads), or just those issues shared between features in the selection. There's also the problem of quick fixes that depend on the selected feature, such as "add a bridge".
non_process
enable issues section for multiple selected features continuing our multiselection editing work in we should also try to enable the issues section we ll have to decide if it makes more sense to show every issue with every selected feature which could be loads or just those issues shared between features in the selection there s also the problem of quick fixes that depend on the selected feature such as add a bridge
0
7,918
11,097,438,876
IssuesEvent
2019-12-16 13:22:30
prisma/photonjs
https://api.github.com/repos/prisma/photonjs
closed
Document error consumption
kind/docs process/candidate
``` try { user = await ctx.photon.users.create({ data: { name, email, password: hashedPassword, }, }) console.log("====", user) debugger } catch (error){ throw new Error('Error during registration. Please try later') } ``` The email field is unique. When I'm trying to create a user with an existing email it throws the error. Is there a way to identify specific error types that, email was duplicated? Or I have to check in advance if a user with these emails exists or not?
1.0
Document error consumption - ``` try { user = await ctx.photon.users.create({ data: { name, email, password: hashedPassword, }, }) console.log("====", user) debugger } catch (error){ throw new Error('Error during registration. Please try later') } ``` The email field is unique. When I'm trying to create a user with an existing email it throws the error. Is there a way to identify specific error types that, email was duplicated? Or I have to check in advance if a user with these emails exists or not?
process
document error consumption try user await ctx photon users create data name email password hashedpassword console log user debugger catch error throw new error error during registration please try later the email field is unique when i m trying to create a user with an existing email it throws the error is there a way to identify specific error types that email was duplicated or i have to check in advance if a user with these emails exists or not
1
4,750
7,610,710,972
IssuesEvent
2018-05-01 09:56:26
TeamPotry/tutorial_text
https://api.github.com/repos/TeamPotry/tutorial_text
opened
튜토리얼 텍스트의 구조 변경
enhancement in_process
- [ ] **기존의 하나의 컨픽만 사용하던 로직을 변경** `configs/tutorial_text` 폴더 내의 모든 텍스트 파일을 읽고 파일 이름을 광역 `stringmap`에 저장 기존에 테스트 메뉴에 사용됬던 컨픽은 그냥 테스트용으로 존재하도록 변경 - [ ] **`TTextKeyValue (KeyValues 상속)` 설계 및 기존 `KeyValues` 관련 스톡 이전** - [ ] **쿠키 구조 변경** 현재: `messageId` 기준 구조 변경: `filename_messageId` 기준
1.0
튜토리얼 텍스트의 구조 변경 - - [ ] **기존의 하나의 컨픽만 사용하던 로직을 변경** `configs/tutorial_text` 폴더 내의 모든 텍스트 파일을 읽고 파일 이름을 광역 `stringmap`에 저장 기존에 테스트 메뉴에 사용됬던 컨픽은 그냥 테스트용으로 존재하도록 변경 - [ ] **`TTextKeyValue (KeyValues 상속)` 설계 및 기존 `KeyValues` 관련 스톡 이전** - [ ] **쿠키 구조 변경** 현재: `messageId` 기준 구조 변경: `filename_messageId` 기준
process
튜토리얼 텍스트의 구조 변경 기존의 하나의 컨픽만 사용하던 로직을 변경 configs tutorial text 폴더 내의 모든 텍스트 파일을 읽고 파일 이름을 광역 stringmap 에 저장 기존에 테스트 메뉴에 사용됬던 컨픽은 그냥 테스트용으로 존재하도록 변경 ttextkeyvalue keyvalues 상속 설계 및 기존 keyvalues 관련 스톡 이전 쿠키 구조 변경 현재 messageid 기준 구조 변경 filename messageid 기준
1
105,490
13,188,418,826
IssuesEvent
2020-08-13 06:24:20
tektoncd/triggers
https://api.github.com/repos/tektoncd/triggers
closed
Should Trigger be a separate Custom Resource
kind/design lifecycle/stale maybe-next-milestone
# Expected Behavior Trigger inside Eventlistener feels like an independent cr. It has template ref, binding, param, and name. Latter of which seems to be better off with having an ObjectMeta based on the usecase for which it is added. If listener have just ref to trigger and trigger is a separate cr, then various operations on triggers can be easily done using verbs. # Actual Behavior At present, if we have to change trigger, then we have to loop through triggers and then patch event listener. /cc @dibyom @ncskier @vtereso @vincent-pli @iancoffey @bobcatfish
1.0
Should Trigger be a separate Custom Resource - # Expected Behavior Trigger inside Eventlistener feels like an independent cr. It has template ref, binding, param, and name. Latter of which seems to be better off with having an ObjectMeta based on the usecase for which it is added. If listener have just ref to trigger and trigger is a separate cr, then various operations on triggers can be easily done using verbs. # Actual Behavior At present, if we have to change trigger, then we have to loop through triggers and then patch event listener. /cc @dibyom @ncskier @vtereso @vincent-pli @iancoffey @bobcatfish
non_process
should trigger be a separate custom resource expected behavior trigger inside eventlistener feels like an independent cr it has template ref binding param and name latter of which seems to be better off with having an objectmeta based on the usecase for which it is added if listener have just ref to trigger and trigger is a separate cr then various operations on triggers can be easily done using verbs actual behavior at present if we have to change trigger then we have to loop through triggers and then patch event listener cc dibyom ncskier vtereso vincent pli iancoffey bobcatfish
0
373,951
11,053,188,766
IssuesEvent
2019-12-10 10:51:07
eclipse/codewind
https://api.github.com/repos/eclipse/codewind
closed
Maximum call stack size exceeded in PFE logs on Codewind
area/portal kind/bug priority/stopship
I'm seeing the following in the PFE logs (latest images) when running on Che just after a docker build and push finishes for Generic docker projects: ``` [26/11/19 22:11:31 FileWatcher.js] [ERROR] RangeError: Maximum call stack size exceeded at Function.[Symbol.hasInstance] (<anonymous>) at Function.isBuffer (buffer.js:430:12) at hasBinary (/portal/node_modules/has-binary2/index.js:44:66) at hasBinary (/portal/node_modules/has-binary2/index.js:58:59) at hasBinary (/portal/node_modules/has-binary2/index.js:58:59) at hasBinary (/portal/node_modules/has-binary2/index.js:58:59) at hasBinary (/portal/node_modules/has-binary2/index.js:58:59) at hasBinary (/portal/node_modules/has-binary2/index.js:58:59) at hasBinary (/portal/node_modules/has-binary2/index.js:58:59) at hasBinary (/portal/node_modules/has-binary2/index.js:58:59) ``` It also appears to cause PFE to hang, as it prevented the app and build status from updating. Despite the build finishing and the app deploying, it was stuck in `Building - Creating Image`.
1.0
Maximum call stack size exceeded in PFE logs on Codewind - I'm seeing the following in the PFE logs (latest images) when running on Che just after a docker build and push finishes for Generic docker projects: ``` [26/11/19 22:11:31 FileWatcher.js] [ERROR] RangeError: Maximum call stack size exceeded at Function.[Symbol.hasInstance] (<anonymous>) at Function.isBuffer (buffer.js:430:12) at hasBinary (/portal/node_modules/has-binary2/index.js:44:66) at hasBinary (/portal/node_modules/has-binary2/index.js:58:59) at hasBinary (/portal/node_modules/has-binary2/index.js:58:59) at hasBinary (/portal/node_modules/has-binary2/index.js:58:59) at hasBinary (/portal/node_modules/has-binary2/index.js:58:59) at hasBinary (/portal/node_modules/has-binary2/index.js:58:59) at hasBinary (/portal/node_modules/has-binary2/index.js:58:59) at hasBinary (/portal/node_modules/has-binary2/index.js:58:59) ``` It also appears to cause PFE to hang, as it prevented the app and build status from updating. Despite the build finishing and the app deploying, it was stuck in `Building - Creating Image`.
non_process
maximum call stack size exceeded in pfe logs on codewind i m seeing the following in the pfe logs latest images when running on che just after a docker build and push finishes for generic docker projects rangeerror maximum call stack size exceeded at function at function isbuffer buffer js at hasbinary portal node modules has index js at hasbinary portal node modules has index js at hasbinary portal node modules has index js at hasbinary portal node modules has index js at hasbinary portal node modules has index js at hasbinary portal node modules has index js at hasbinary portal node modules has index js at hasbinary portal node modules has index js it also appears to cause pfe to hang as it prevented the app and build status from updating despite the build finishing and the app deploying it was stuck in building creating image
0
4,219
7,179,151,151
IssuesEvent
2018-01-31 18:43:50
syndesisio/syndesis
https://api.github.com/repos/syndesisio/syndesis
closed
Code generator can't deal with plural type names
cat/bug cat/process module/ui prio/p2
|<img src="https://avatars2.githubusercontent.com/u/351660?v=4" valign="middle" width="22px"></img> @gashcrumb | [2017-07-25](https://github.com/syndesisio/syndesis-ui/issues/657) | bug, dev process, Priority - Low | |-|-|-| When running `yarn generate` against the current swagger that contains `FilterOptions` we get: ``` ERROR in C:/Users/gashcrumb/GitHub/syndesis-ui/src/app/model.ts (216,18): Duplicate identifier 'FilterOptions'. ERROR in C:/Users/gashcrumb/GitHub/syndesis-ui/src/app/model.ts (222,13): Duplicate identifier 'FilterOptions'. ``` As the class is plural but the generator also creates an array with the plural name. This can be fixed manually but it'd be nice if the generator could handle this case.
1.0
Code generator can't deal with plural type names - |<img src="https://avatars2.githubusercontent.com/u/351660?v=4" valign="middle" width="22px"></img> @gashcrumb | [2017-07-25](https://github.com/syndesisio/syndesis-ui/issues/657) | bug, dev process, Priority - Low | |-|-|-| When running `yarn generate` against the current swagger that contains `FilterOptions` we get: ``` ERROR in C:/Users/gashcrumb/GitHub/syndesis-ui/src/app/model.ts (216,18): Duplicate identifier 'FilterOptions'. ERROR in C:/Users/gashcrumb/GitHub/syndesis-ui/src/app/model.ts (222,13): Duplicate identifier 'FilterOptions'. ``` As the class is plural but the generator also creates an array with the plural name. This can be fixed manually but it'd be nice if the generator could handle this case.
process
code generator can t deal with plural type names gashcrumb bug dev process priority low when running yarn generate against the current swagger that contains filteroptions we get error in c users gashcrumb github syndesis ui src app model ts duplicate identifier filteroptions error in c users gashcrumb github syndesis ui src app model ts duplicate identifier filteroptions as the class is plural but the generator also creates an array with the plural name this can be fixed manually but it d be nice if the generator could handle this case
1
174,473
14,483,822,239
IssuesEvent
2020-12-10 15:36:26
krosmaker/krosmaker
https://api.github.com/repos/krosmaker/krosmaker
closed
Document PDF export
documentation
Add a printable PDF export guide. Include assets necessary to prepare the PDFs.
1.0
Document PDF export - Add a printable PDF export guide. Include assets necessary to prepare the PDFs.
non_process
document pdf export add a printable pdf export guide include assets necessary to prepare the pdfs
0
65,983
19,846,070,635
IssuesEvent
2022-01-21 06:31:41
vector-im/element-ios
https://api.github.com/repos/vector-im/element-ios
opened
Cannot share a video from iOS
T-Defect
### Steps to reproduce Steps to reproduce: 1. Have a video present in chat with someone 2. Click on the video and hold 3. Click more in the bottom left corner 4. Select share and select a user to send it to someone ### Outcome Expected the video to be shared to the other user. Actual result: The process gets stuck in the sending phase ![Screenshot 2022-01-21 at 11 47 54 AM](https://user-images.githubusercontent.com/29797823/150477900-04b1eda5-1c1a-49e3-8ff9-8feae950b518.png) ### Your phone model iPad 7th Generation ### Operating system version iOS 14.7.1 ### Application version Element 1.16.12 ### Homeserver _No response_ ### Will you send logs? No
1.0
Cannot share a video from iOS - ### Steps to reproduce Steps to reproduce: 1. Have a video present in chat with someone 2. Click on the video and hold 3. Click more in the bottom left corner 4. Select share and select a user to send it to someone ### Outcome Expected the video to be shared to the other user. Actual result: The process gets stuck in the sending phase ![Screenshot 2022-01-21 at 11 47 54 AM](https://user-images.githubusercontent.com/29797823/150477900-04b1eda5-1c1a-49e3-8ff9-8feae950b518.png) ### Your phone model iPad 7th Generation ### Operating system version iOS 14.7.1 ### Application version Element 1.16.12 ### Homeserver _No response_ ### Will you send logs? No
non_process
cannot share a video from ios steps to reproduce steps to reproduce have a video present in chat with someone click on the video and hold click more in the bottom left corner select share and select a user to send it to someone outcome expected the video to be shared to the other user actual result the process gets stuck in the sending phase your phone model ipad generation operating system version ios application version element homeserver no response will you send logs no
0
108,566
11,595,661,636
IssuesEvent
2020-02-24 17:25:04
force-h2020/force-bdss-plugin-nevergrad
https://api.github.com/repos/force-h2020/force-bdss-plugin-nevergrad
closed
Documentation to be updated
documentation
We would like to provide the following documentation for release 0.1.0: - [x] A README similar to other Force repos - [x] Documentation / guide imported from #6
1.0
Documentation to be updated - We would like to provide the following documentation for release 0.1.0: - [x] A README similar to other Force repos - [x] Documentation / guide imported from #6
non_process
documentation to be updated we would like to provide the following documentation for release a readme similar to other force repos documentation guide imported from
0
243,948
20,598,215,242
IssuesEvent
2022-03-05 21:13:51
disco-lang/disco
https://api.github.com/repos/disco-lang/disco
opened
Ability to put tests after a definition
C-Moderate Effort Z-Feature Request U-Testing S-Moderate
For documentation purposes, educational purposes, etc., it may often make sense to put tests *after* a definition. However, at the moment this is impossible since we attach all tests to the following definition. 1. One alternative would be to forget the whole idea of attaching specific tests to specific functions. Maybe instead of something like ``` Running tests... even: OK odd: OK ``` we could instead print something like ``` Running tests... 12 tests passed! 2. Another alternative would be to create some kind of syntax for specifying which definition a test should be attached to. I kind of lean towards the first alternative at this point. I can't think of any particularly important benefits of attaching tests to definitions. And I can think of several situations in which the choice of which definition to attach a test to is arbitrary (for example, when expressing the property that two functions are inverses).
1.0
Ability to put tests after a definition - For documentation purposes, educational purposes, etc., it may often make sense to put tests *after* a definition. However, at the moment this is impossible since we attach all tests to the following definition. 1. One alternative would be to forget the whole idea of attaching specific tests to specific functions. Maybe instead of something like ``` Running tests... even: OK odd: OK ``` we could instead print something like ``` Running tests... 12 tests passed! 2. Another alternative would be to create some kind of syntax for specifying which definition a test should be attached to. I kind of lean towards the first alternative at this point. I can't think of any particularly important benefits of attaching tests to definitions. And I can think of several situations in which the choice of which definition to attach a test to is arbitrary (for example, when expressing the property that two functions are inverses).
non_process
ability to put tests after a definition for documentation purposes educational purposes etc it may often make sense to put tests after a definition however at the moment this is impossible since we attach all tests to the following definition one alternative would be to forget the whole idea of attaching specific tests to specific functions maybe instead of something like running tests even ok odd ok we could instead print something like running tests tests passed another alternative would be to create some kind of syntax for specifying which definition a test should be attached to i kind of lean towards the first alternative at this point i can t think of any particularly important benefits of attaching tests to definitions and i can think of several situations in which the choice of which definition to attach a test to is arbitrary for example when expressing the property that two functions are inverses
0
171,349
13,229,333,731
IssuesEvent
2020-08-18 07:59:30
microsoft/AzureStorageExplorer
https://api.github.com/repos/microsoft/AzureStorageExplorer
closed
The selected permission 'Delete version' doesn't display on Connect Summary dialog
:gear: attach :gear: sas 🧪 testing
**Storage Explorer Version:** 1.13.0 **Build**: [20200504.7](https://devdiv.visualstudio.com/DevDiv/_build/results?buildId=3700661&view=results) **Branch**: master **Platform/OS:** Windows 10/ Linux Ubuntu 16.04/ macOS Catalina **Architecture**: ia32/x64 **Regression From:** Not a regression **Steps to reproduce:** 1. Select one non-ADLS Gen2 storage account. 2. Right-click the storage account -> Select 'Get Shared Access Signature...'. 3. Create a SAS connection string with permissions 'Read & List & Delete version' -> Copy the connection string. 4. Click 'Open Connect Dialog' -> Select 'using a connection string' -> Click 'Next'. 5. Paste the copied connection string -> Click 'Next'. 6. Check the permissions value on Connection Summary dialog. **Expect Experience:** The permission 'Delete version' displays. **Actual Experience:** 1. No permission 'Delete version' displays. 2. In Connection string: sp=rxl. ![image](https://user-images.githubusercontent.com/41351993/81161795-ba471400-8fbe-11ea-954a-9d9bd0d6be1a.png)
1.0
The selected permission 'Delete version' doesn't display on Connect Summary dialog - **Storage Explorer Version:** 1.13.0 **Build**: [20200504.7](https://devdiv.visualstudio.com/DevDiv/_build/results?buildId=3700661&view=results) **Branch**: master **Platform/OS:** Windows 10/ Linux Ubuntu 16.04/ macOS Catalina **Architecture**: ia32/x64 **Regression From:** Not a regression **Steps to reproduce:** 1. Select one non-ADLS Gen2 storage account. 2. Right-click the storage account -> Select 'Get Shared Access Signature...'. 3. Create a SAS connection string with permissions 'Read & List & Delete version' -> Copy the connection string. 4. Click 'Open Connect Dialog' -> Select 'using a connection string' -> Click 'Next'. 5. Paste the copied connection string -> Click 'Next'. 6. Check the permissions value on Connection Summary dialog. **Expect Experience:** The permission 'Delete version' displays. **Actual Experience:** 1. No permission 'Delete version' displays. 2. In Connection string: sp=rxl. ![image](https://user-images.githubusercontent.com/41351993/81161795-ba471400-8fbe-11ea-954a-9d9bd0d6be1a.png)
non_process
the selected permission delete version doesn t display on connect summary dialog storage explorer version build branch master platform os windows linux ubuntu macos catalina architecture regression from not a regression steps to reproduce select one non adls storage account right click the storage account select get shared access signature create a sas connection string with permissions read list delete version copy the connection string click open connect dialog select using a connection string click next paste the copied connection string click next check the permissions value on connection summary dialog expect experience the permission delete version displays actual experience no permission delete version displays in connection string sp rxl
0
16,355
5,233,695,749
IssuesEvent
2017-01-30 13:43:42
SemsTestOrg/combinearchive-web
https://api.github.com/repos/SemsTestOrg/combinearchive-web
closed
Check for empty archive names, when creating
code fixed major migrated task
## Trac Ticket #54 **component:** code **owner:** somebody **reporter:** anonymous **created:** 2014-08-21 14:15:37 **milestone:** **type:** task **version:** **keywords:** ## comment 1 **time:** 2014-08-25 16:23:06 **author:** mp487 <martin.peters3@uni-rostock.de> In [None](/04d5dd0c2c002cc21f304ed828582fe5f71e2ec7): ```CommitTicketReference repository="" revision="04d5dd0c2c002cc21f304ed828582fe5f71e2ec7" reset fields after successfull create and show loading indicator while creating an archive [fixes #60][fixes #61][fixes #54] ``` ## comment 2 **time:** 2014-08-25 16:23:06 **author:** mp487 <martin.peters3@uni-rostock.de> Updated **resolution** to **fixed** ## comment 3 **time:** 2014-08-25 16:23:06 **author:** mp487 <martin.peters3@uni-rostock.de> Updated **status** to **closed** ## comment 4 **time:** 2014-08-25 16:34:50 **author:** mp487 <martin.peters3@uni-rostock.de> In [None](/04d5dd0c2c002cc21f304ed828582fe5f71e2ec7): ```CommitTicketReference repository="" revision="04d5dd0c2c002cc21f304ed828582fe5f71e2ec7" reset fields after successfull create and show loading indicator while creating an archive [fixes #60][fixes #61][fixes #54] ```
1.0
Check for empty archive names, when creating - ## Trac Ticket #54 **component:** code **owner:** somebody **reporter:** anonymous **created:** 2014-08-21 14:15:37 **milestone:** **type:** task **version:** **keywords:** ## comment 1 **time:** 2014-08-25 16:23:06 **author:** mp487 <martin.peters3@uni-rostock.de> In [None](/04d5dd0c2c002cc21f304ed828582fe5f71e2ec7): ```CommitTicketReference repository="" revision="04d5dd0c2c002cc21f304ed828582fe5f71e2ec7" reset fields after successfull create and show loading indicator while creating an archive [fixes #60][fixes #61][fixes #54] ``` ## comment 2 **time:** 2014-08-25 16:23:06 **author:** mp487 <martin.peters3@uni-rostock.de> Updated **resolution** to **fixed** ## comment 3 **time:** 2014-08-25 16:23:06 **author:** mp487 <martin.peters3@uni-rostock.de> Updated **status** to **closed** ## comment 4 **time:** 2014-08-25 16:34:50 **author:** mp487 <martin.peters3@uni-rostock.de> In [None](/04d5dd0c2c002cc21f304ed828582fe5f71e2ec7): ```CommitTicketReference repository="" revision="04d5dd0c2c002cc21f304ed828582fe5f71e2ec7" reset fields after successfull create and show loading indicator while creating an archive [fixes #60][fixes #61][fixes #54] ```
non_process
check for empty archive names when creating trac ticket component code owner somebody reporter anonymous created milestone type task version keywords comment time author in committicketreference repository revision reset fields after successfull create and show loading indicator while creating an archive comment time author updated resolution to fixed comment time author updated status to closed comment time author in committicketreference repository revision reset fields after successfull create and show loading indicator while creating an archive
0
17,294
23,108,490,445
IssuesEvent
2022-07-27 10:55:28
FTBTeam/FTB-App
https://api.github.com/repos/FTBTeam/FTB-App
closed
[Bug]: Cant download most packs and cant run the few I can
bug subprocess
### What Operating System Windows 10 ### App Version 202204211116-ac2e189d70-release ### UI Version ac2e189d70 ### Log Files https://pste.ch/fagacahoro ### Debug Code FTB-DBGSUWEJUCIPO ### Describe the bug Cant download most packs ### Steps to reproduce 1. Open app ( still doesn't work with admin access) 2. Try to download modpack ( mosty oceanblock) 3. so far only stoneblock 2 will download but cant run it ### Expected behaviour I don't have requisite programs or apps? ### Screenshots ![help](https://user-images.githubusercontent.com/104655893/166092775-8e241156-b091-4600-a2f6-3cb9ea8bf310.png) ### Additional information _No response_ ### Information - [X] I have provided as much information as possible
1.0
[Bug]: Cant download most packs and cant run the few I can - ### What Operating System Windows 10 ### App Version 202204211116-ac2e189d70-release ### UI Version ac2e189d70 ### Log Files https://pste.ch/fagacahoro ### Debug Code FTB-DBGSUWEJUCIPO ### Describe the bug Cant download most packs ### Steps to reproduce 1. Open app ( still doesn't work with admin access) 2. Try to download modpack ( mosty oceanblock) 3. so far only stoneblock 2 will download but cant run it ### Expected behaviour I don't have requisite programs or apps? ### Screenshots ![help](https://user-images.githubusercontent.com/104655893/166092775-8e241156-b091-4600-a2f6-3cb9ea8bf310.png) ### Additional information _No response_ ### Information - [X] I have provided as much information as possible
process
cant download most packs and cant run the few i can what operating system windows app version release ui version log files debug code ftb dbgsuwejucipo describe the bug cant download most packs steps to reproduce open app still doesn t work with admin access try to download modpack mosty oceanblock so far only stoneblock will download but cant run it expected behaviour i don t have requisite programs or apps screenshots additional information no response information i have provided as much information as possible
1
9,362
12,370,928,769
IssuesEvent
2020-05-18 17:39:20
ofcyln/mortgage-expense-calculator
https://api.github.com/repos/ofcyln/mortgage-expense-calculator
closed
Modify README.md as a guide/documentation of the app
2 High app creation process
- Social sharable background implementation - App screenshot integrations - Explanatory content creation - Git badge implementations with subscriptions of the CI/CD services
1.0
Modify README.md as a guide/documentation of the app - - Social sharable background implementation - App screenshot integrations - Explanatory content creation - Git badge implementations with subscriptions of the CI/CD services
process
modify readme md as a guide documentation of the app social sharable background implementation app screenshot integrations explanatory content creation git badge implementations with subscriptions of the ci cd services
1
154,000
24,228,475,677
IssuesEvent
2022-09-26 16:08:17
o3de/o3de
https://api.github.com/repos/o3de/o3de
closed
AssetProcessor UI "Event Log Details" Status icons should be the same as "Asset Status" Status icons
needs-triage sig/content sig/ui-ux needs-ux-design
**Is your feature request related to a problem? Please describe.** I find the current "Status" icons in "Event Log Details" Log View to be confusing. In particular the Warning icon looks like an error icon. The Warning Icon in the "Asset Status" list looks correct, it even has a triangular shape while the errors are circular and red. **Describe the solution you'd like** The Status icons in the "Event Log Details" Log View should be the same as the "Asset Status" Log View. **Describe alternatives you've considered** None. **Additional context** Here is a screenshot with the problem: ![image](https://user-images.githubusercontent.com/66021303/138761724-1f544d95-1b63-40a8-a0b3-095045eb5dbd.png)
1.0
AssetProcessor UI "Event Log Details" Status icons should be the same as "Asset Status" Status icons - **Is your feature request related to a problem? Please describe.** I find the current "Status" icons in "Event Log Details" Log View to be confusing. In particular the Warning icon looks like an error icon. The Warning Icon in the "Asset Status" list looks correct, it even has a triangular shape while the errors are circular and red. **Describe the solution you'd like** The Status icons in the "Event Log Details" Log View should be the same as the "Asset Status" Log View. **Describe alternatives you've considered** None. **Additional context** Here is a screenshot with the problem: ![image](https://user-images.githubusercontent.com/66021303/138761724-1f544d95-1b63-40a8-a0b3-095045eb5dbd.png)
non_process
assetprocessor ui event log details status icons should be the same as asset status status icons is your feature request related to a problem please describe i find the current status icons in event log details log view to be confusing in particular the warning icon looks like an error icon the warning icon in the asset status list looks correct it even has a triangular shape while the errors are circular and red describe the solution you d like the status icons in the event log details log view should be the same as the asset status log view describe alternatives you ve considered none additional context here is a screenshot with the problem
0
142,458
21,769,280,056
IssuesEvent
2022-05-13 07:23:48
status-im/status-desktop
https://api.github.com/repos/status-im/status-desktop
closed
Greying out already imported/derived addresses in address list and also choosing next available as default one
Wallet feature wallet-redesign
greying out already imported/derived addresses in address list and also choosing next available as default one
1.0
Greying out already imported/derived addresses in address list and also choosing next available as default one - greying out already imported/derived addresses in address list and also choosing next available as default one
non_process
greying out already imported derived addresses in address list and also choosing next available as default one greying out already imported derived addresses in address list and also choosing next available as default one
0
26,781
4,786,788,439
IssuesEvent
2016-10-29 16:40:20
WinFF/winff
https://api.github.com/repos/WinFF/winff
closed
Wrong Polish translation of Time: seconds
auto-migrated Priority-Medium Type-Defect
``` What steps will reproduce the problem? 1. Launch program in Polsih language. 2. Go to Options > Time (Polish "Time" - the 5th option) 3. You see: "Godziny/Minuty/Godziny" which meens: "Hours/Minutes/Hours". What is the expected output? What do you see instead? You should replace on of the "Godziny" by "Sekundy" (Polish name for "Seconds"). What version of the product are you using? On what operating system? Ubuntu 14.04 PL Please provide any additional information below. Very low priority. ``` Original issue reported on code.google.com by `klap...@gmail.com` on 2 Aug 2014 at 7:29 Attachments: - [WinFF_PL_bug_time.jpeg](https://storage.googleapis.com/google-code-attachments/winff/issue-219/comment-0/WinFF_PL_bug_time.jpeg)
1.0
Wrong Polish translation of Time: seconds - ``` What steps will reproduce the problem? 1. Launch program in Polsih language. 2. Go to Options > Time (Polish "Time" - the 5th option) 3. You see: "Godziny/Minuty/Godziny" which meens: "Hours/Minutes/Hours". What is the expected output? What do you see instead? You should replace on of the "Godziny" by "Sekundy" (Polish name for "Seconds"). What version of the product are you using? On what operating system? Ubuntu 14.04 PL Please provide any additional information below. Very low priority. ``` Original issue reported on code.google.com by `klap...@gmail.com` on 2 Aug 2014 at 7:29 Attachments: - [WinFF_PL_bug_time.jpeg](https://storage.googleapis.com/google-code-attachments/winff/issue-219/comment-0/WinFF_PL_bug_time.jpeg)
non_process
wrong polish translation of time seconds what steps will reproduce the problem launch program in polsih language go to options time polish time the option you see godziny minuty godziny which meens hours minutes hours what is the expected output what do you see instead you should replace on of the godziny by sekundy polish name for seconds what version of the product are you using on what operating system ubuntu pl please provide any additional information below very low priority original issue reported on code google com by klap gmail com on aug at attachments
0
12,267
19,547,531,117
IssuesEvent
2022-01-02 05:46:30
wplug/MemberPortal
https://api.github.com/repos/wplug/MemberPortal
closed
Define a strategy for authentication and authorization
wontfix requirement
If we implement the portal as a standalone system, we will need to decide how to do authentication (both in terms of how we will manage authentication data on the back-end, and how the UI will deal with it), and authorization (how we will keep track of who is an admin and who is not). My immediate suggestion would be to deploy an LDAP directory, and to define a group in LDAP to list who is an admin. What we do here is dependent on the outcome of Issue #4.
1.0
Define a strategy for authentication and authorization - If we implement the portal as a standalone system, we will need to decide how to do authentication (both in terms of how we will manage authentication data on the back-end, and how the UI will deal with it), and authorization (how we will keep track of who is an admin and who is not). My immediate suggestion would be to deploy an LDAP directory, and to define a group in LDAP to list who is an admin. What we do here is dependent on the outcome of Issue #4.
non_process
define a strategy for authentication and authorization if we implement the portal as a standalone system we will need to decide how to do authentication both in terms of how we will manage authentication data on the back end and how the ui will deal with it and authorization how we will keep track of who is an admin and who is not my immediate suggestion would be to deploy an ldap directory and to define a group in ldap to list who is an admin what we do here is dependent on the outcome of issue
0
13,308
15,781,365,348
IssuesEvent
2021-04-01 11:16:06
GoogleCloudPlatform/dotnet-docs-samples
https://api.github.com/repos/GoogleCloudPlatform/dotnet-docs-samples
closed
[Translate]: Fix and reactivate some tests that have started failing with PERMISSION_DENIED
api: translation priority: p1 samples type: process
Failing tests are: - BatchTranslateTests.GoogleCloudSamples.BatchTranslateTests.BatchTranslateTextTest - BatchTranslateWithGlossaryAndModelTests.GoogleCloudSamples.BatchTranslateWithGlossaryAndModelTests.BatchTranslateTextWithGlossaryAndModelTest - BatchTranslateWithGlossaryTests.GoogleCloudSamples.BatchTranslateWithGlossaryTests.BatchTranslateTextWithGlossaryTest - BatchTranslateWithModelTests.GoogleCloudSamples.BatchTranslateWithModelTests.BatchTranslateTextWithModelTest
1.0
[Translate]: Fix and reactivate some tests that have started failing with PERMISSION_DENIED - Failing tests are: - BatchTranslateTests.GoogleCloudSamples.BatchTranslateTests.BatchTranslateTextTest - BatchTranslateWithGlossaryAndModelTests.GoogleCloudSamples.BatchTranslateWithGlossaryAndModelTests.BatchTranslateTextWithGlossaryAndModelTest - BatchTranslateWithGlossaryTests.GoogleCloudSamples.BatchTranslateWithGlossaryTests.BatchTranslateTextWithGlossaryTest - BatchTranslateWithModelTests.GoogleCloudSamples.BatchTranslateWithModelTests.BatchTranslateTextWithModelTest
process
fix and reactivate some tests that have started failing with permission denied failing tests are batchtranslatetests googlecloudsamples batchtranslatetests batchtranslatetexttest batchtranslatewithglossaryandmodeltests googlecloudsamples batchtranslatewithglossaryandmodeltests batchtranslatetextwithglossaryandmodeltest batchtranslatewithglossarytests googlecloudsamples batchtranslatewithglossarytests batchtranslatetextwithglossarytest batchtranslatewithmodeltests googlecloudsamples batchtranslatewithmodeltests batchtranslatetextwithmodeltest
1
4,582
3,399,268,471
IssuesEvent
2015-12-02 10:03:49
akka/akka
https://api.github.com/repos/akka/akka
closed
Comply with new sbt-dependency-graph(0.8.0)
1 - triaged community-contrib t:build
Hi, I have in my .sbt/0.13/plugins/plugins.sbt a fresh version of sbt-dependency-graph that is picked up by sbt on akka project. Unfortunately akka relies on older version(0.7.5), we could align to the newest one. These changes introduced issue: https://github.com/jrudolph/sbt-dependency-graph/commit/7a5126ef875ee24f364c7b0be591893fd5d80d4a
1.0
Comply with new sbt-dependency-graph(0.8.0) - Hi, I have in my .sbt/0.13/plugins/plugins.sbt a fresh version of sbt-dependency-graph that is picked up by sbt on akka project. Unfortunately akka relies on older version(0.7.5), we could align to the newest one. These changes introduced issue: https://github.com/jrudolph/sbt-dependency-graph/commit/7a5126ef875ee24f364c7b0be591893fd5d80d4a
non_process
comply with new sbt dependency graph hi i have in my sbt plugins plugins sbt a fresh version of sbt dependency graph that is picked up by sbt on akka project unfortunately akka relies on older version we could align to the newest one these changes introduced issue
0
10,187
8,407,863,190
IssuesEvent
2018-10-11 22:29:37
brave/brave-browser
https://api.github.com/repos/brave/brave-browser
opened
Test Windows installer for brave-core; ensure silent install possible
infrastructure setup/installer
We'll want to try (via CLI) the stub installer on Windows to ensure we can do a silent install. We can capture the exact shell command ran here too If (for some reason) we run into problems doing that, we can ask folks who may be familiar with the installer (ex: @simonhong or @emerick) for help 😄
1.0
Test Windows installer for brave-core; ensure silent install possible - We'll want to try (via CLI) the stub installer on Windows to ensure we can do a silent install. We can capture the exact shell command ran here too If (for some reason) we run into problems doing that, we can ask folks who may be familiar with the installer (ex: @simonhong or @emerick) for help 😄
non_process
test windows installer for brave core ensure silent install possible we ll want to try via cli the stub installer on windows to ensure we can do a silent install we can capture the exact shell command ran here too if for some reason we run into problems doing that we can ask folks who may be familiar with the installer ex simonhong or emerick for help 😄
0
236,586
7,751,029,826
IssuesEvent
2018-05-30 15:50:14
mozilla/addons-server
https://api.github.com/repos/mozilla/addons-server
closed
UI issues for cyrillic languages
component: devhub contrib: good first bug contrib: mentor assigned priority: p4 triaged
Steps to reproduce: 1.Go to https://addons-dev.allizom.org/bg/developers/addons (Same for Russian) 2.Observe the header Expected results: No UI issues in Cyrillic languages Actual results: There are overlaying issues for Cyrillic languages Notes: I know that there are small overlaying issues for other non Cyrillic languages and we agreed that these will not be fixed, but in this case the UI issues are very obvious Please see screenshot for this issue : ![cirillyc](https://cloud.githubusercontent.com/assets/1583842/25237353/1107174c-25f3-11e7-8dad-08fbe7941d53.png)
1.0
UI issues for cyrillic languages - Steps to reproduce: 1.Go to https://addons-dev.allizom.org/bg/developers/addons (Same for Russian) 2.Observe the header Expected results: No UI issues in Cyrillic languages Actual results: There are overlaying issues for Cyrillic languages Notes: I know that there are small overlaying issues for other non Cyrillic languages and we agreed that these will not be fixed, but in this case the UI issues are very obvious Please see screenshot for this issue : ![cirillyc](https://cloud.githubusercontent.com/assets/1583842/25237353/1107174c-25f3-11e7-8dad-08fbe7941d53.png)
non_process
ui issues for cyrillic languages steps to reproduce go to same for russian observe the header expected results no ui issues in cyrillic languages actual results there are overlaying issues for cyrillic languages notes i know that there are small overlaying issues for other non cyrillic languages and we agreed that these will not be fixed but in this case the ui issues are very obvious please see screenshot for this issue
0
6,254
9,215,591,305
IssuesEvent
2019-03-11 04:01:20
dotnet/corefx
https://api.github.com/repos/dotnet/corefx
reopened
Desktop: System.Diagnostics.Tests.ProcessStartInfoTests.StartInfo_TextFile_ShellExecute fails occasionally on NETFX
area-System.Diagnostics.Process disabled-test test bug test-run-desktop
Failed test: System.Diagnostics.Tests.ProcessStartInfoTests.StartInfo_TextFile_ShellExecute (from System.Diagnostics.Process.Tests) Detail: https://ci.dot.net/job/dotnet_corefx/job/master/job/outerloop_netfx_windows_nt_debug/67/testReport/System.Diagnostics.Tests/ProcessStartInfoTests/StartInfo_TextFile_ShellExecute/ Configuration: outerloop_netfx_windows_nt_debug MESSAGE: ~~~ System.NullReferenceException : Object reference not set to an instance of an object. ~~~ STACK TRACE: ~~~ at System.Diagnostics.Tests.ProcessStartInfoTests.StartInfo_TextFile_ShellExecute() in D:\j\workspace\outerloop_net---903ddde6\src\System.Diagnostics.Process\tests\ProcessStartInfoTests.cs:line 1013 ~~~
1.0
Desktop: System.Diagnostics.Tests.ProcessStartInfoTests.StartInfo_TextFile_ShellExecute fails occasionally on NETFX - Failed test: System.Diagnostics.Tests.ProcessStartInfoTests.StartInfo_TextFile_ShellExecute (from System.Diagnostics.Process.Tests) Detail: https://ci.dot.net/job/dotnet_corefx/job/master/job/outerloop_netfx_windows_nt_debug/67/testReport/System.Diagnostics.Tests/ProcessStartInfoTests/StartInfo_TextFile_ShellExecute/ Configuration: outerloop_netfx_windows_nt_debug MESSAGE: ~~~ System.NullReferenceException : Object reference not set to an instance of an object. ~~~ STACK TRACE: ~~~ at System.Diagnostics.Tests.ProcessStartInfoTests.StartInfo_TextFile_ShellExecute() in D:\j\workspace\outerloop_net---903ddde6\src\System.Diagnostics.Process\tests\ProcessStartInfoTests.cs:line 1013 ~~~
process
desktop system diagnostics tests processstartinfotests startinfo textfile shellexecute fails occasionally on netfx failed test system diagnostics tests processstartinfotests startinfo textfile shellexecute from system diagnostics process tests detail configuration outerloop netfx windows nt debug message system nullreferenceexception object reference not set to an instance of an object stack trace at system diagnostics tests processstartinfotests startinfo textfile shellexecute in d j workspace outerloop net src system diagnostics process tests processstartinfotests cs line
1
13,182
15,610,491,451
IssuesEvent
2021-03-19 13:15:38
qgis/QGIS
https://api.github.com/repos/qgis/QGIS
closed
GRASS r.cost does not generate results on Windows
Bug Feedback Processing
**Describe the bug** I try to generate a cost surface from raster data with GRASS algorithm r.cost and it seems to run fine, but does not generate any outputs. Seems to be Windows specific problem. **How to Reproduce** 1. Open r.cost algorithm 2. Use sample tiff as input. [hki_cost_rast.zip](https://github.com/qgis/QGIS/files/6170878/hki_cost_rast.zip) 3. Click on two points on the map as start and end points 4. Leave everything else as defaul 5. Run 6. No results are generated **QGIS and OS versions** The process works fine on Ubunut, but on Windows it fails. ``` QGIS version 3.16.2-Hannover QGIS code revision f1660f9da5 Compiled against Qt 5.11.2 Running against Qt 5.11.2 Compiled against GDAL/OGR 3.1.4 Running against GDAL/OGR 3.1.4 Compiled against GEOS 3.8.1-CAPI-1.13.3 Running against GEOS 3.8.1-CAPI-1.13.3 Compiled against SQLite 3.29.0 Running against SQLite 3.29.0 PostgreSQL Client Version 11.5 SpatiaLite Version 4.3.0 QWT Version 6.1.3 QScintilla2 Version 2.10.8 Compiled against PROJ 6.3.2 Running against PROJ Rel. 6.3.2, May 1st, 2020 OS Version Windows 10 (10.0) Active python plugins db_manager; MetaSearch; processing ``` **Additional context** Full log output of the failed process ``` `QGIS version: 3.16.2-Hannover QGIS code revision: f1660f9da5 Qt version: 5.11.2 GDAL version: 3.1.4 GEOS version: 3.8.1-CAPI-1.13.3 PROJ version: Rel. 
6.3.2, May 1st, 2020 Processing algorithm… Algorithm 'r.cost' starting… Input parameters: { '-k' : False, '-n' : True, 'GRASS_MIN_AREA_PARAMETER' : 0.0001, 'GRASS_RASTER_FORMAT_META' : '', 'GRASS_RASTER_FORMAT_OPT' : '', 'GRASS_REGION_CELLSIZE_PARAMETER' : 0, 'GRASS_REGION_PARAMETER' : None, 'GRASS_SNAP_TOLERANCE_PARAMETER' : -1, 'input' : 'D:/temp_data/rex/hki_cost_rast.tif', 'max_cost' : 0, 'memory' : 300, 'nearest' : 'TEMPORARY_OUTPUT', 'null_cost' : None, 'outdir' : 'TEMPORARY_OUTPUT', 'output' : 'TEMPORARY_OUTPUT', 'start_coordinates' : '25494383.665503,6679851.063884 [EPSG:3879]', 'start_points' : None, 'start_raster' : None, 'stop_coordinates' : '25497158.751327,6676846.842534 [EPSG:3879]', 'stop_points' : None } g.proj -c proj4="+proj=tmerc +lat_0=0 +lon_0=25 +k=1 +x_0=25500000 +y_0=0 +ellps=GRS80 +towgs84=0,0,0,0,0,0,0 +units=m +no_defs" r.in.gdal input="D:\temp_data\rex\hki_cost_rast.tif" band=1 output="rast_605493243a7203" --overwrite -o g.region n=6687278.6229 s=6663000.6229 e=25514075.1434 w=25487917.1434 res=2.0 r.cost input=rast_605493243a7203 start_coordinates=25494383.665503,6679851.063884 stop_coordinates=25497158.751327,6676846.842534 -n max_cost=0 memory=300 output=output44032c6ed2144057a059ad4c1f41b379 nearest=nearest44032c6ed2144057a059ad4c1f41b379 outdir=outdir44032c6ed2144057a059ad4c1f41b379 --overwrite g.region raster=output44032c6ed2144057a059ad4c1f41b379 r.out.gdal -t -m input="output44032c6ed2144057a059ad4c1f41b379" output="C:\Users\Topi Tjukanov\AppData\Local\Temp\processing_WiztrF\3917d32d671541268802adc05fd54440\output.tif" format="GTiff" createopt="TFW=YES,COMPRESS=LZW" --overwrite g.region raster=nearest44032c6ed2144057a059ad4c1f41b379 r.out.gdal -t -m input="nearest44032c6ed2144057a059ad4c1f41b379" output="C:\Users\Topi Tjukanov\AppData\Local\Temp\processing_WiztrF\37c0bfd0f2fb40c5b313a7749b7b4d96\nearest.tif" format="GTiff" createopt="TFW=YES,COMPRESS=LZW" --overwrite g.region raster=outdir44032c6ed2144057a059ad4c1f41b379 
r.out.gdal -t -m input="outdir44032c6ed2144057a059ad4c1f41b379" output="C:\Users\Topi Tjukanov\AppData\Local\Temp\processing_WiztrF\46d00ccbdcab40e19d1a80cfcde3e2ad\outdir.tif" format="GTiff" createopt="TFW=YES,COMPRESS=LZW" --overwrite Starting GRASS GIS... WARNING: Concurrent mapset locking is not supported on Windows Cleaning up temporary files... Executing <C:\Users\Topi Tjukanov\AppData\Local\Temp\processing_WiztrF\grassdata\grass_batch_job.cmd> ... C:\Program Files\QGIS 3.16\bin>chcp 1252 1>NUL C:\Program Files\QGIS 3.16\bin>g.proj -c proj4="+proj=tmerc +lat_0=0 +lon_0=25 +k=1 +x_0=25500000 +y_0=0 +ellps=GRS80 +towgs84=0,0,0,0,0,0,0 +units=m +no_defs" C:\Program Files\QGIS 3.16\bin>r.in.gdal input="D:\temp_data\rex\hki_cost_rast.tif" band=1 output="rast_605493243a7203" --overwrite -o C:\Program Files\QGIS 3.16\bin>g.region n=6687278.6229 s=6663000.6229 e=25514075.1434 w=25487917.1434 res=2.0 C:\Program Files\QGIS 3.16\bin>r.cost input=rast_605493243a7203 start_coordinates=25494383.665503,6679851.063884 stop_coordinates=25497158.751327,6676846.842534 -n max_cost=0 memory=300 output=output44032c6ed2144057a059ad4c1f41b379 nearest=nearest44032c6ed2144057a059ad4c1f41b379 outdir=outdir44032c6ed2144057a059ad4c1f41b379 --overwrite C:\Program Files\QGIS 3.16\bin>g.region raster=output44032c6ed2144057a059ad4c1f41b379 C:\Program Files\QGIS 3.16\bin>r.out.gdal -t -m input="output44032c6ed2144057a059ad4c1f41b379" output="C:\Users\Topi Tjukanov\AppData\Local\Temp\processing_WiztrF\3917d32d671541268802adc05fd54440\output.tif" format="GTiff" createopt="TFW=YES,COMPRESS=LZW" --overwrite C:\Program Files\QGIS 3.16\bin>g.region raster=nearest44032c6ed2144057a059ad4c1f41b379 C:\Program Files\QGIS 3.16\bin>r.out.gdal -t -m input="nearest44032c6ed2144057a059ad4c1f41b379" output="C:\Users\Topi Tjukanov\AppData\Local\Temp\processing_WiztrF\37c0bfd0f2fb40c5b313a7749b7b4d96\nearest.tif" format="GTiff" createopt="TFW=YES,COMPRESS=LZW" --overwrite C:\Program Files\QGIS 3.16\bin>g.region 
raster=outdir44032c6ed2144057a059ad4c1f41b379 C:\Program Files\QGIS 3.16\bin>r.out.gdal -t -m input="outdir44032c6ed2144057a059ad4c1f41b379" output="C:\Users\Topi Tjukanov\AppData\Local\Temp\processing_WiztrF\46d00ccbdcab40e19d1a80cfcde3e2ad\outdir.tif" format="GTiff" createopt="TFW=YES,COMPRESS=LZW" --overwrite C:\Program Files\QGIS 3.16\bin>exit Execution of <C:\Users\Topi Tjukanov\AppData\Local\Temp\processing_WiztrF\grassdata\grass_batch_job.cmd> finished. Cleaning up temporary files... Execution completed in 0.81 seconds Results: {'nearest': 'C:\\Users\\Topi ' 'Tjukanov\\AppData\\Local\\Temp\\processing_WiztrF\\37c0bfd0f2fb40c5b313a7749b7b4d96\\nearest.tif', 'outdir': 'C:\\Users\\Topi ' 'Tjukanov\\AppData\\Local\\Temp\\processing_WiztrF\\46d00ccbdcab40e19d1a80cfcde3e2ad\\outdir.tif', 'output': 'C:\\Users\\Topi ' 'Tjukanov\\AppData\\Local\\Temp\\processing_WiztrF\\3917d32d671541268802adc05fd54440\\output.tif'} Loading resulting layers The following layers were not correctly generated. • C:/Users/Topi Tjukanov/AppData/Local/Temp/processing_WiztrF/37c0bfd0f2fb40c5b313a7749b7b4d96/nearest.tif • C:/Users/Topi Tjukanov/AppData/Local/Temp/processing_WiztrF/3917d32d671541268802adc05fd54440/output.tif • C:/Users/Topi Tjukanov/AppData/Local/Temp/processing_WiztrF/46d00ccbdcab40e19d1a80cfcde3e2ad/outdir.tif You can check the 'Log Messages Panel' in QGIS main window to find more information about the execution of the algorithm.` ``` And the Log messages only contains this: ``` `2021-03-19T13:57:06 INFO processInputs end. Commands: ['g.proj -c proj4="+proj=tmerc +lat_0=0 +lon_0=25 +k=1 +x_0=25500000 +y_0=0 +ellps=GRS80 +towgs84=0,0,0,0,0,0,0 +units=m +no_defs"', 'r.in.gdal input="D:\\temp_data\\rex\\hki_cost_rast.tif" band=1 output="rast_605491929cd662" --overwrite -o', 'g.region n=6687278.6229 s=6663000.6229 e=25514075.1434 w=25487917.1434 res=2.0'] 2021-03-19T13:57:06 INFO processCommands end. 
Commands: ['g.proj -c proj4="+proj=tmerc +lat_0=0 +lon_0=25 +k=1 +x_0=25500000 +y_0=0 +ellps=GRS80 +towgs84=0,0,0,0,0,0,0 +units=m +no_defs"', 'r.in.gdal input="D:\\temp_data\\rex\\hki_cost_rast.tif" band=1 output="rast_605491929cd662" --overwrite -o', 'g.region n=6687278.6229 s=6663000.6229 e=25514075.1434 w=25487917.1434 res=2.0', 'r.cost input=rast_605491929cd662 start_coordinates=25494307.286994,6676515.868995 stop_coordinates=25498609.942996,6678170.736688 -n max_cost=0 memory=300 output=outputbe3648db66344258aed1b27c1d43c713 nearest=nearestbe3648db66344258aed1b27c1d43c713 outdir=outdirbe3648db66344258aed1b27c1d43c713 --overwrite'] 2021-03-19T14:03:48 INFO processInputs end. Commands: ['g.proj -c proj4="+proj=tmerc +lat_0=0 +lon_0=25 +k=1 +x_0=25500000 +y_0=0 +ellps=GRS80 +towgs84=0,0,0,0,0,0,0 +units=m +no_defs"', 'r.in.gdal input="D:\\temp_data\\rex\\hki_cost_rast.tif" band=1 output="rast_605493243a7203" --overwrite -o', 'g.region n=6687278.6229 s=6663000.6229 e=25514075.1434 w=25487917.1434 res=2.0'] 2021-03-19T14:03:48 INFO processCommands end. Commands: ['g.proj -c proj4="+proj=tmerc +lat_0=0 +lon_0=25 +k=1 +x_0=25500000 +y_0=0 +ellps=GRS80 +towgs84=0,0,0,0,0,0,0 +units=m +no_defs"', 'r.in.gdal input="D:\\temp_data\\rex\\hki_cost_rast.tif" band=1 output="rast_605493243a7203" --overwrite -o', 'g.region n=6687278.6229 s=6663000.6229 e=25514075.1434 w=25487917.1434 res=2.0', 'r.cost input=rast_605493243a7203 start_coordinates=25494383.665503,6679851.063884 stop_coordinates=25497158.751327,6676846.842534 -n max_cost=0 memory=300 output=output44032c6ed2144057a059ad4c1f41b379 nearest=nearest44032c6ed2144057a059ad4c1f41b379 outdir=outdir44032c6ed2144057a059ad4c1f41b379 --overwrite']` ```
1.0
GRASS r.cost does not generate results on Windows - **Describe the bug** I try to generate a cost surface from raster data with GRASS algorithm r.cost and it seems to run fine, but does not generate any outputs. Seems to be a Windows-specific problem. **How to Reproduce** 1. Open r.cost algorithm 2. Use sample tiff as input. [hki_cost_rast.zip](https://github.com/qgis/QGIS/files/6170878/hki_cost_rast.zip) 3. Click on two points on the map as start and end points 4. Leave everything else as default 5. Run 6. No results are generated **QGIS and OS versions** The process works fine on Ubuntu, but on Windows it fails. ``` QGIS version 3.16.2-Hannover QGIS code revision f1660f9da5 Compiled against Qt 5.11.2 Running against Qt 5.11.2 Compiled against GDAL/OGR 3.1.4 Running against GDAL/OGR 3.1.4 Compiled against GEOS 3.8.1-CAPI-1.13.3 Running against GEOS 3.8.1-CAPI-1.13.3 Compiled against SQLite 3.29.0 Running against SQLite 3.29.0 PostgreSQL Client Version 11.5 SpatiaLite Version 4.3.0 QWT Version 6.1.3 QScintilla2 Version 2.10.8 Compiled against PROJ 6.3.2 Running against PROJ Rel. 6.3.2, May 1st, 2020 OS Version Windows 10 (10.0) Active python plugins db_manager; MetaSearch; processing ``` **Additional context** Full log output of the failed process ``` `QGIS version: 3.16.2-Hannover QGIS code revision: f1660f9da5 Qt version: 5.11.2 GDAL version: 3.1.4 GEOS version: 3.8.1-CAPI-1.13.3 PROJ version: Rel.
6.3.2, May 1st, 2020 Processing algorithm… Algorithm 'r.cost' starting… Input parameters: { '-k' : False, '-n' : True, 'GRASS_MIN_AREA_PARAMETER' : 0.0001, 'GRASS_RASTER_FORMAT_META' : '', 'GRASS_RASTER_FORMAT_OPT' : '', 'GRASS_REGION_CELLSIZE_PARAMETER' : 0, 'GRASS_REGION_PARAMETER' : None, 'GRASS_SNAP_TOLERANCE_PARAMETER' : -1, 'input' : 'D:/temp_data/rex/hki_cost_rast.tif', 'max_cost' : 0, 'memory' : 300, 'nearest' : 'TEMPORARY_OUTPUT', 'null_cost' : None, 'outdir' : 'TEMPORARY_OUTPUT', 'output' : 'TEMPORARY_OUTPUT', 'start_coordinates' : '25494383.665503,6679851.063884 [EPSG:3879]', 'start_points' : None, 'start_raster' : None, 'stop_coordinates' : '25497158.751327,6676846.842534 [EPSG:3879]', 'stop_points' : None } g.proj -c proj4="+proj=tmerc +lat_0=0 +lon_0=25 +k=1 +x_0=25500000 +y_0=0 +ellps=GRS80 +towgs84=0,0,0,0,0,0,0 +units=m +no_defs" r.in.gdal input="D:\temp_data\rex\hki_cost_rast.tif" band=1 output="rast_605493243a7203" --overwrite -o g.region n=6687278.6229 s=6663000.6229 e=25514075.1434 w=25487917.1434 res=2.0 r.cost input=rast_605493243a7203 start_coordinates=25494383.665503,6679851.063884 stop_coordinates=25497158.751327,6676846.842534 -n max_cost=0 memory=300 output=output44032c6ed2144057a059ad4c1f41b379 nearest=nearest44032c6ed2144057a059ad4c1f41b379 outdir=outdir44032c6ed2144057a059ad4c1f41b379 --overwrite g.region raster=output44032c6ed2144057a059ad4c1f41b379 r.out.gdal -t -m input="output44032c6ed2144057a059ad4c1f41b379" output="C:\Users\Topi Tjukanov\AppData\Local\Temp\processing_WiztrF\3917d32d671541268802adc05fd54440\output.tif" format="GTiff" createopt="TFW=YES,COMPRESS=LZW" --overwrite g.region raster=nearest44032c6ed2144057a059ad4c1f41b379 r.out.gdal -t -m input="nearest44032c6ed2144057a059ad4c1f41b379" output="C:\Users\Topi Tjukanov\AppData\Local\Temp\processing_WiztrF\37c0bfd0f2fb40c5b313a7749b7b4d96\nearest.tif" format="GTiff" createopt="TFW=YES,COMPRESS=LZW" --overwrite g.region raster=outdir44032c6ed2144057a059ad4c1f41b379 
r.out.gdal -t -m input="outdir44032c6ed2144057a059ad4c1f41b379" output="C:\Users\Topi Tjukanov\AppData\Local\Temp\processing_WiztrF\46d00ccbdcab40e19d1a80cfcde3e2ad\outdir.tif" format="GTiff" createopt="TFW=YES,COMPRESS=LZW" --overwrite Starting GRASS GIS... WARNING: Concurrent mapset locking is not supported on Windows Cleaning up temporary files... Executing <C:\Users\Topi Tjukanov\AppData\Local\Temp\processing_WiztrF\grassdata\grass_batch_job.cmd> ... C:\Program Files\QGIS 3.16\bin>chcp 1252 1>NUL C:\Program Files\QGIS 3.16\bin>g.proj -c proj4="+proj=tmerc +lat_0=0 +lon_0=25 +k=1 +x_0=25500000 +y_0=0 +ellps=GRS80 +towgs84=0,0,0,0,0,0,0 +units=m +no_defs" C:\Program Files\QGIS 3.16\bin>r.in.gdal input="D:\temp_data\rex\hki_cost_rast.tif" band=1 output="rast_605493243a7203" --overwrite -o C:\Program Files\QGIS 3.16\bin>g.region n=6687278.6229 s=6663000.6229 e=25514075.1434 w=25487917.1434 res=2.0 C:\Program Files\QGIS 3.16\bin>r.cost input=rast_605493243a7203 start_coordinates=25494383.665503,6679851.063884 stop_coordinates=25497158.751327,6676846.842534 -n max_cost=0 memory=300 output=output44032c6ed2144057a059ad4c1f41b379 nearest=nearest44032c6ed2144057a059ad4c1f41b379 outdir=outdir44032c6ed2144057a059ad4c1f41b379 --overwrite C:\Program Files\QGIS 3.16\bin>g.region raster=output44032c6ed2144057a059ad4c1f41b379 C:\Program Files\QGIS 3.16\bin>r.out.gdal -t -m input="output44032c6ed2144057a059ad4c1f41b379" output="C:\Users\Topi Tjukanov\AppData\Local\Temp\processing_WiztrF\3917d32d671541268802adc05fd54440\output.tif" format="GTiff" createopt="TFW=YES,COMPRESS=LZW" --overwrite C:\Program Files\QGIS 3.16\bin>g.region raster=nearest44032c6ed2144057a059ad4c1f41b379 C:\Program Files\QGIS 3.16\bin>r.out.gdal -t -m input="nearest44032c6ed2144057a059ad4c1f41b379" output="C:\Users\Topi Tjukanov\AppData\Local\Temp\processing_WiztrF\37c0bfd0f2fb40c5b313a7749b7b4d96\nearest.tif" format="GTiff" createopt="TFW=YES,COMPRESS=LZW" --overwrite C:\Program Files\QGIS 3.16\bin>g.region 
raster=outdir44032c6ed2144057a059ad4c1f41b379 C:\Program Files\QGIS 3.16\bin>r.out.gdal -t -m input="outdir44032c6ed2144057a059ad4c1f41b379" output="C:\Users\Topi Tjukanov\AppData\Local\Temp\processing_WiztrF\46d00ccbdcab40e19d1a80cfcde3e2ad\outdir.tif" format="GTiff" createopt="TFW=YES,COMPRESS=LZW" --overwrite C:\Program Files\QGIS 3.16\bin>exit Execution of <C:\Users\Topi Tjukanov\AppData\Local\Temp\processing_WiztrF\grassdata\grass_batch_job.cmd> finished. Cleaning up temporary files... Execution completed in 0.81 seconds Results: {'nearest': 'C:\\Users\\Topi ' 'Tjukanov\\AppData\\Local\\Temp\\processing_WiztrF\\37c0bfd0f2fb40c5b313a7749b7b4d96\\nearest.tif', 'outdir': 'C:\\Users\\Topi ' 'Tjukanov\\AppData\\Local\\Temp\\processing_WiztrF\\46d00ccbdcab40e19d1a80cfcde3e2ad\\outdir.tif', 'output': 'C:\\Users\\Topi ' 'Tjukanov\\AppData\\Local\\Temp\\processing_WiztrF\\3917d32d671541268802adc05fd54440\\output.tif'} Loading resulting layers The following layers were not correctly generated. • C:/Users/Topi Tjukanov/AppData/Local/Temp/processing_WiztrF/37c0bfd0f2fb40c5b313a7749b7b4d96/nearest.tif • C:/Users/Topi Tjukanov/AppData/Local/Temp/processing_WiztrF/3917d32d671541268802adc05fd54440/output.tif • C:/Users/Topi Tjukanov/AppData/Local/Temp/processing_WiztrF/46d00ccbdcab40e19d1a80cfcde3e2ad/outdir.tif You can check the 'Log Messages Panel' in QGIS main window to find more information about the execution of the algorithm.` ``` And the Log messages only contains this: ``` `2021-03-19T13:57:06 INFO processInputs end. Commands: ['g.proj -c proj4="+proj=tmerc +lat_0=0 +lon_0=25 +k=1 +x_0=25500000 +y_0=0 +ellps=GRS80 +towgs84=0,0,0,0,0,0,0 +units=m +no_defs"', 'r.in.gdal input="D:\\temp_data\\rex\\hki_cost_rast.tif" band=1 output="rast_605491929cd662" --overwrite -o', 'g.region n=6687278.6229 s=6663000.6229 e=25514075.1434 w=25487917.1434 res=2.0'] 2021-03-19T13:57:06 INFO processCommands end. 
Commands: ['g.proj -c proj4="+proj=tmerc +lat_0=0 +lon_0=25 +k=1 +x_0=25500000 +y_0=0 +ellps=GRS80 +towgs84=0,0,0,0,0,0,0 +units=m +no_defs"', 'r.in.gdal input="D:\\temp_data\\rex\\hki_cost_rast.tif" band=1 output="rast_605491929cd662" --overwrite -o', 'g.region n=6687278.6229 s=6663000.6229 e=25514075.1434 w=25487917.1434 res=2.0', 'r.cost input=rast_605491929cd662 start_coordinates=25494307.286994,6676515.868995 stop_coordinates=25498609.942996,6678170.736688 -n max_cost=0 memory=300 output=outputbe3648db66344258aed1b27c1d43c713 nearest=nearestbe3648db66344258aed1b27c1d43c713 outdir=outdirbe3648db66344258aed1b27c1d43c713 --overwrite'] 2021-03-19T14:03:48 INFO processInputs end. Commands: ['g.proj -c proj4="+proj=tmerc +lat_0=0 +lon_0=25 +k=1 +x_0=25500000 +y_0=0 +ellps=GRS80 +towgs84=0,0,0,0,0,0,0 +units=m +no_defs"', 'r.in.gdal input="D:\\temp_data\\rex\\hki_cost_rast.tif" band=1 output="rast_605493243a7203" --overwrite -o', 'g.region n=6687278.6229 s=6663000.6229 e=25514075.1434 w=25487917.1434 res=2.0'] 2021-03-19T14:03:48 INFO processCommands end. Commands: ['g.proj -c proj4="+proj=tmerc +lat_0=0 +lon_0=25 +k=1 +x_0=25500000 +y_0=0 +ellps=GRS80 +towgs84=0,0,0,0,0,0,0 +units=m +no_defs"', 'r.in.gdal input="D:\\temp_data\\rex\\hki_cost_rast.tif" band=1 output="rast_605493243a7203" --overwrite -o', 'g.region n=6687278.6229 s=6663000.6229 e=25514075.1434 w=25487917.1434 res=2.0', 'r.cost input=rast_605493243a7203 start_coordinates=25494383.665503,6679851.063884 stop_coordinates=25497158.751327,6676846.842534 -n max_cost=0 memory=300 output=output44032c6ed2144057a059ad4c1f41b379 nearest=nearest44032c6ed2144057a059ad4c1f41b379 outdir=outdir44032c6ed2144057a059ad4c1f41b379 --overwrite']` ```
process
grass r cost does not generate results on windows describe the bug i try to generate a cost surface from raster data with grass algorithm r cost and it seems to run fine but does not generate any outputs seems to be windows specific problem how to reproduce open r cost algorithm use sample tiff as input click on two points on the map as start and end points leave everything else as defaul run no results are generated qgis and os versions the process works fine on ubunut but on windows it fails qgis version hannover qgis code revision compiled against qt running against qt compiled against gdal ogr running against gdal ogr compiled against geos capi running against geos capi compiled against sqlite running against sqlite postgresql client version spatialite version qwt version version compiled against proj running against proj rel may os version windows active python plugins db manager metasearch processing additional context full log output of the failed process qgis version hannover qgis code revision qt version gdal version geos version capi proj version rel may processing algorithm… algorithm r cost starting… input parameters k false n true grass min area parameter grass raster format meta grass raster format opt grass region cellsize parameter grass region parameter none grass snap tolerance parameter input d temp data rex hki cost rast tif max cost memory nearest temporary output null cost none outdir temporary output output temporary output start coordinates start points none start raster none stop coordinates stop points none g proj c proj tmerc lat lon k x y ellps units m no defs r in gdal input d temp data rex hki cost rast tif band output rast overwrite o g region n s e w res r cost input rast start coordinates stop coordinates n max cost memory output nearest outdir overwrite g region raster r out gdal t m input output c users topi tjukanov appdata local temp processing wiztrf output tif format gtiff createopt tfw yes compress lzw overwrite g region 
raster r out gdal t m input output c users topi tjukanov appdata local temp processing wiztrf nearest tif format gtiff createopt tfw yes compress lzw overwrite g region raster r out gdal t m input output c users topi tjukanov appdata local temp processing wiztrf outdir tif format gtiff createopt tfw yes compress lzw overwrite starting grass gis warning concurrent mapset locking is not supported on windows cleaning up temporary files executing c program files qgis bin chcp nul c program files qgis bin g proj c proj tmerc lat lon k x y ellps units m no defs c program files qgis bin r in gdal input d temp data rex hki cost rast tif band output rast overwrite o c program files qgis bin g region n s e w res c program files qgis bin r cost input rast start coordinates stop coordinates n max cost memory output nearest outdir overwrite c program files qgis bin g region raster c program files qgis bin r out gdal t m input output c users topi tjukanov appdata local temp processing wiztrf output tif format gtiff createopt tfw yes compress lzw overwrite c program files qgis bin g region raster c program files qgis bin r out gdal t m input output c users topi tjukanov appdata local temp processing wiztrf nearest tif format gtiff createopt tfw yes compress lzw overwrite c program files qgis bin g region raster c program files qgis bin r out gdal t m input output c users topi tjukanov appdata local temp processing wiztrf outdir tif format gtiff createopt tfw yes compress lzw overwrite c program files qgis bin exit execution of finished cleaning up temporary files execution completed in seconds results nearest c users topi tjukanov appdata local temp processing wiztrf nearest tif outdir c users topi tjukanov appdata local temp processing wiztrf outdir tif output c users topi tjukanov appdata local temp processing wiztrf output tif loading resulting layers the following layers were not correctly generated • c users topi tjukanov appdata local temp processing wiztrf nearest tif • c 
users topi tjukanov appdata local temp processing wiztrf output tif • c users topi tjukanov appdata local temp processing wiztrf outdir tif you can check the log messages panel in qgis main window to find more information about the execution of the algorithm and the log messages only contains this info processinputs end commands info processcommands end commands info processinputs end commands info processcommands end commands
1
6,251
9,211,249,975
IssuesEvent
2019-03-09 13:46:16
SerialLain3170/GAN-papers
https://api.github.com/repos/SerialLain3170/GAN-papers
opened
VideoFlow: A Flow-Based Generative Model for Video
Video Processing
# Paper [VideoFlow: A Flow-Based Generative Model for Video](https://arxiv.org/pdf/1903.01434.pdf) # Summary - Applies flow-based models to video generation - The distribution of the latent variable z is obtained with an autoregressive model, using the latent variables from previous timesteps <img width="426" alt="Screenshot 2019-03-09 22 41 17" src="https://user-images.githubusercontent.com/32360147/54072316-986bd780-42bc-11e9-8196-7ba35b5e02a9.png"> # Date 2019/03/04
1.0
VideoFlow: A Flow-Based Generative Model for Video - # Paper [VideoFlow: A Flow-Based Generative Model for Video](https://arxiv.org/pdf/1903.01434.pdf) # Summary - Applies flow-based models to video generation - The distribution of the latent variable z is obtained with an autoregressive model, using the latent variables from previous timesteps <img width="426" alt="Screenshot 2019-03-09 22 41 17" src="https://user-images.githubusercontent.com/32360147/54072316-986bd780-42bc-11e9-8196-7ba35b5e02a9.png"> # Date 2019/03/04
process
videoflow a flow based generative model for video paper summary applies flow to video generation the distribution of the latent variable z is obtained with an autoregressive model using the latent variables from previous timesteps img width alt screenshot src date
1
196
2,602,718,522
IssuesEvent
2015-02-24 11:07:49
symfony/symfony
https://api.github.com/repos/symfony/symfony
closed
[Process] Run a process within a webrequest
Process
I have a browser-triggered PHP script which is meant to start a process. The `Process` class then wants to use `/dev/tty`, which on my Ubuntu 14 LTS is only readable as root by default. That means no matter which process I start, it runs into a warning and finally dies: ``` Warning: proc_open(/dev/tty): failed to open stream: No such device or address in XXX/vendor/symfony/process/Symfony/Component/Process/Process.php on line 291 ``` Should `setTty` also throw exceptions when `/dev/tty` is not readable?
1.0
[Process] Run a process within a webrequest - I have a browser-triggered PHP script which is meant to start a process. The `Process` class then wants to use `/dev/tty`, which on my Ubuntu 14 LTS is only readable as root by default. That means no matter which process I start, it runs into a warning and finally dies: ``` Warning: proc_open(/dev/tty): failed to open stream: No such device or address in XXX/vendor/symfony/process/Symfony/Component/Process/Process.php on line 291 ``` Should `setTty` also throw exceptions when `/dev/tty` is not readable?
process
run a process within a webrequest i have a browser triggered php script which is ment to start a process the process class then wants to use dev tty which on my ubuntu lts is only readable as root per default that means no matter which process i start it runs into a warning and finally dies warning proc open dev tty failed to open stream no such device or address in xxx vendor symfony process symfony component process process php on line should settty also throw exceptions when dev tty is not readable
1
828
3,296,528,212
IssuesEvent
2015-11-01 22:28:07
t3kt/vjzual2
https://api.github.com/repos/t3kt/vjzual2
closed
background module
enhancement video processing
Put either a constant color or a video source under the input video. Related to #151, which is for a constant color generator that could be used as input to this module.
1.0
background module - Put either a constant color or a video source under the input video. Related to #151, which is for a constant color generator that could be used as input to this module.
process
background module put either a constant color or a video source under the input video related to which is for a constant color generator which could be used as input to this module
1
444,073
12,806,049,924
IssuesEvent
2020-07-03 08:43:54
RichardFav/AnalysisGUI
https://api.github.com/repos/RichardFav/AnalysisGUI
closed
combining outputs for Phase Spiking Rate Comparison (Whole Experiment)
MEDIUM priority Review Required enhancement
Add option to show combined "excited+inhibited+mixed" percentages plus error bars.
1.0
combining outputs for Phase Spiking Rate Comparison (Whole Experiment) - Add option to show combined "excited+inhibited+mixed" percentages plus error bars.
non_process
combining outputs for phase spiking rate comparison whole experiment add option to show combined excited inhibited mixed percentages plus error bars
0
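The record above is the only `non_process` example in this chunk, which makes the relationship between the `label` and `binary_label` columns visible: `process` rows carry 1 and `non_process` rows carry 0. A minimal sketch of that encoding follows; the mapping is inferred from the sample rows, not taken from any dataset documentation:

```python
# Inferred mapping from the sample records: "process" -> 1, "non_process" -> 0.
LABEL_TO_BINARY = {"process": 1, "non_process": 0}

def binarize(label: str) -> int:
    """Encode a class label as the dataset's binary_label value."""
    return LABEL_TO_BINARY[label]
```

A dictionary lookup (rather than, say, `int(label == "process")`) raises a `KeyError` on any unexpected label value, which is usually the safer failure mode when validating a labeled dataset.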
19,660
26,020,361,426
IssuesEvent
2022-12-21 12:06:36
firebase/firebase-cpp-sdk
https://api.github.com/repos/firebase/firebase-cpp-sdk
reopened
[C++] Nightly Integration Testing Report for Firestore
type: process nightly-testing
<hidden value="integration-test-status-comment"></hidden> ### ✅&nbsp; [build against repo] Integration test succeeded! Requested by @sunmou99 on commit 3c64b33c0c3db4b476757c9e9041e7680f7e35c6 Last updated: Tue Dec 20 03:45 PST 2022 **[View integration test log & download artifacts](https://github.com/firebase/firebase-cpp-sdk/actions/runs/3739437028)** <hidden value="integration-test-status-comment"></hidden> *** ### ✅&nbsp; [build against SDK] Integration test succeeded! Requested by @firebase-workflow-trigger[bot] on commit 3c64b33c0c3db4b476757c9e9041e7680f7e35c6 Last updated: Tue Dec 20 05:59 PST 2022 **[View integration test log & download artifacts](https://github.com/firebase/firebase-cpp-sdk/actions/runs/3740512821)**
1.0
[C++] Nightly Integration Testing Report for Firestore - <hidden value="integration-test-status-comment"></hidden> ### ✅&nbsp; [build against repo] Integration test succeeded! Requested by @sunmou99 on commit 3c64b33c0c3db4b476757c9e9041e7680f7e35c6 Last updated: Tue Dec 20 03:45 PST 2022 **[View integration test log & download artifacts](https://github.com/firebase/firebase-cpp-sdk/actions/runs/3739437028)** <hidden value="integration-test-status-comment"></hidden> *** ### ✅&nbsp; [build against SDK] Integration test succeeded! Requested by @firebase-workflow-trigger[bot] on commit 3c64b33c0c3db4b476757c9e9041e7680f7e35c6 Last updated: Tue Dec 20 05:59 PST 2022 **[View integration test log & download artifacts](https://github.com/firebase/firebase-cpp-sdk/actions/runs/3740512821)**
process
nightly integration testing report for firestore ✅ nbsp integration test succeeded requested by on commit last updated tue dec pst ✅ nbsp integration test succeeded requested by firebase workflow trigger on commit last updated tue dec pst
1
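Across the records above, the `text` column reads like a lowercased copy of `text_combine` with punctuation, digits, and markup stripped (for example, `#151` and version numbers disappear, and `/dev/tty` becomes `dev tty`). A rough sketch of such a cleaning step is below; the dataset's actual pipeline is unknown and evidently does more (bracketed title tags and commit hashes are dropped wholly), so this regex is an approximation, not the dataset's real code:

```python
import re

def preprocess(text_combine: str) -> str:
    """Approximate the dataset's derived `text` column: lowercase the
    combined title+body, then collapse every run of non-letter
    characters into a single space."""
    text = text_combine.lower()
    # Replace each run of characters outside a-z with one space.
    text = re.sub(r"[^a-z]+", " ", text)
    return text.strip()
```

For example, `preprocess("GRASS r.cost, QGIS 3.16")` yields `"grass r cost qgis"`, matching the tokenized form seen in the sample rows.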