Dataset columns (name - dtype, range or class count):

Unnamed: 0 - int64 (0 to 832k)
id - float64 (2.49B to 32.1B)
type - stringclasses (1 value)
created_at - stringlengths (19 to 19)
repo - stringlengths (7 to 112)
repo_url - stringlengths (36 to 141)
action - stringclasses (3 values)
title - stringlengths (1 to 744)
labels - stringlengths (4 to 574)
body - stringlengths (9 to 211k)
index - stringclasses (10 values)
text_combine - stringlengths (96 to 211k)
label - stringclasses (2 values)
text - stringlengths (96 to 188k)
binary_label - int64 (0 to 1)
39,360
5,076,058,684
IssuesEvent
2016-12-27 23:25:47
dotnet/corefx
https://api.github.com/repos/dotnet/corefx
closed
Run the selector on items during Select.Count chains.
api-needs-work area-System.Linq Design Discussion up for grabs
Per @VSadov's comment at https://github.com/dotnet/corefx/pull/12703#issuecomment-262051319:

> Re: can we skip selectors/predicates in this optimization
>
> It is a tough one.
>
> Generally, assuming that anything can have sideeffects is indeed very constraining to what kind of optimizations we can do. In some cases it is hard to specify how much of the user code runs and in what order. We definitely reserve the rights to substitute iteration for indexing when we find that possible.
>
> After thinking about this for quite a while, I think Count is a kind of aggregating operator. Even though the actual results are not collected, it seems reasonable for the user to expect that they would be computed, and as such he may expect to observe the sideeffects from selectors/predicates etc.
>
> I.E. – I could see someone using a selector that writes to the console, and use Count as a way to run the query for sideeffects only.
>
> I think we should not take the change and check for other changes like this, that we might have accepted in the past. (like: #11841 )
>
> Here is what I think is the root cause - We do have a method for obtaining counts internally -
>
> GetCount(bool onlyIfCheap)
>
> I think the method should not be used to bypass selectors when actually running the query. It is ok to use it to preallocate internal buffers. That still have an assumption that the “cheap” way of obtaining the underlying “.Count” is idempotent, but it is a kind of assumption that we agreed in the past to be acceptable.
>
> It does not seem to be acceptable to assume that user-supplied selectors/predicates are sideeffects-free in a context of aggregating query.
>
> I.E. –
>
> it would be ok to preallocate a buffer based on GetCount(bool onlyIfCheap)
> it would be ok to compute actual Count via , but only as long as there are no funcs to run
> if we have selectors/predicates or other funcs, they must run in aggregating queries.
So in other words, we should undo all the `GetCount` optimizations that skip running the selector if `onlyIfCheap` is false. But if it's true, we're probably using it to preallocate a buffer of some sort, and are probably going to use it anyway. So in those scenarios it should be ok to not run the selector. cc @JonHanna
1.0
Run the selector on items during Select.Count chains. - Per @VSadov's comment at https://github.com/dotnet/corefx/pull/12703#issuecomment-262051319: > Re: can we skip selectors/predicates in this optimization > > It is a tough one. > > Generally, assuming that anything can have sideeffects is indeed very > constraining to what kind of optimizations we can do. In some cases it > is hard to specify how much of the user code runs and in what order. > We definitely reserve the rights to substitute iteration for indexing > when we find that possible. > > After thinking about this for quite a while, I think Count is a kind > of aggregating operator. Even though the actual results are not > collected, it seems reasonable for the user to expect that they would > be computed, and as such he may expect to observe the sideeffects from > selectors/predicates etc. > > I.E. – I could see someone using a selector that writes to the > console, and use Count as a way to run the query for sideeffects only. > > I think we should not take the change and check for other changes like > this, that we might have accepted in the past. (like: #11841 ) > > Here is what I think is the root cause - We do have a method for > obtaining counts internally - > > GetCount(bool onlyIfCheap) > > I think the method should not be used to bypass selectors when > actually running the query. It is ok to use it to preallocate internal > buffers. That still have an assumption that the “cheap” way of > obtaining the underlying “.Count” is idempotent, but it is a kind of > assumption that we agreed in the past to be acceptable. > > It does not seem to be acceptable to assume that user-supplied > selectors/predicates are sideeffects-free in a context of aggregating > query. > > I.E. 
– > > it would be ok to preallocate a buffer based on GetCount(bool > onlyIfCheap) it would be ok to compute actual Count via , but only as > long as there are no funcs to run if we have selectors/predicates or > other funcs, they must run in aggregating queries. So in other words, we should undo all the `GetCount` optimizations that skip running the selector if `onlyIfCheap` if false. But if it's true, we're probably using it to preallocate a buffer of some sort, & are probably going to use it anyways. So in those scenarios it should be ok to not run the selector. cc @JonHanna
non_process
run the selector on items during select count chains per vsadov s comment at re can we skip selectors predicates in this optimization it is a tough one generally assuming that anything can have sideeffects is indeed very constraining to what kind of optimizations we can do in some cases it is hard to specify how much of the user code runs and in what order we definitely reserve the rights to substitute iteration for indexing when we find that possible after thinking about this for quite a while i think count is a kind of aggregating operator even though the actual results are not collected it seems reasonable for the user to expect that they would be computed and as such he may expect to observe the sideeffects from selectors predicates etc i e – i could see someone using a selector that writes to the console and use count as a way to run the query for sideeffects only i think we should not take the change and check for other changes like this that we might have accepted in the past like here is what i think is the root cause we do have a method for obtaining counts internally getcount bool onlyifcheap i think the method should not be used to bypass selectors when actually running the query it is ok to use it to preallocate internal buffers that still have an assumption that the “cheap” way of obtaining the underlying “ count” is idempotent but it is a kind of assumption that we agreed in the past to be acceptable it does not seem to be acceptable to assume that user supplied selectors predicates are sideeffects free in a context of aggregating query i e – it would be ok to preallocate a buffer based on getcount bool onlyifcheap it would be ok to compute actual count via but only as long as there are no funcs to run if we have selectors predicates or other funcs they must run in aggregating queries so in other words we should undo all the getcount optimizations that skip running the selector if onlyifcheap if false but if it s true we re probably using it to 
preallocate a buffer of some sort are probably going to use it anyways so in those scenarios it should be ok to not run the selector cc jonhanna
0
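The trade-off discussed in the record above (a cheap count that bypasses user functions versus an aggregating count that must run them) can be sketched in a hypothetical Python analogue. This is not the corefx implementation, just an illustration of why the two counts are observably different when the selector has side effects:

```python
# Hypothetical illustration (not the C# LINQ code under discussion):
# a user-supplied selector with an observable side effect.
side_effects = []

def selector(x):
    side_effects.append(x)  # user code that "writes to the console"
    return x * 2

items = [1, 2, 3]

# "Cheap" count: reads the underlying length and never runs the selector,
# analogous to GetCount(onlyIfCheap: true) used for buffer preallocation.
fast_count = len(items)

# Aggregating count: runs the selector once per element, so its side
# effects are observed, analogous to what Count() on a Select chain
# should do per the comment above.
faithful_count = sum(1 for _ in map(selector, items))

print(fast_count, faithful_count, side_effects)
# → 3 3 [1, 2, 3]
```

Both counts agree on the number, but only the aggregating version makes the selector's side effects observable, which is exactly why the record argues the optimization must not skip user funcs.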
61,443
14,627,760,138
IssuesEvent
2020-12-23 12:55:19
bitbar/test-samples
https://api.github.com/repos/bitbar/test-samples
opened
CVE-2019-12086 (High) detected in jackson-databind-2.6.0.jar
security vulnerability
## CVE-2019-12086 - High Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>jackson-databind-2.6.0.jar</b></p></summary> <p>General data-binding functionality for Jackson: works on core streaming API</p> <p>Library home page: <a href="http://github.com/FasterXML/jackson">http://github.com/FasterXML/jackson</a></p> <p>Path to dependency file: test-samples/samples/testing-frameworks/appium/server-side/image-recognition/pom.xml</p> <p>Path to vulnerable library: /home/wss-scanner/.m2/repository/com/fasterxml/jackson/core/jackson-databind/2.6.0/jackson-databind-2.6.0.jar</p> <p> Dependency Hierarchy: - testdroid-api-2.38.jar (Root Library) - :x: **jackson-databind-2.6.0.jar** (Vulnerable Library) <p>Found in HEAD commit: <a href="https://github.com/bitbar/test-samples/commit/12af4f854b64888df6e4492ecc94e141388e939a">12af4f854b64888df6e4492ecc94e141388e939a</a></p> <p>Found in base branch: <b>master</b></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary> <p> A Polymorphic Typing issue was discovered in FasterXML jackson-databind 2.x before 2.9.9. When Default Typing is enabled (either globally or for a specific property) for an externally exposed JSON endpoint, the service has the mysql-connector-java jar (8.0.14 or earlier) in the classpath, and an attacker can host a crafted MySQL server reachable by the victim, an attacker can send a crafted JSON message that allows them to read arbitrary local files on the server. This occurs because of missing com.mysql.cj.jdbc.admin.MiniAdmin validation. 
<p>Publish Date: 2019-05-17 <p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2019-12086>CVE-2019-12086</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>7.5</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: Low - Privileges Required: None - User Interaction: None - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: High - Integrity Impact: None - Availability Impact: None </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2019-12086">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2019-12086</a></p> <p>Release Date: 2019-05-17</p> <p>Fix Resolution: 2.9.9</p> </p> </details> <p></p> <!-- <REMEDIATE>{"isOpenPROnVulnerability":false,"isPackageBased":true,"isDefaultBranch":true,"packages":[{"packageType":"Java","groupId":"com.fasterxml.jackson.core","packageName":"jackson-databind","packageVersion":"2.6.0","isTransitiveDependency":true,"dependencyTree":"com.testdroid:testdroid-api:2.38;com.fasterxml.jackson.core:jackson-databind:2.6.0","isMinimumFixVersionAvailable":true,"minimumFixVersion":"2.9.9"}],"vulnerabilityIdentifier":"CVE-2019-12086","vulnerabilityDetails":"A Polymorphic Typing issue was discovered in FasterXML jackson-databind 2.x before 2.9.9. 
When Default Typing is enabled (either globally or for a specific property) for an externally exposed JSON endpoint, the service has the mysql-connector-java jar (8.0.14 or earlier) in the classpath, and an attacker can host a crafted MySQL server reachable by the victim, an attacker can send a crafted JSON message that allows them to read arbitrary local files on the server. This occurs because of missing com.mysql.cj.jdbc.admin.MiniAdmin validation.","vulnerabilityUrl":"https://vuln.whitesourcesoftware.com/vulnerability/CVE-2019-12086","cvss3Severity":"high","cvss3Score":"7.5","cvss3Metrics":{"A":"None","AC":"Low","PR":"None","S":"Unchanged","C":"High","UI":"None","AV":"Network","I":"None"},"extraData":{}}</REMEDIATE> -->
True
CVE-2019-12086 (High) detected in jackson-databind-2.6.0.jar - ## CVE-2019-12086 - High Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>jackson-databind-2.6.0.jar</b></p></summary> <p>General data-binding functionality for Jackson: works on core streaming API</p> <p>Library home page: <a href="http://github.com/FasterXML/jackson">http://github.com/FasterXML/jackson</a></p> <p>Path to dependency file: test-samples/samples/testing-frameworks/appium/server-side/image-recognition/pom.xml</p> <p>Path to vulnerable library: /home/wss-scanner/.m2/repository/com/fasterxml/jackson/core/jackson-databind/2.6.0/jackson-databind-2.6.0.jar</p> <p> Dependency Hierarchy: - testdroid-api-2.38.jar (Root Library) - :x: **jackson-databind-2.6.0.jar** (Vulnerable Library) <p>Found in HEAD commit: <a href="https://github.com/bitbar/test-samples/commit/12af4f854b64888df6e4492ecc94e141388e939a">12af4f854b64888df6e4492ecc94e141388e939a</a></p> <p>Found in base branch: <b>master</b></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary> <p> A Polymorphic Typing issue was discovered in FasterXML jackson-databind 2.x before 2.9.9. When Default Typing is enabled (either globally or for a specific property) for an externally exposed JSON endpoint, the service has the mysql-connector-java jar (8.0.14 or earlier) in the classpath, and an attacker can host a crafted MySQL server reachable by the victim, an attacker can send a crafted JSON message that allows them to read arbitrary local files on the server. This occurs because of missing com.mysql.cj.jdbc.admin.MiniAdmin validation. 
<p>Publish Date: 2019-05-17 <p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2019-12086>CVE-2019-12086</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>7.5</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: Low - Privileges Required: None - User Interaction: None - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: High - Integrity Impact: None - Availability Impact: None </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2019-12086">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2019-12086</a></p> <p>Release Date: 2019-05-17</p> <p>Fix Resolution: 2.9.9</p> </p> </details> <p></p> <!-- <REMEDIATE>{"isOpenPROnVulnerability":false,"isPackageBased":true,"isDefaultBranch":true,"packages":[{"packageType":"Java","groupId":"com.fasterxml.jackson.core","packageName":"jackson-databind","packageVersion":"2.6.0","isTransitiveDependency":true,"dependencyTree":"com.testdroid:testdroid-api:2.38;com.fasterxml.jackson.core:jackson-databind:2.6.0","isMinimumFixVersionAvailable":true,"minimumFixVersion":"2.9.9"}],"vulnerabilityIdentifier":"CVE-2019-12086","vulnerabilityDetails":"A Polymorphic Typing issue was discovered in FasterXML jackson-databind 2.x before 2.9.9. 
When Default Typing is enabled (either globally or for a specific property) for an externally exposed JSON endpoint, the service has the mysql-connector-java jar (8.0.14 or earlier) in the classpath, and an attacker can host a crafted MySQL server reachable by the victim, an attacker can send a crafted JSON message that allows them to read arbitrary local files on the server. This occurs because of missing com.mysql.cj.jdbc.admin.MiniAdmin validation.","vulnerabilityUrl":"https://vuln.whitesourcesoftware.com/vulnerability/CVE-2019-12086","cvss3Severity":"high","cvss3Score":"7.5","cvss3Metrics":{"A":"None","AC":"Low","PR":"None","S":"Unchanged","C":"High","UI":"None","AV":"Network","I":"None"},"extraData":{}}</REMEDIATE> -->
non_process
cve high detected in jackson databind jar cve high severity vulnerability vulnerable library jackson databind jar general data binding functionality for jackson works on core streaming api library home page a href path to dependency file test samples samples testing frameworks appium server side image recognition pom xml path to vulnerable library home wss scanner repository com fasterxml jackson core jackson databind jackson databind jar dependency hierarchy testdroid api jar root library x jackson databind jar vulnerable library found in head commit a href found in base branch master vulnerability details a polymorphic typing issue was discovered in fasterxml jackson databind x before when default typing is enabled either globally or for a specific property for an externally exposed json endpoint the service has the mysql connector java jar or earlier in the classpath and an attacker can host a crafted mysql server reachable by the victim an attacker can send a crafted json message that allows them to read arbitrary local files on the server this occurs because of missing com mysql cj jdbc admin miniadmin validation publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction none scope unchanged impact metrics confidentiality impact high integrity impact none availability impact none for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution isopenpronvulnerability false ispackagebased true isdefaultbranch true packages vulnerabilityidentifier cve vulnerabilitydetails a polymorphic typing issue was discovered in fasterxml jackson databind x before when default typing is enabled either globally or for a specific property for an externally exposed json endpoint the service has the mysql connector java jar or earlier in the classpath and an attacker can host a crafted mysql server reachable by 
the victim an attacker can send a crafted json message that allows them to read arbitrary local files on the server this occurs because of missing com mysql cj jdbc admin miniadmin validation vulnerabilityurl
0
33,864
7,276,569,042
IssuesEvent
2018-02-21 16:44:07
BCDevExchange/devex
https://api.github.com/repos/BCDevExchange/devex
closed
Attachments tab is not available....
defect
...in the proposal until after I hit the SAVE CHANGES button. Can we just make it so that all tabs are visible and available right from the get-go?
1.0
Attachments tab is not available.... - ...in the proposal until after I hit the SAVE CHANGES button. Can we just make it so that all tabs are visible and available right from the get-go?
non_process
attachments tab is not available in the proposal until after i hit the save changes button can we just make it so that all tabs are visible and available right from the get go
0
83
2,533,263,964
IssuesEvent
2015-01-23 21:57:18
MozillaFoundation/plan
https://api.github.com/repos/MozillaFoundation/plan
opened
Allow for Heartbeat planning (and bug triage) across repos
process
This might be impossible, but it would be awesome to solve the problem of: "I filed this issue in one repo, but now I want it to appear in the Plan. Do I have to refile it?" A good example of this is a bug like 'Build Call Congress tool': https://github.com/MozillaFoundation/movement-building/issues/4 Could we use a label, or some other marker, to pull bugs with certain tags from across repos into one central triage list – or is the best thing just to refile Meta tickets under the Plan repo? cc @simonwex @davidascher @thisandagain @OpenMatt
1.0
Allow for Heartbeat planning (and bug triage) across repos - This might be impossible, but it would be awesome to solve the problem of: "I filed this issue in one repo, but now I want it to appear in the Plan. Do I have to refile it?" A good example of this is a bug like 'Build Call Congress tool': https://github.com/MozillaFoundation/movement-building/issues/4 Could we use a label, or some other marker, to pull bugs with certain tags from across repos into one central triage list – or is the best thing just to refile Meta tickets under the Plan repo? cc @simonwex @davidascher @thisandagain @OpenMatt
process
allow for heartbeat planning and bug triage across repos this might be impossible but it would be awesome to solve the problem of i filed this issue in one repo but now i want it to appear in the plan do i have to refile it a good example of this is a bug like build call congress tool could we use a label or some other marker to pull bugs with certain tags from across repos into one central triage list – or is the best thing just to refile meta tickets under the plan repo cc simonwex davidascher thisandagain openmatt
1
7,203
10,340,319,654
IssuesEvent
2019-09-03 21:36:50
qri-io/desktop
https://api.github.com/repos/qri-io/desktop
closed
compiled binary isn't starting backend
bug main process
opening the app is bailing on "can't connect". We should also make sure log files from the compiled binary are written somewhere sane.
1.0
compiled binary isn't starting backend - opening the app is bailing on "can't connect". We should also make sure log files from the compiled binary are written somewhere sane.
process
compiled binary isn t starting backend opening the app is bailing on can t connect we should also make sure log files from complied binary are writing to somewhere sane
1
3,964
6,897,061,359
IssuesEvent
2017-11-23 22:32:54
PHPSocialNetwork/phpfastcache
https://api.github.com/repos/PHPSocialNetwork/phpfastcache
closed
Random "key does not exist on the server" messages
0_0 bug 7.0 [*_*] Debugging [-_-] In Process
### Configuration: PhpFastCache version: 7.0.0-alpha PHP version: 7.1.11 Operating system: Windows 2012 Datacentre #### Issue description: Using couchbase server as a memcached server, I'm getting rare, but random errors: Error Message: Uncaught Couchbase\Exception: LCB_KEY_ENOENT: The key does not exist on the server in W:\web\assets\plugins\phpFastCache.v7\src\phpFastCache\Drivers\Couchbase\Driver.php:128 Stack trace: #0 W:\web\assets\plugins\phpFastCache.v7\src\phpFastCache\Drivers\Couchbase\Driver.php(128): Couchbase\Bucket->remove('05f4b82707ebca6...') #1 W:\web\assets\plugins\phpFastCache.v7\src\phpFastCache\Core\Pool\CacheItemPoolTrait.php(153): phpFastCache\Drivers\Couchbase\Driver->driverDelete(Object(phpFastCache\Drivers\Couchbase\Item)) #2 W:\web\includes\library\custom.functions.php(70): phpFastCache\Drivers\Couchbase\Driver->getItem('profile_timelin...') #3 W:\web\platform\profile\pages\default.php(212): cache(Array) #4 W:\web\platform\profile\index.php(116): require('W:\\web\\platform...') #5 W:\web\platform\error\404.php(88): require('W:\\web\\platform...') #6 {main} thrown Error File: W:\web\assets\plugins\phpFastCache.v7\src\phpFastCache\Drivers\Couchbase\Driver.php Error Line: 128 By default, keys are set to expire after 600 seconds, and I'm getting around 100,000 page views a day on the server with around 10 cached requests per page - I'm going to guess it's a resource / memory issue - but should driver.php gracefully handle the non-existent key or error out as above? Thanks G!
1.0
Random "key does not exist on the server" messages - ### Configuration: PhpFastCache version: 7.0.0-alpha PHP version: 7.1.11 Operating system: Windows 2012 Datacentre #### Issue description: Using couchbase server as a memcached server, I'm getting rare, but random errors: Error Message: Uncaught Couchbase\Exception: LCB_KEY_ENOENT: The key does not exist on the server in W:\web\assets\plugins\phpFastCache.v7\src\phpFastCache\Drivers\Couchbase\Driver.php:128 Stack trace: #0 W:\web\assets\plugins\phpFastCache.v7\src\phpFastCache\Drivers\Couchbase\Driver.php(128): Couchbase\Bucket->remove('05f4b82707ebca6...') #1 W:\web\assets\plugins\phpFastCache.v7\src\phpFastCache\Core\Pool\CacheItemPoolTrait.php(153): phpFastCache\Drivers\Couchbase\Driver->driverDelete(Object(phpFastCache\Drivers\Couchbase\Item)) #2 W:\web\includes\library\custom.functions.php(70): phpFastCache\Drivers\Couchbase\Driver->getItem('profile_timelin...') #3 W:\web\platform\profile\pages\default.php(212): cache(Array) #4 W:\web\platform\profile\index.php(116): require('W:\\web\\platform...') #5 W:\web\platform\error\404.php(88): require('W:\\web\\platform...') #6 {main} thrown Error File: W:\web\assets\plugins\phpFastCache.v7\src\phpFastCache\Drivers\Couchbase\Driver.php Error Line: 128 By default, keys are set to expire after 600 seconds, and I'm getting around 100,000 page views a day on the server with around 10 cached requests per page - I'm going to guess it's a resource / memory issue - but should driver.php gracefully handle the non-existent key or error out as above? Thanks G!
process
random key does not exist on the server messages configuration phpfastcache version alpha php version operating system windows datacentre issue description using couchbase server as a memcached server i m getting rare but random errors error message uncaught couchbase exception lcb key enoent the key does not exist on the server in w web assets plugins phpfastcache src phpfastcache drivers couchbase driver php stack trace w web assets plugins phpfastcache src phpfastcache drivers couchbase driver php couchbase bucket remove w web assets plugins phpfastcache src phpfastcache core pool cacheitempooltrait php phpfastcache drivers couchbase driver driverdelete object phpfastcache drivers couchbase item w web includes library custom functions php phpfastcache drivers couchbase driver getitem profile timelin w web platform profile pages default php cache array w web platform profile index php require w web platform w web platform error php require w web platform main thrown error file w web assets plugins phpfastcache src phpfastcache drivers couchbase driver php error line by default keys are set to expire after seconds and i m getting around page views a day on the server with around cached requests per page i m going to guess it s a resource memory issue but should driver php gracefully handle the non existent key or error out as above thanks g
1
586,555
17,580,633,329
IssuesEvent
2021-08-16 06:53:26
lewisjwilson/kmj
https://api.github.com/repos/lewisjwilson/kmj
opened
Refactor List Names Storage
bug Medium Priority
Rather than storing in sharedprefs, store in a new db table:

```
CREATE TABLE lists (
    id INTEGER PRIMARY KEY AUTOINCREMENT,
    list_name TEXT,
    FOREIGN KEY (id) REFERENCES jisho_data (list)
)
```

To display matching list results:

```
SELECT jisho_data.*
FROM jisho_data
INNER JOIN lists
    ON lists.id = jisho_data.list
    AND lists.id = ${user_selected_list_id}
```
1.0
Refactor List Names Storage - Rather than storing in sharedprefs, store in a new db table: ``` CREATE TABLE lists (id INTEGER PRIMARY KEY AUTOINCREMENT, list_name TEXT, FOREIGN KEY (id) REFERENCES jisho_data (list)) ``` To display matching list results: ``` SELECT jisho_data.* FROM jisho_data INNER JOIN lists ON lists.id = jisho_data.list AND lists.id = ${user_selected_list_id} ```
non_process
refactor list names storage rather than storing in sharedprefs store in a new db table create table lists id integer primary key autoincrement list name text foreign key id references jisho data list to display matching list results select jisho data from jisho data inner join lists on lists id jisho data list and lists id user selected list id
0
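The schema and join proposed in the record above can be exercised end to end with Python's stdlib sqlite3 module. Table and column names come from the issue; the sample data, and placing the foreign key on jisho_data.list referencing lists.id (the usual direction for this relationship), are illustrative assumptions rather than the issue's exact DDL:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE lists (
    id INTEGER PRIMARY KEY AUTOINCREMENT,
    list_name TEXT
);
CREATE TABLE jisho_data (
    id INTEGER PRIMARY KEY AUTOINCREMENT,
    word TEXT,                          -- hypothetical payload column
    list INTEGER REFERENCES lists (id)  -- FK direction assumed
);
""")

# Hypothetical sample data: one list containing two entries.
conn.execute("INSERT INTO lists (list_name) VALUES ('sample list')")
conn.execute("INSERT INTO jisho_data (word, list) VALUES ('neko', 1)")
conn.execute("INSERT INTO jisho_data (word, list) VALUES ('inu', 1)")

# The join from the issue, with the user-selected id bound as a parameter.
user_selected_list_id = 1
rows = conn.execute(
    """SELECT jisho_data.* FROM jisho_data
       INNER JOIN lists ON lists.id = jisho_data.list
       AND lists.id = ?""",
    (user_selected_list_id,),
).fetchall()
print(len(rows))  # → 2
```

Binding `user_selected_list_id` as a `?` parameter (rather than interpolating `${user_selected_list_id}` into the SQL string) also avoids SQL injection.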
20,849
27,626,935,359
IssuesEvent
2023-03-10 07:45:20
zotero/zotero
https://api.github.com/repos/zotero/zotero
opened
Support converting Mendeley Cite citations in Word
Word Processor Integration
I haven't checked if the APIs are available for Mac Word. They definitely are for Windows. Either way, this would require prompting users that Mendeley Cite citations have been detected and Zotero can import/convert them, but that they need to make a backup copy, because they won't be able to work with Mendeley Cite again.
1.0
Support converting Mendeley Cite citations in Word - I haven't checked if the APIs are available for Mac Word. They definitely are for Windows. Either way, this would require prompting users that Mendeley Cite citations have been detected and Zotero can import/convert them, but that they need to make a backup copy, because they won't be able to work with Mendeley Cite again.
process
support converting mendeley cite citations in word i haven t checked if the apis are available for mac word they definitely are for windows either way this would require prompting users that mendeley cite citations have been detected and zotero can import convert them but that they need to make a backup copy because they won t be able to work with mendeley cite again
1
20,383
27,039,564,686
IssuesEvent
2023-02-13 03:17:05
open-telemetry/opentelemetry-collector
https://api.github.com/repos/open-telemetry/opentelemetry-collector
closed
Supporting Key and a List of Values in the Attribute Processor
enhancement Stale priority:p3 spec:trace release:after-ga area:processor
Support for key- and list-based values in the attribute processor. Use case: to support a list of homogeneous values under a single key. Example configuration (otel-collector-config):

```
attributes/priority_sampling:
  include:
    match_type: strict
    attributes:
      - key: http.url
        value:
          - 'http://productcatalogservice/GetProducts'
          - 'http://recommendationservice/GetRecommendations'
          - 'http://cartservice/GetCart'
  actions:
    - key: sampling.priority
      value: 1
```

All the values under http.url should get a sampling.priority value of 1. Instead, the current implementation of OTEL does not support this feature and triggers a bug. Steps to reproduce:

1. Add the otel collector config
2. Execute the demo app

Config file: `otel-collector-config.yaml`

```
receivers:
  opencensus:

exporters:
  prometheus:
    endpoint: "0.0.0.0:8889"
    namespace: promexample
    const_labels:
      label1: value1
  logging:
  zipkin:
    endpoint: "http://zipkin-all-in-one:9411/api/v2/spans"
    format: proto
  jaeger:
    endpoint: jaeger-all-in-one:14250
    insecure: true

processors:
  batch:
  queued_retry:
  attributes/priority_sampling:
    include:
      match_type: strict
      attributes:
        - key: http.url
          value:
            - 'http://productcatalogservice/GetProducts'
            - 'http://recommendationservice/GetRecommendations'
            - 'http://cartservice/GetCart'
    actions:
      - key: sampling.priority
        value: 1
        action: upsert

extensions:
  health_check:
  pprof:
    endpoint: :1888
  zpages:
    endpoint: :55679

service:
  extensions: [pprof, zpages, health_check]
  pipelines:
    traces:
      receivers: [opencensus]
      processors: [batch, queued_retry, attributes/priority_sampling]
      exporters: [logging, zipkin, jaeger]
    metrics:
      receivers: [opencensus]
      processors: [batch]
      exporters: [logging, prometheus]
```

```
Error: cannot setup pipelines: cannot build pipelines: error creating processor "attributes/priority_sampling" in pipeline "traces": error creating attribute filters: error unsupported value type "[]interface {}"
```
![image](https://user-images.githubusercontent.com/54934534/95706073-75ad5c00-0c73-11eb-99ac-12f7eb24d478.png) ![image](https://user-images.githubusercontent.com/54934534/95706084-7e9e2d80-0c73-11eb-92ff-5a6c8478695e.png) PR will be added for supporting and resolving the same. This will enable list of values (homogeneous) under a key for attribute processors.
1.0
Supporting Key and a List of Values in the Attribute Processor - Support for key and list based values in the attribute processor. Use case: To support list of homogeneous values under a single key. Example Configuration(otel-collector-config): ``` attributes/priority_sampling: include: match_type: strict attributes: - key: http.url value: - 'http://productcatalogservice/GetProducts' - 'http://recommendationservice/GetRecommendations' - 'http://cartservice/GetCart' actions: - key: sampling.priority value: 1 ``` All the values under http.url should have a sampling.priority value as 1. Instead, current implementation of OTEL is not supporting this feature and triggers a bug. Steps to produce: 1. Add otel collector config 2. Execute the demo app Config file: - `otel-collector-config.yaml` ``` receivers: opencensus: exporters: prometheus: endpoint: "0.0.0.0:8889" namespace: promexample const_labels: label1: value1 logging: zipkin: endpoint: "http://zipkin-all-in-one:9411/api/v2/spans" format: proto jaeger: endpoint: jaeger-all-in-one:14250 insecure: true processors: batch: queued_retry: attributes/priority_sampling: include: match_type: strict attributes: - key: http.url value: - 'http://productcatalogservice/GetProducts' - 'http://recommendationservice/GetRecommendations' - 'http://cartservice/GetCart' actions: - key: sampling.priority value: 1 action: upsert extensions: health_check: pprof: endpoint: :1888 zpages: endpoint: :55679 service: extensions: [pprof, zpages, health_check] pipelines: traces: receivers: [opencensus] processors: [batch, queued_retry, attributes/priority_sampling] exporters: [logging, zipkin, jaeger] metrics: receivers: [opencensus] processors: [batch] exporters: [logging,prometheus] ``` ``` Error: cannot setup pipelines: cannot build pipelines: error creating processor "attributes/priority_sampling" in pipeline "traces": error creating attribute filters: error unsupported value type "[]interface {}" ``` 
![image](https://user-images.githubusercontent.com/54934534/95706073-75ad5c00-0c73-11eb-99ac-12f7eb24d478.png) ![image](https://user-images.githubusercontent.com/54934534/95706084-7e9e2d80-0c73-11eb-92ff-5a6c8478695e.png) PR will be added for supporting and resolving the same. This will enable list of values (homogeneous) under a key for attribute processors.
process
supporting key and a list of values in the attribute processor support for key and list based values in the attribute processor use case to support list of homogeneous values under a single key example configuration otel collector config attributes priority sampling include match type strict attributes key http url value actions key sampling priority value all the values under http url should have a sampling priority value as instead current implementation of otel is not supporting this feature and triggers a bug steps to produce add otel collector config execute the demo app config file otel collector config yaml receivers opencensus exporters prometheus endpoint namespace promexample const labels logging zipkin endpoint format proto jaeger endpoint jaeger all in one insecure true processors batch queued retry attributes priority sampling include match type strict attributes key http url value actions key sampling priority value action upsert extensions health check pprof endpoint zpages endpoint service extensions pipelines traces receivers processors exporters metrics receivers processors exporters error cannot setup pipelines cannot build pipelines error creating processor attributes priority sampling in pipeline traces error creating attribute filters error unsupported value type interface pr will be added for supporting and resolving the same this will enable list of values homogeneous under a key for attribute processors
1
360,505
10,693,569,888
IssuesEvent
2019-10-23 09:06:21
sofittech/palcare-project
https://api.github.com/repos/sofittech/palcare-project
opened
Citizen Signup Glitch: (see details and video)
Citizen Hi Priority Web iOS
Citizen Sign up -> Successful Verification -> ACD -> Next arrow -> **Glitchh: Shows log in Screen** and then shows home screen again. Video Link: https://we.tl/t-BGTHd72mZ4
1.0
Citizen Signup Glitch: (see details and video) - Citizen Sign up -> Successful Verification -> ACD -> Next arrow -> **Glitchh: Shows log in Screen** and then shows home screen again. Video Link: https://we.tl/t-BGTHd72mZ4
non_process
citizen signup glitch see details and video citizen sign up successful verification acd next arrow glitchh shows log in screen and then shows home screen again video link
0
243,336
18,683,229,851
IssuesEvent
2021-11-01 09:06:01
Picovoice/picovoice
https://api.github.com/repos/Picovoice/picovoice
opened
Picovoice Documentation Issue: Limitations of Personal Acounts
documentation
### What is the URL of the doc? https://picovoice.ai/pricing/ https://github.com/Picovoice/rhino/issues/65 ### What's the nature of the issue? I have a non-commercial project in which I am tasked with creating a working product which will be used by individuals. The product I'm tasked to create is a mechanical hand using a raspberry pi that will open and close once someone says "open" or "close". Before the speech command there should be a 'wake word' whose task is to trigger the speech recognition process, this is needed in order to reduce the amount of false positives and reduce the CPU usage which will reduce battery consumption. Thus, your software was great for my needs. Nevertheless, the personal license which is for non-commercial purposes has a few limitation which I wasn't aware of only after spending a day of work on making your product work: 1. Apparently, you only have a 30 day expiration date on rihno and porcupine files generated via the console. 2. The porcupine only allows for popular operating systems such as, linux, mac and windows but is restricted for less common operating systems such as that of the raspberry pi. These limitations (and there might be others I'm not aware of), disallow me to use your software for my project to my sorrow. If you could provide much more clarity on this topic upfront it would waste much less time and energy, and hopefully remove these restrictions for non-commercial usages.
1.0
Picovoice Documentation Issue: Limitations of Personal Acounts - ### What is the URL of the doc? https://picovoice.ai/pricing/ https://github.com/Picovoice/rhino/issues/65 ### What's the nature of the issue? I have a non-commercial project in which I am tasked with creating a working product which will be used by individuals. The product I'm tasked to create is a mechanical hand using a raspberry pi that will open and close once someone says "open" or "close". Before the speech command there should be a 'wake word' whose task is to trigger the speech recognition process, this is needed in order to reduce the amount of false positives and reduce the CPU usage which will reduce battery consumption. Thus, your software was great for my needs. Nevertheless, the personal license which is for non-commercial purposes has a few limitation which I wasn't aware of only after spending a day of work on making your product work: 1. Apparently, you only have a 30 day expiration date on rihno and porcupine files generated via the console. 2. The porcupine only allows for popular operating systems such as, linux, mac and windows but is restricted for less common operating systems such as that of the raspberry pi. These limitations (and there might be others I'm not aware of), disallow me to use your software for my project to my sorrow. If you could provide much more clarity on this topic upfront it would waste much less time and energy, and hopefully remove these restrictions for non-commercial usages.
non_process
picovoice documentation issue limitations of personal acounts what is the url of the doc what s the nature of the issue i have a non commercial project in which i am tasked with creating a working product which will be used by individuals the product i m tasked to create is a mechanical hand using a raspberry pi that will open and close once someone says open or close before the speech command there should be a wake word whose task is to trigger the speech recognition process this is needed in order to reduce the amount of false positives and reduce the cpu usage which will reduce battery consumption thus your software was great for my needs nevertheless the personal license which is for non commercial purposes has a few limitation which i wasn t aware of only after spending a day of work on making your product work apparently you only have a day expiration date on rihno and porcupine files generated via the console the porcupine only allows for popular operating systems such as linux mac and windows but is restricted for less common operating systems such as that of the raspberry pi these limitations and there might be others i m not aware of disallow me to use your software for my project to my sorrow if you could provide much more clarity on this topic upfront it would waste much less time and energy and hopefully remove these restrictions for non commercial usages
0
9,289
12,305,900,323
IssuesEvent
2020-05-11 23:48:46
medic/cht-core
https://api.github.com/repos/medic/cht-core
opened
Release 3.10.0
Type: Internal process
# Planning - [ ] Create an [organisation wide project](https://github.com/orgs/medic/projects?query=is%3Aopen+sort%3Aname-asc) and add this issue to it. We use [semver](http://semver.org) so if there are breaking changes increment the major, otherwise if there are new features increment the minor, otherwise increment the service pack. Breaking changes in our case relate to updated software requirements (egs: CouchDB, node, minimum browser versions), broken backwards compatibility in an api, or a major visual update that requires user retraining. - [ ] Add all the issues to be worked on to the project. Ideally each minor release will have one or two features, a handful of improvements, and plenty of bugs. # Development When development is ready to begin one of the engineers should be nominated as a Release Manager. They will be responsible for making sure the following tasks are completed though not necessarily completing them. - [ ] Set the version number in `package.json` and `package-lock.json` and submit a PR. The easiest way to do this is to use `npm --no-git-tag-version version <major|minor>`. - [ ] Raise a new issue called `Update dependencies for <version>` with a description that links to [the documentation](https://github.com/medic/medic-docs/blob/master/development/update-dependencies.md). This should be done early in the release cycle so find a volunteer to take this on and assign it to them. - [ ] Write an update in the weekly Product Team call agenda summarising development and acceptance testing progress and identifying any blockers. The release manager is to update this every week until the version is released. # Releasing Once all issues have passed acceptance testing and have been merged into `master` release testing can begin. - [ ] Create a new release branch from `master` named `<major>.<minor>.x` in medic. Post a message to #development using this template: ``` @core_devs I've just created the `<major>.<minor>.x` release branch. 
Please be aware that any further changes intended for this release will have to be merged to `master` then backported. Thanks! ``` - [ ] Build a beta named `<major>.<minor>.<patch>-beta.1` by pushing a git tag and when CI completes successfully notify the QA team that it's ready for release testing. - [ ] [Import translations keys](https://github.com/medic/medic-docs/blob/master/development/translations.md#adding-new-keys) into POE and notify the #translations Slack channel translate new and updated values, for example: ``` @channel I've just updated the translations in POE. These keys have been added: "<added-list>", and these keys have been updated: "<updated-list>" ``` - [ ] Create a new document in the [release-notes folder](https://github.com/medic/medic/tree/master/release-notes) in `master`. Ensure all issues are in the GH Project, that they're correct labelled, and have human readable descriptions. Use [this script](https://github.com/medic/medic/blob/master/scripts/changelog-generator) to export the issues into our changelog format. Manually document any known migration steps and known issues. Provide description, screenshots, videos, and anything else to help communicate particularly important changes. Document any required or recommended upgrades to our other products (eg: medic-conf, medic-gateway, medic-android). Assign the PR to a) the Director of Technology, and b) an SRE to review and confirm the documentation on upgrade instructions and breaking changes is sufficient. - [ ] Create a Google Doc in the [blog posts folder](https://drive.google.com/drive/u/0/folders/0B2PTUNZFwxEvMHRWNTBjY2ZHNHc) with the draft of a blog post promoting the release based on the release notes above. Once it's ready ask Alix to review it. - [ ] Until release testing passes, make sure regressions are fixed in `master`, cherry-pick them into the release branch, and release another beta. 
- [ ] [Export the translations](https://github.com/medic/medic-docs/blob/master/development/translations.md#exporting-changes-from-poeditor-to-github), delete empty translation files and commit to `master`. Cherry-pick the commit into the release branch. - [ ] Create a release in GitHub from the release branch so it shows up under the [Releases tab](https://github.com/medic/medic/releases) with the naming convention `<major>.<minor>.<patch>`. This will create the git tag automatically. Link to the release notes in the description of the release. - [ ] Confirm the release build completes successfully and the new release is available on the [market](https://staging.dev.medicmobile.org/builds/releases). Make sure that the document has new entry with `id: medic:medic:<major>.<minor>.<patch>` - [ ] Follow the instructions for [releasing other products](https://github.com/medic/medic-docs/blob/master/development/releasing.md) that have been updated in this project (eg: medic-conf, medic-gateway, medic-android). - [ ] Add the release to the [Supported versions](https://github.com/medic/medic-docs/blob/master/installation/supported-software.md#supported-versions) and update the EOL date and status of previous releases. - [ ] Announce the release in #products and #cht-contributors using this template: ``` @channel *We're excited to announce the release of {{version}}* New features include {{key_features}}. We've also implemented loads of other improvements and fixed a heap of bugs. Read the release notes for full details: {{url}} Following our support policy, versions {{versions}} are no longer supported. Projects running these versions should start planning to upgrade in the near future. 
For more details read our software support documentation: https://github.com/medic/medic-docs/blob/master/installation/supported-software.md#supported-versions To see what's scheduled for the next releases have a read of the product roadmap: https://github.com/orgs/medic/projects?query=is%3Aopen+sort%3Aname-asc ``` - [ ] Announce the release on the [CHT forum](https://forum.communityhealthtoolkit.org/), under the Development category. You can use the previous message and omit `@channel`. - [ ] Mark this issue "done" and close the project.
1.0
Release 3.10.0 - # Planning - [ ] Create an [organisation wide project](https://github.com/orgs/medic/projects?query=is%3Aopen+sort%3Aname-asc) and add this issue to it. We use [semver](http://semver.org) so if there are breaking changes increment the major, otherwise if there are new features increment the minor, otherwise increment the service pack. Breaking changes in our case relate to updated software requirements (egs: CouchDB, node, minimum browser versions), broken backwards compatibility in an api, or a major visual update that requires user retraining. - [ ] Add all the issues to be worked on to the project. Ideally each minor release will have one or two features, a handful of improvements, and plenty of bugs. # Development When development is ready to begin one of the engineers should be nominated as a Release Manager. They will be responsible for making sure the following tasks are completed though not necessarily completing them. - [ ] Set the version number in `package.json` and `package-lock.json` and submit a PR. The easiest way to do this is to use `npm --no-git-tag-version version <major|minor>`. - [ ] Raise a new issue called `Update dependencies for <version>` with a description that links to [the documentation](https://github.com/medic/medic-docs/blob/master/development/update-dependencies.md). This should be done early in the release cycle so find a volunteer to take this on and assign it to them. - [ ] Write an update in the weekly Product Team call agenda summarising development and acceptance testing progress and identifying any blockers. The release manager is to update this every week until the version is released. # Releasing Once all issues have passed acceptance testing and have been merged into `master` release testing can begin. - [ ] Create a new release branch from `master` named `<major>.<minor>.x` in medic. 
Post a message to #development using this template: ``` @core_devs I've just created the `<major>.<minor>.x` release branch. Please be aware that any further changes intended for this release will have to be merged to `master` then backported. Thanks! ``` - [ ] Build a beta named `<major>.<minor>.<patch>-beta.1` by pushing a git tag and when CI completes successfully notify the QA team that it's ready for release testing. - [ ] [Import translations keys](https://github.com/medic/medic-docs/blob/master/development/translations.md#adding-new-keys) into POE and notify the #translations Slack channel translate new and updated values, for example: ``` @channel I've just updated the translations in POE. These keys have been added: "<added-list>", and these keys have been updated: "<updated-list>" ``` - [ ] Create a new document in the [release-notes folder](https://github.com/medic/medic/tree/master/release-notes) in `master`. Ensure all issues are in the GH Project, that they're correct labelled, and have human readable descriptions. Use [this script](https://github.com/medic/medic/blob/master/scripts/changelog-generator) to export the issues into our changelog format. Manually document any known migration steps and known issues. Provide description, screenshots, videos, and anything else to help communicate particularly important changes. Document any required or recommended upgrades to our other products (eg: medic-conf, medic-gateway, medic-android). Assign the PR to a) the Director of Technology, and b) an SRE to review and confirm the documentation on upgrade instructions and breaking changes is sufficient. - [ ] Create a Google Doc in the [blog posts folder](https://drive.google.com/drive/u/0/folders/0B2PTUNZFwxEvMHRWNTBjY2ZHNHc) with the draft of a blog post promoting the release based on the release notes above. Once it's ready ask Alix to review it. 
- [ ] Until release testing passes, make sure regressions are fixed in `master`, cherry-pick them into the release branch, and release another beta. - [ ] [Export the translations](https://github.com/medic/medic-docs/blob/master/development/translations.md#exporting-changes-from-poeditor-to-github), delete empty translation files and commit to `master`. Cherry-pick the commit into the release branch. - [ ] Create a release in GitHub from the release branch so it shows up under the [Releases tab](https://github.com/medic/medic/releases) with the naming convention `<major>.<minor>.<patch>`. This will create the git tag automatically. Link to the release notes in the description of the release. - [ ] Confirm the release build completes successfully and the new release is available on the [market](https://staging.dev.medicmobile.org/builds/releases). Make sure that the document has new entry with `id: medic:medic:<major>.<minor>.<patch>` - [ ] Follow the instructions for [releasing other products](https://github.com/medic/medic-docs/blob/master/development/releasing.md) that have been updated in this project (eg: medic-conf, medic-gateway, medic-android). - [ ] Add the release to the [Supported versions](https://github.com/medic/medic-docs/blob/master/installation/supported-software.md#supported-versions) and update the EOL date and status of previous releases. - [ ] Announce the release in #products and #cht-contributors using this template: ``` @channel *We're excited to announce the release of {{version}}* New features include {{key_features}}. We've also implemented loads of other improvements and fixed a heap of bugs. Read the release notes for full details: {{url}} Following our support policy, versions {{versions}} are no longer supported. Projects running these versions should start planning to upgrade in the near future. 
For more details read our software support documentation: https://github.com/medic/medic-docs/blob/master/installation/supported-software.md#supported-versions To see what's scheduled for the next releases have a read of the product roadmap: https://github.com/orgs/medic/projects?query=is%3Aopen+sort%3Aname-asc ``` - [ ] Announce the release on the [CHT forum](https://forum.communityhealthtoolkit.org/), under the Development category. You can use the previous message and omit `@channel`. - [ ] Mark this issue "done" and close the project.
process
release planning create an and add this issue to it we use so if there are breaking changes increment the major otherwise if there are new features increment the minor otherwise increment the service pack breaking changes in our case relate to updated software requirements egs couchdb node minimum browser versions broken backwards compatibility in an api or a major visual update that requires user retraining add all the issues to be worked on to the project ideally each minor release will have one or two features a handful of improvements and plenty of bugs development when development is ready to begin one of the engineers should be nominated as a release manager they will be responsible for making sure the following tasks are completed though not necessarily completing them set the version number in package json and package lock json and submit a pr the easiest way to do this is to use npm no git tag version version raise a new issue called update dependencies for with a description that links to this should be done early in the release cycle so find a volunteer to take this on and assign it to them write an update in the weekly product team call agenda summarising development and acceptance testing progress and identifying any blockers the release manager is to update this every week until the version is released releasing once all issues have passed acceptance testing and have been merged into master release testing can begin create a new release branch from master named x in medic post a message to development using this template core devs i ve just created the x release branch please be aware that any further changes intended for this release will have to be merged to master then backported thanks build a beta named beta by pushing a git tag and when ci completes successfully notify the qa team that it s ready for release testing into poe and notify the translations slack channel translate new and updated values for example channel i ve just updated the 
translations in poe these keys have been added and these keys have been updated create a new document in the in master ensure all issues are in the gh project that they re correct labelled and have human readable descriptions use to export the issues into our changelog format manually document any known migration steps and known issues provide description screenshots videos and anything else to help communicate particularly important changes document any required or recommended upgrades to our other products eg medic conf medic gateway medic android assign the pr to a the director of technology and b an sre to review and confirm the documentation on upgrade instructions and breaking changes is sufficient create a google doc in the with the draft of a blog post promoting the release based on the release notes above once it s ready ask alix to review it until release testing passes make sure regressions are fixed in master cherry pick them into the release branch and release another beta delete empty translation files and commit to master cherry pick the commit into the release branch create a release in github from the release branch so it shows up under the with the naming convention this will create the git tag automatically link to the release notes in the description of the release confirm the release build completes successfully and the new release is available on the make sure that the document has new entry with id medic medic follow the instructions for that have been updated in this project eg medic conf medic gateway medic android add the release to the and update the eol date and status of previous releases announce the release in products and cht contributors using this template channel we re excited to announce the release of version new features include key features we ve also implemented loads of other improvements and fixed a heap of bugs read the release notes for full details url following our support policy versions versions are no longer 
supported projects running these versions should start planning to upgrade in the near future for more details read our software support documentation to see what s scheduled for the next releases have a read of the product roadmap announce the release on the under the development category you can use the previous message and omit channel mark this issue done and close the project
1
15,465
19,680,776,441
IssuesEvent
2022-01-11 16:33:04
cypress-io/cypress
https://api.github.com/repos/cypress-io/cypress
closed
Unification App: Write E2E tests around "Top Nav"
stage: pending release process: tests type: chore
### What would you like? Write end-to-end tests to cover the new Unification work in 10.0-release branch for "[Top Nav](https://docs.google.com/spreadsheets/d/1iPwi89aW6aYeA0VT1XOhYdAWLuScW0okrlfcL9fzh3s/edit#gid=0)" in the App. ### Why is this needed? _No response_ ### Other _No response_
1.0
Unification App: Write E2E tests around "Top Nav" - ### What would you like? Write end-to-end tests to cover the new Unification work in 10.0-release branch for "[Top Nav](https://docs.google.com/spreadsheets/d/1iPwi89aW6aYeA0VT1XOhYdAWLuScW0okrlfcL9fzh3s/edit#gid=0)" in the App. ### Why is this needed? _No response_ ### Other _No response_
process
unification app write tests around top nav what would you like write end to end tests to cover the new unification work in release branch for in the app why is this needed no response other no response
1
461,883
13,237,739,196
IssuesEvent
2020-08-18 22:21:46
near/near-explorer
https://api.github.com/repos/near/near-explorer
closed
Viewing transactions on an account page shows wrong account when scrolling
Priority 0 bug
I was here looking for transactions on my account: https://explorer.testnet.near.org/accounts/oracle-node.p.mike.testnet and as I scrolled down I started seeing transactions from another account that was not mine. Something seems to be wrong with the paging/infinite-scroll. <img width="790" alt="Screen Shot 2020-08-17 at 9 55 27 PM" src="https://user-images.githubusercontent.com/1042667/90472146-b30ec680-e0d4-11ea-8e96-775834d9b268.png">
1.0
Viewing transactions on an account page shows wrong account when scrolling - I was here looking for transactions on my account: https://explorer.testnet.near.org/accounts/oracle-node.p.mike.testnet and as I scrolled down I started seeing transactions from another account that was not mine. Something seems to be wrong with the paging/infinite-scroll. <img width="790" alt="Screen Shot 2020-08-17 at 9 55 27 PM" src="https://user-images.githubusercontent.com/1042667/90472146-b30ec680-e0d4-11ea-8e96-775834d9b268.png">
non_process
viewing transactions on an account page shows wrong account when scrolling i was here looking for transactions on my account and as i scrolled down i started seeing transactions from another account that was not mine something seems to be wrong with the paging infinite scroll img width alt screen shot at pm src
0
5,490
8,359,966,156
IssuesEvent
2018-10-03 09:57:26
aiidateam/aiida_core
https://api.github.com/repos/aiidateam/aiida_core
closed
Requirement of wallclock
requires discussion topic/JobCalculationAndProcess
There is now a requirement of supplying `max_wallclock_seconds`. This might make sense for some cluster, others not. Also, it now also wants this for typical local runs (e.g. for tests). Maybe we at least can remove this requirement for the local scheduler?
1.0
Requirement of wallclock - There is now a requirement of supplying `max_wallclock_seconds`. This might make sense for some cluster, others not. Also, it now also wants this for typical local runs (e.g. for tests). Maybe we at least can remove this requirement for the local scheduler?
process
requirement of wallclock there is now a requirement of supplying max wallclock seconds this might make sense for some cluster others not also it now also wants this for typical local runs e g for tests maybe we at least can remove this requirement for the local scheduler
1
13,836
8,378,737,374
IssuesEvent
2018-10-06 17:18:59
AndroidHardening/hardened_malloc
https://api.github.com/repos/AndroidHardening/hardened_malloc
closed
implement in-place growth for large allocations
enhancement performance
It will add an extra system call with synchronization when it fails, so it isn't necessarily a good idea at the smallest sizes if it almost always fails in practice. The Linux mmap heap uses best-fit and grows downwards which tends to eliminate common cases where this can actually work. It's also unclear if in-place growth is a neat positive or negative for security, similar to in-place shrinking. It's probably best to focus only on the performance aspects of this for now because there are too many variables when it comes to security. A large virtual memory quarantine could potentially turn avoiding in-place growth into a security feature so there may end up being an option to disable the optimizations.
True
implement in-place growth for large allocations - It will add an extra system call with synchronization when it fails, so it isn't necessarily a good idea at the smallest sizes if it almost always fails in practice. The Linux mmap heap uses best-fit and grows downwards which tends to eliminate common cases where this can actually work. It's also unclear if in-place growth is a neat positive or negative for security, similar to in-place shrinking. It's probably best to focus only on the performance aspects of this for now because there are too many variables when it comes to security. A large virtual memory quarantine could potentially turn avoiding in-place growth into a security feature so there may end up being an option to disable the optimizations.
non_process
implement in place growth for large allocations it will add an extra system call with synchronization when it fails so it isn t necessarily a good idea at the smallest sizes if it almost always fails in practice the linux mmap heap uses best fit and grows downwards which tends to eliminate common cases where this can actually work it s also unclear if in place growth is a neat positive or negative for security similar to in place shrinking it s probably best to focus only on the performance aspects of this for now because there are too many variables when it comes to security a large virtual memory quarantine could potentially turn avoiding in place growth into a security feature so there may end up being an option to disable the optimizations
0
18,575
24,556,403,242
IssuesEvent
2022-10-12 16:12:34
GoogleCloudPlatform/fda-mystudies
https://api.github.com/repos/GoogleCloudPlatform/fda-mystudies
closed
[Android] Eligibilty screens are getting displayed in the mobile app in the following scenario
Bug P0 Android Process: Fixed Process: Tested QA Process: Tested dev
**Steps:** 1. In SB, Edit the study which was enrolled by the participant in the mobile app 2. Update 'Enforce e-consent flow' again for enrolled participants and Publish the study 3. Now go back to mobile app 4. Click on 'Get started' button in the mobile app 5. Click on the study which was already enrolled and made changes from the SB 5. Click on 'Participate' button and Sign in to the app 6. Click on 'Exit' button in the consent flow and Verify **AR:** Eligibilty screens are getting displayed in the mobile app in the following scenario **ER:** Eligibilty screen should not get displayed in the mobile app in the following scenario, only updated consent flow should get displayed
3.0
[Android] Eligibilty screens are getting displayed in the mobile app in the following scenario - **Steps:** 1. In SB, Edit the study which was enrolled by the participant in the mobile app 2. Update 'Enforce e-consent flow' again for enrolled participants and Publish the study 3. Now go back to mobile app 4. Click on 'Get started' button in the mobile app 5. Click on the study which was already enrolled and made changes from the SB 5. Click on 'Participate' button and Sign in to the app 6. Click on 'Exit' button in the consent flow and Verify **AR:** Eligibilty screens are getting displayed in the mobile app in the following scenario **ER:** Eligibilty screen should not get displayed in the mobile app in the following scenario, only updated consent flow should get displayed
process
eligibilty screens are getting displayed in the mobile app in the following scenario steps in sb edit the study which was enrolled by the participant in the mobile app update enforce e consent flow again for enrolled participants and publish the study now go back to mobile app click on get started button in the mobile app click on the study which was already enrolled and made changes from the sb click on participate button and sign in to the app click on exit button in the consent flow and verify ar eligibilty screens are getting displayed in the mobile app in the following scenario er eligibilty screen should not get displayed in the mobile app in the following scenario only updated consent flow should get displayed
1
11,831
14,655,316,581
IssuesEvent
2020-12-28 10:43:36
GoogleCloudPlatform/fda-mystudies
https://api.github.com/repos/GoogleCloudPlatform/fda-mystudies
closed
[Mobile Apps] Time of day response attribute > Value for time of day is inconsistent in Android in response server
Android Bug P2 Process: Release 2 iOS
For 'Time of day' response attributes, the value is showing inconsistently in Android and iOS. Android: ![Android_Time](https://user-images.githubusercontent.com/60386291/96561205-e2140500-12dc-11eb-86b4-cd880cbeb3ba.png) iOS: ![iOS_Timeofday](https://user-images.githubusercontent.com/60386291/96561236-ea6c4000-12dc-11eb-9269-00fe4a9c7d2d.png)
1.0
[Mobile Apps] Time of day response attribute > Value for time of day is inconsistent in Android in response server - For 'Time of day' response attributes, the value is showing inconsistently in Android and iOS. Android: ![Android_Time](https://user-images.githubusercontent.com/60386291/96561205-e2140500-12dc-11eb-86b4-cd880cbeb3ba.png) iOS: ![iOS_Timeofday](https://user-images.githubusercontent.com/60386291/96561236-ea6c4000-12dc-11eb-9269-00fe4a9c7d2d.png)
process
time of day response attribute value for time of day is inconsistent in android in response server for time of day response attributes value is showing inconsistently in android and ios android ios
1
14,301
17,289,231,961
IssuesEvent
2021-07-24 11:17:21
metabase/metabase
https://api.github.com/repos/metabase/metabase
opened
Pivot Table errors if question is sorted by a breakout column
Priority:P2 Querying/Processor Type:Bug Visualization/Tables
**Describe the bug** Pivot Table errors if question is sorted by a breakout/group-by column. It works if grouping by aggregation/metric. Workaround is to not have sorting on the initial question, but apply sorting in the visualization settings sidebar for Pivot Table. **To Reproduce** 1. Custom question > Sample Dataset > Products 2. Summarize Count by Category, and Sort by Category - visualize ![image](https://user-images.githubusercontent.com/1447303/126866647-da335758-f38c-446e-8a42-ef8e8fa68dfb.png) 3. Visualization > select Pivot Table - query fails with `Column """source"".CATEGORY" must be in the GROUP BY list` (or `ERROR: column "source.name" must appear in the GROUP BY clause or be used in an aggregate function` depending on database type) <details><summary>Full stacktrace</summary> ``` 2021-07-24 13:08:36,734 ERROR middleware.catch-exceptions :: Error processing query: null {:database_id 4, :started_at #t "2021-07-24T13:08:34.653521+02:00[Europe/Copenhagen]", :via [{:status :failed, :class clojure.lang.ExceptionInfo, :error "Error executing query", :stacktrace ["--> driver.sql_jdbc.execute$execute_reducible_query$fn__80802.invoke(execute.clj:480)" "driver.sql_jdbc.execute$execute_reducible_query.invokeStatic(execute.clj:477)" "driver.sql_jdbc.execute$execute_reducible_query.invoke(execute.clj:463)" "driver.sql_jdbc.execute$execute_reducible_query.invokeStatic(execute.clj:472)" "driver.sql_jdbc.execute$execute_reducible_query.invoke(execute.clj:463)" "driver.sql_jdbc$fn__82291.invokeStatic(sql_jdbc.clj:54)" "driver.sql_jdbc$fn__82291.invoke(sql_jdbc.clj:52)" "driver.h2$fn__81043.invokeStatic(h2.clj:83)" "driver.h2$fn__81043.invoke(h2.clj:80)" "query_processor.context$executef.invokeStatic(context.clj:59)" "query_processor.context$executef.invoke(context.clj:48)" "query_processor.context.default$default_runf.invokeStatic(default.clj:68)" "query_processor.context.default$default_runf.invoke(default.clj:66)" 
"query_processor.context$runf.invokeStatic(context.clj:45)" "query_processor.context$runf.invoke(context.clj:39)" "query_processor.reducible$pivot.invokeStatic(reducible.clj:34)" "query_processor.reducible$pivot.invoke(reducible.clj:31)" "query_processor.middleware.mbql_to_native$mbql__GT_native$fn__47888.invoke(mbql_to_native.clj:25)" "query_processor.middleware.check_features$check_features$fn__47002.invoke(check_features.clj:39)" "query_processor.middleware.limit$limit$fn__47874.invoke(limit.clj:37)" "query_processor.middleware.cache$maybe_return_cached_results$fn__46454.invoke(cache.clj:204)" "query_processor.middleware.optimize_temporal_filters$optimize_temporal_filters$fn__48134.invoke(optimize_temporal_filters.clj:204)" "query_processor.middleware.validate_temporal_bucketing$validate_temporal_bucketing$fn__50066.invoke(validate_temporal_bucketing.clj:50)" "query_processor.middleware.auto_parse_filter_values$auto_parse_filter_values$fn__45573.invoke(auto_parse_filter_values.clj:43)" "query_processor.middleware.wrap_value_literals$wrap_value_literals$fn__41707.invoke(wrap_value_literals.clj:161)" "query_processor.middleware.annotate$add_column_info$fn__41582.invoke(annotate.clj:608)" "query_processor.middleware.permissions$check_query_permissions$fn__46874.invoke(permissions.clj:81)" "query_processor.middleware.pre_alias_aggregations$pre_alias_aggregations$fn__48995.invoke(pre_alias_aggregations.clj:40)" "query_processor.middleware.cumulative_aggregations$handle_cumulative_aggregations$fn__47075.invoke(cumulative_aggregations.clj:60)" "query_processor.middleware.resolve_joined_fields$resolve_joined_fields$fn__49292.invoke(resolve_joined_fields.clj:102)" "query_processor.middleware.resolve_joins$resolve_joins$fn__49605.invoke(resolve_joins.clj:171)" "query_processor.middleware.add_implicit_joins$add_implicit_joins$fn__45149.invoke(add_implicit_joins.clj:190)" "query_processor.middleware.large_int_id$convert_id_to_string$fn__47838.invoke(large_int_id.clj:59)" 
"query_processor.middleware.format_rows$format_rows$fn__47819.invoke(format_rows.clj:74)" "query_processor.middleware.add_default_temporal_unit$add_default_temporal_unit$fn__44443.invoke(add_default_temporal_unit.clj:23)" "query_processor.middleware.desugar$desugar$fn__47141.invoke(desugar.clj:21)" "query_processor.middleware.binning$update_binning_strategy$fn__45960.invoke(binning.clj:229)" "query_processor.middleware.resolve_fields$resolve_fields$fn__46677.invoke(resolve_fields.clj:34)" "query_processor.middleware.add_dimension_projections$add_remapping$fn__44798.invoke(add_dimension_projections.clj:314)" "query_processor.middleware.add_implicit_clauses$add_implicit_clauses$fn__45027.invoke(add_implicit_clauses.clj:147)" "query_processor.middleware.upgrade_field_literals$upgrade_field_literals$fn__50015.invoke(upgrade_field_literals.clj:40)" "query_processor.middleware.add_source_metadata$add_source_metadata_for_source_queries$fn__45312.invoke(add_source_metadata.clj:123)" "query_processor.middleware.reconcile_breakout_and_order_by_bucketing$reconcile_breakout_and_order_by_bucketing$fn__49167.invoke(reconcile_breakout_and_order_by_bucketing.clj:100)" "query_processor.middleware.auto_bucket_datetimes$auto_bucket_datetimes$fn__45520.invoke(auto_bucket_datetimes.clj:147)" "query_processor.middleware.resolve_source_table$resolve_source_tables$fn__46724.invoke(resolve_source_table.clj:45)" "query_processor.middleware.parameters$substitute_parameters$fn__48977.invoke(parameters.clj:111)" "query_processor.middleware.resolve_referenced$resolve_referenced_card_resources$fn__46776.invoke(resolve_referenced.clj:79)" "query_processor.middleware.expand_macros$expand_macros$fn__47525.invoke(expand_macros.clj:184)" "query_processor.middleware.add_timezone_info$add_timezone_info$fn__45321.invoke(add_timezone_info.clj:15)" "query_processor.middleware.splice_params_in_response$splice_params_in_response$fn__49968.invoke(splice_params_in_response.clj:32)" 
"query_processor.middleware.resolve_database_and_driver$resolve_database_and_driver$fn__49178$fn__49182.invoke(resolve_database_and_driver.clj:31)" "driver$do_with_driver.invokeStatic(driver.clj:60)" "driver$do_with_driver.invoke(driver.clj:56)" "query_processor.middleware.resolve_database_and_driver$resolve_database_and_driver$fn__49178.invoke(resolve_database_and_driver.clj:25)" "query_processor.middleware.fetch_source_query$resolve_card_id_source_tables$fn__47765.invoke(fetch_source_query.clj:274)" "query_processor.middleware.store$initialize_store$fn__49977$fn__49978.invoke(store.clj:11)" "query_processor.store$do_with_store.invokeStatic(store.clj:42)" "query_processor.store$do_with_store.invoke(store.clj:38)" "query_processor.middleware.store$initialize_store$fn__49977.invoke(store.clj:10)" "query_processor.middleware.validate$validate_query$fn__50022.invoke(validate.clj:10)" "query_processor.middleware.normalize_query$normalize$fn__47901.invoke(normalize_query.clj:22)" "query_processor.middleware.add_rows_truncated$add_rows_truncated$fn__45167.invoke(add_rows_truncated.clj:35)" "query_processor.middleware.results_metadata$record_and_return_metadata_BANG_$fn__49953.invoke(results_metadata.clj:147)" "query_processor.middleware.constraints$add_default_userland_constraints$fn__47018.invoke(constraints.clj:42)" "query_processor.middleware.process_userland_query$process_userland_query$fn__49064.invoke(process_userland_query.clj:134)" "query_processor.middleware.catch_exceptions$catch_exceptions$fn__46958.invoke(catch_exceptions.clj:173)" "query_processor.reducible$async_qp$qp_STAR___38246$thunk__38247.invoke(reducible.clj:103)" "query_processor.reducible$async_qp$qp_STAR___38246.invoke(reducible.clj:109)" "query_processor.reducible$sync_qp$qp_STAR___38255$fn__38258.invoke(reducible.clj:135)" "query_processor.reducible$sync_qp$qp_STAR___38255.invoke(reducible.clj:134)" "query_processor.pivot$process_query_append_results.invokeStatic(pivot.clj:131)" 
"query_processor.pivot$process_query_append_results.invoke(pivot.clj:118)" "query_processor.pivot$process_queries_append_results$fn__55728.invoke(pivot.clj:142)" "query_processor.pivot$process_queries_append_results.invokeStatic(pivot.clj:140)" "query_processor.pivot$process_queries_append_results.invoke(pivot.clj:137)" "query_processor.pivot$append_queries_context$fn__55732$fn__55733$fn__55734.invoke(pivot.clj:157)" "query_processor.middleware.process_userland_query$add_and_save_execution_info_xform_BANG_$execution_info_rf_STAR___49055.invoke(process_userland_query.clj:87)" "query_processor.middleware.results_metadata$insights_xform$combine__49947.invoke(results_metadata.clj:131)" "query_processor.reducible$combine_additional_reducing_fns$fn__38269.invoke(reducible.clj:206)" "query_processor.middleware.add_rows_truncated$add_rows_truncated_xform$fn__45162.invoke(add_rows_truncated.clj:20)" "query_processor.middleware.format_rows$format_rows_xform$fn__47812.invoke(format_rows.clj:65)" "query_processor.middleware.limit$limit_xform$fn__47871.invoke(limit.clj:21)" "query_processor.context.default$default_reducef$fn__37435.invoke(default.clj:58)" "query_processor.context.default$default_reducef.invokeStatic(default.clj:57)" "query_processor.context.default$default_reducef.invoke(default.clj:48)" "query_processor.context$reducef.invokeStatic(context.clj:69)" "query_processor.context$reducef.invoke(context.clj:62)" "query_processor.context.default$default_runf$respond_STAR___37439.invoke(default.clj:69)" "driver.sql_jdbc.execute$execute_reducible_query.invokeStatic(execute.clj:485)" "driver.sql_jdbc.execute$execute_reducible_query.invoke(execute.clj:463)" "driver.sql_jdbc.execute$execute_reducible_query.invokeStatic(execute.clj:472)" "driver.sql_jdbc.execute$execute_reducible_query.invoke(execute.clj:463)" "driver.sql_jdbc$fn__82291.invokeStatic(sql_jdbc.clj:54)" "driver.sql_jdbc$fn__82291.invoke(sql_jdbc.clj:52)" "driver.h2$fn__81043.invokeStatic(h2.clj:83)" 
"driver.h2$fn__81043.invoke(h2.clj:80)" "query_processor.context$executef.invokeStatic(context.clj:59)" "query_processor.context$executef.invoke(context.clj:48)" "query_processor.context.default$default_runf.invokeStatic(default.clj:68)" "query_processor.context.default$default_runf.invoke(default.clj:66)" "query_processor.context$runf.invokeStatic(context.clj:45)" "query_processor.context$runf.invoke(context.clj:39)" "query_processor.reducible$pivot.invokeStatic(reducible.clj:34)" "query_processor.reducible$pivot.invoke(reducible.clj:31)" "query_processor.middleware.mbql_to_native$mbql__GT_native$fn__47888.invoke(mbql_to_native.clj:25)" "query_processor.middleware.check_features$check_features$fn__47002.invoke(check_features.clj:39)" "query_processor.middleware.limit$limit$fn__47874.invoke(limit.clj:37)" "query_processor.middleware.cache$maybe_return_cached_results$fn__46454.invoke(cache.clj:204)" "query_processor.middleware.optimize_temporal_filters$optimize_temporal_filters$fn__48134.invoke(optimize_temporal_filters.clj:204)" "query_processor.middleware.validate_temporal_bucketing$validate_temporal_bucketing$fn__50066.invoke(validate_temporal_bucketing.clj:50)" "query_processor.middleware.auto_parse_filter_values$auto_parse_filter_values$fn__45573.invoke(auto_parse_filter_values.clj:43)" "query_processor.middleware.wrap_value_literals$wrap_value_literals$fn__41707.invoke(wrap_value_literals.clj:161)" "query_processor.middleware.annotate$add_column_info$fn__41582.invoke(annotate.clj:608)" "query_processor.middleware.permissions$check_query_permissions$fn__46874.invoke(permissions.clj:81)" "query_processor.middleware.pre_alias_aggregations$pre_alias_aggregations$fn__48995.invoke(pre_alias_aggregations.clj:40)" "query_processor.middleware.cumulative_aggregations$handle_cumulative_aggregations$fn__47075.invoke(cumulative_aggregations.clj:60)" "query_processor.middleware.resolve_joined_fields$resolve_joined_fields$fn__49292.invoke(resolve_joined_fields.clj:102)" 
"query_processor.middleware.resolve_joins$resolve_joins$fn__49605.invoke(resolve_joins.clj:171)" "query_processor.middleware.add_implicit_joins$add_implicit_joins$fn__45149.invoke(add_implicit_joins.clj:190)" "query_processor.middleware.large_int_id$convert_id_to_string$fn__47838.invoke(large_int_id.clj:59)" "query_processor.middleware.format_rows$format_rows$fn__47819.invoke(format_rows.clj:74)" "query_processor.middleware.add_default_temporal_unit$add_default_temporal_unit$fn__44443.invoke(add_default_temporal_unit.clj:23)" "query_processor.middleware.desugar$desugar$fn__47141.invoke(desugar.clj:21)" "query_processor.middleware.binning$update_binning_strategy$fn__45960.invoke(binning.clj:229)" "query_processor.middleware.resolve_fields$resolve_fields$fn__46677.invoke(resolve_fields.clj:34)" "query_processor.middleware.add_dimension_projections$add_remapping$fn__44798.invoke(add_dimension_projections.clj:314)" "query_processor.middleware.add_implicit_clauses$add_implicit_clauses$fn__45027.invoke(add_implicit_clauses.clj:147)" "query_processor.middleware.upgrade_field_literals$upgrade_field_literals$fn__50015.invoke(upgrade_field_literals.clj:40)" "query_processor.middleware.add_source_metadata$add_source_metadata_for_source_queries$fn__45312.invoke(add_source_metadata.clj:123)" "query_processor.middleware.reconcile_breakout_and_order_by_bucketing$reconcile_breakout_and_order_by_bucketing$fn__49167.invoke(reconcile_breakout_and_order_by_bucketing.clj:100)" "query_processor.middleware.auto_bucket_datetimes$auto_bucket_datetimes$fn__45520.invoke(auto_bucket_datetimes.clj:147)" "query_processor.middleware.resolve_source_table$resolve_source_tables$fn__46724.invoke(resolve_source_table.clj:45)" "query_processor.middleware.parameters$substitute_parameters$fn__48977.invoke(parameters.clj:111)" "query_processor.middleware.resolve_referenced$resolve_referenced_card_resources$fn__46776.invoke(resolve_referenced.clj:79)" 
"query_processor.middleware.expand_macros$expand_macros$fn__47525.invoke(expand_macros.clj:184)" "query_processor.middleware.add_timezone_info$add_timezone_info$fn__45321.invoke(add_timezone_info.clj:15)" "query_processor.middleware.splice_params_in_response$splice_params_in_response$fn__49968.invoke(splice_params_in_response.clj:32)" "query_processor.middleware.resolve_database_and_driver$resolve_database_and_driver$fn__49178$fn__49182.invoke(resolve_database_and_driver.clj:31)" "driver$do_with_driver.invokeStatic(driver.clj:60)" "driver$do_with_driver.invoke(driver.clj:56)" "query_processor.middleware.resolve_database_and_driver$resolve_database_and_driver$fn__49178.invoke(resolve_database_and_driver.clj:25)" "query_processor.middleware.fetch_source_query$resolve_card_id_source_tables$fn__47765.invoke(fetch_source_query.clj:274)" "query_processor.middleware.store$initialize_store$fn__49977$fn__49978.invoke(store.clj:11)" "query_processor.store$do_with_store.invokeStatic(store.clj:42)" "query_processor.store$do_with_store.invoke(store.clj:38)" "query_processor.middleware.store$initialize_store$fn__49977.invoke(store.clj:10)" "query_processor.middleware.validate$validate_query$fn__50022.invoke(validate.clj:10)" "query_processor.middleware.normalize_query$normalize$fn__47901.invoke(normalize_query.clj:22)" "query_processor.middleware.add_rows_truncated$add_rows_truncated$fn__45167.invoke(add_rows_truncated.clj:35)" "query_processor.middleware.results_metadata$record_and_return_metadata_BANG_$fn__49953.invoke(results_metadata.clj:147)" "query_processor.middleware.constraints$add_default_userland_constraints$fn__47018.invoke(constraints.clj:42)" "query_processor.middleware.process_userland_query$process_userland_query$fn__49064.invoke(process_userland_query.clj:134)" "query_processor.middleware.catch_exceptions$catch_exceptions$fn__46958.invoke(catch_exceptions.clj:173)" "query_processor.reducible$async_qp$qp_STAR___38246$thunk__38247.invoke(reducible.clj:103)" 
"query_processor.reducible$async_qp$qp_STAR___38246$fn__38249.invoke(reducible.clj:108)"], :error_type :invalid-query, :ex-data {:sql "-- Metabase:: userID: 1 queryType: MBQL queryHash: 6ea39a4dd0b199248a7b337cf22b368a2829f4917bf45b1a2526f624714f9ccc\nSELECT \"source\".\"pivot-grouping\" AS \"pivot-grouping\", count(*) AS \"count\" FROM (SELECT abs(1) AS \"pivot-grouping\", \"PUBLIC\".\"PRODUCTS\".\"CATEGORY\" AS \"CATEGORY\" FROM \"PUBLIC\".\"PRODUCTS\") \"source\" GROUP BY \"source\".\"pivot-grouping\" ORDER BY \"source\".\"CATEGORY\" ASC, \"source\".\"pivot-grouping\" ASC", :params nil, :type :invalid-query}}], :state "90016", :error_type :invalid-query, :json_query {:database 4, :query {:source-table 10, :aggregation [[:count]], :breakout ([:expression "pivot-grouping"]), :order-by [[:asc [:field 107 nil]]], :expressions {:pivot-grouping [:abs 1]}}, :type :query, :pivot-rows [0], :async? true}, :native {:query "SELECT \"source\".\"pivot-grouping\" AS \"pivot-grouping\", count(*) AS \"count\" FROM (SELECT abs(1) AS \"pivot-grouping\", \"PUBLIC\".\"PRODUCTS\".\"CATEGORY\" AS \"CATEGORY\" FROM \"PUBLIC\".\"PRODUCTS\") \"source\" GROUP BY \"source\".\"pivot-grouping\" ORDER BY \"source\".\"CATEGORY\" ASC, \"source\".\"pivot-grouping\" ASC", :params nil}, :status :failed, :class org.h2.jdbc.JdbcSQLException, :stacktrace ["org.h2.message.DbException.getJdbcSQLException(DbException.java:357)" "org.h2.message.DbException.get(DbException.java:179)" "org.h2.message.DbException.get(DbException.java:155)" "org.h2.expression.ExpressionColumn.updateAggregate(ExpressionColumn.java:172)" "org.h2.command.dml.Select.queryGroup(Select.java:350)" "org.h2.command.dml.Select.queryWithoutCache(Select.java:628)" "org.h2.command.dml.Query.queryWithoutCacheLazyCheck(Query.java:114)" "org.h2.command.dml.Query.query(Query.java:371)" "org.h2.command.dml.Query.query(Query.java:333)" "org.h2.command.CommandContainer.query(CommandContainer.java:114)" 
"org.h2.command.Command.executeQuery(Command.java:202)" "org.h2.jdbc.JdbcStatement.executeInternal(JdbcStatement.java:227)" "org.h2.jdbc.JdbcStatement.execute(JdbcStatement.java:205)" "com.mchange.v2.c3p0.impl.NewProxyStatement.execute(NewProxyStatement.java:75)" "--> driver.sql_jdbc.execute$fn__80722.invokeStatic(execute.clj:344)" "driver.sql_jdbc.execute$fn__80722.invoke(execute.clj:342)" "driver.sql_jdbc.execute$execute_statement_or_prepared_statement_BANG_.invokeStatic(execute.clj:352)" "driver.sql_jdbc.execute$execute_statement_or_prepared_statement_BANG_.invoke(execute.clj:349)" "driver.sql_jdbc.execute$execute_reducible_query$fn__80802.invoke(execute.clj:478)" "driver.sql_jdbc.execute$execute_reducible_query.invokeStatic(execute.clj:477)" "driver.sql_jdbc.execute$execute_reducible_query.invoke(execute.clj:463)" "driver.sql_jdbc.execute$execute_reducible_query.invokeStatic(execute.clj:472)" "driver.sql_jdbc.execute$execute_reducible_query.invoke(execute.clj:463)" "driver.sql_jdbc$fn__82291.invokeStatic(sql_jdbc.clj:54)" "driver.sql_jdbc$fn__82291.invoke(sql_jdbc.clj:52)" "driver.h2$fn__81043.invokeStatic(h2.clj:83)" "driver.h2$fn__81043.invoke(h2.clj:80)" "query_processor.context$executef.invokeStatic(context.clj:59)" "query_processor.context$executef.invoke(context.clj:48)" "query_processor.context.default$default_runf.invokeStatic(default.clj:68)" "query_processor.context.default$default_runf.invoke(default.clj:66)" "query_processor.context$runf.invokeStatic(context.clj:45)" "query_processor.context$runf.invoke(context.clj:39)" "query_processor.reducible$pivot.invokeStatic(reducible.clj:34)" "query_processor.reducible$pivot.invoke(reducible.clj:31)" "query_processor.middleware.mbql_to_native$mbql__GT_native$fn__47888.invoke(mbql_to_native.clj:25)" "query_processor.middleware.check_features$check_features$fn__47002.invoke(check_features.clj:39)" "query_processor.middleware.limit$limit$fn__47874.invoke(limit.clj:37)" 
"query_processor.middleware.cache$maybe_return_cached_results$fn__46454.invoke(cache.clj:204)" "query_processor.middleware.optimize_temporal_filters$optimize_temporal_filters$fn__48134.invoke(optimize_temporal_filters.clj:204)" "query_processor.middleware.validate_temporal_bucketing$validate_temporal_bucketing$fn__50066.invoke(validate_temporal_bucketing.clj:50)" "query_processor.middleware.auto_parse_filter_values$auto_parse_filter_values$fn__45573.invoke(auto_parse_filter_values.clj:43)" "query_processor.middleware.wrap_value_literals$wrap_value_literals$fn__41707.invoke(wrap_value_literals.clj:161)" "query_processor.middleware.annotate$add_column_info$fn__41582.invoke(annotate.clj:608)" "query_processor.middleware.permissions$check_query_permissions$fn__46874.invoke(permissions.clj:81)" "query_processor.middleware.pre_alias_aggregations$pre_alias_aggregations$fn__48995.invoke(pre_alias_aggregations.clj:40)" "query_processor.middleware.cumulative_aggregations$handle_cumulative_aggregations$fn__47075.invoke(cumulative_aggregations.clj:60)" "query_processor.middleware.resolve_joined_fields$resolve_joined_fields$fn__49292.invoke(resolve_joined_fields.clj:102)" "query_processor.middleware.resolve_joins$resolve_joins$fn__49605.invoke(resolve_joins.clj:171)" "query_processor.middleware.add_implicit_joins$add_implicit_joins$fn__45149.invoke(add_implicit_joins.clj:190)" "query_processor.middleware.large_int_id$convert_id_to_string$fn__47838.invoke(large_int_id.clj:59)" "query_processor.middleware.format_rows$format_rows$fn__47819.invoke(format_rows.clj:74)" "query_processor.middleware.add_default_temporal_unit$add_default_temporal_unit$fn__44443.invoke(add_default_temporal_unit.clj:23)" "query_processor.middleware.desugar$desugar$fn__47141.invoke(desugar.clj:21)" "query_processor.middleware.binning$update_binning_strategy$fn__45960.invoke(binning.clj:229)" "query_processor.middleware.resolve_fields$resolve_fields$fn__46677.invoke(resolve_fields.clj:34)" 
"query_processor.middleware.add_dimension_projections$add_remapping$fn__44798.invoke(add_dimension_projections.clj:314)" "query_processor.middleware.add_implicit_clauses$add_implicit_clauses$fn__45027.invoke(add_implicit_clauses.clj:147)" "query_processor.middleware.upgrade_field_literals$upgrade_field_literals$fn__50015.invoke(upgrade_field_literals.clj:40)" "query_processor.middleware.add_source_metadata$add_source_metadata_for_source_queries$fn__45312.invoke(add_source_metadata.clj:123)" "query_processor.middleware.reconcile_breakout_and_order_by_bucketing$reconcile_breakout_and_order_by_bucketing$fn__49167.invoke(reconcile_breakout_and_order_by_bucketing.clj:100)" "query_processor.middleware.auto_bucket_datetimes$auto_bucket_datetimes$fn__45520.invoke(auto_bucket_datetimes.clj:147)" "query_processor.middleware.resolve_source_table$resolve_source_tables$fn__46724.invoke(resolve_source_table.clj:45)" "query_processor.middleware.parameters$substitute_parameters$fn__48977.invoke(parameters.clj:111)" "query_processor.middleware.resolve_referenced$resolve_referenced_card_resources$fn__46776.invoke(resolve_referenced.clj:79)" "query_processor.middleware.expand_macros$expand_macros$fn__47525.invoke(expand_macros.clj:184)" "query_processor.middleware.add_timezone_info$add_timezone_info$fn__45321.invoke(add_timezone_info.clj:15)" "query_processor.middleware.splice_params_in_response$splice_params_in_response$fn__49968.invoke(splice_params_in_response.clj:32)" "query_processor.middleware.resolve_database_and_driver$resolve_database_and_driver$fn__49178$fn__49182.invoke(resolve_database_and_driver.clj:31)" "driver$do_with_driver.invokeStatic(driver.clj:60)" "driver$do_with_driver.invoke(driver.clj:56)" "query_processor.middleware.resolve_database_and_driver$resolve_database_and_driver$fn__49178.invoke(resolve_database_and_driver.clj:25)" "query_processor.middleware.fetch_source_query$resolve_card_id_source_tables$fn__47765.invoke(fetch_source_query.clj:274)" 
"query_processor.middleware.store$initialize_store$fn__49977$fn__49978.invoke(store.clj:11)" "query_processor.store$do_with_store.invokeStatic(store.clj:42)" "query_processor.store$do_with_store.invoke(store.clj:38)" "query_processor.middleware.store$initialize_store$fn__49977.invoke(store.clj:10)" "query_processor.middleware.validate$validate_query$fn__50022.invoke(validate.clj:10)" "query_processor.middleware.normalize_query$normalize$fn__47901.invoke(normalize_query.clj:22)" "query_processor.middleware.add_rows_truncated$add_rows_truncated$fn__45167.invoke(add_rows_truncated.clj:35)" "query_processor.middleware.results_metadata$record_and_return_metadata_BANG_$fn__49953.invoke(results_metadata.clj:147)" "query_processor.middleware.constraints$add_default_userland_constraints$fn__47018.invoke(constraints.clj:42)" "query_processor.middleware.process_userland_query$process_userland_query$fn__49064.invoke(process_userland_query.clj:134)" "query_processor.middleware.catch_exceptions$catch_exceptions$fn__46958.invoke(catch_exceptions.clj:173)" "query_processor.reducible$async_qp$qp_STAR___38246$thunk__38247.invoke(reducible.clj:103)" "query_processor.reducible$async_qp$qp_STAR___38246.invoke(reducible.clj:109)" "query_processor.reducible$sync_qp$qp_STAR___38255$fn__38258.invoke(reducible.clj:135)" "query_processor.reducible$sync_qp$qp_STAR___38255.invoke(reducible.clj:134)" "query_processor.pivot$process_query_append_results.invokeStatic(pivot.clj:131)" "query_processor.pivot$process_query_append_results.invoke(pivot.clj:118)" "query_processor.pivot$process_queries_append_results$fn__55728.invoke(pivot.clj:142)" "query_processor.pivot$process_queries_append_results.invokeStatic(pivot.clj:140)" "query_processor.pivot$process_queries_append_results.invoke(pivot.clj:137)" "query_processor.pivot$append_queries_context$fn__55732$fn__55733$fn__55734.invoke(pivot.clj:157)" 
"query_processor.middleware.process_userland_query$add_and_save_execution_info_xform_BANG_$execution_info_rf_STAR___49055.invoke(process_userland_query.clj:87)" "query_processor.middleware.results_metadata$insights_xform$combine__49947.invoke(results_metadata.clj:131)" "query_processor.reducible$combine_additional_reducing_fns$fn__38269.invoke(reducible.clj:206)" "query_processor.middleware.add_rows_truncated$add_rows_truncated_xform$fn__45162.invoke(add_rows_truncated.clj:20)" "query_processor.middleware.format_rows$format_rows_xform$fn__47812.invoke(format_rows.clj:65)" "query_processor.middleware.limit$limit_xform$fn__47871.invoke(limit.clj:21)" "query_processor.context.default$default_reducef$fn__37435.invoke(default.clj:58)" "query_processor.context.default$default_reducef.invokeStatic(default.clj:57)" "query_processor.context.default$default_reducef.invoke(default.clj:48)" "query_processor.context$reducef.invokeStatic(context.clj:69)" "query_processor.context$reducef.invoke(context.clj:62)" "query_processor.context.default$default_runf$respond_STAR___37439.invoke(default.clj:69)" "driver.sql_jdbc.execute$execute_reducible_query.invokeStatic(execute.clj:485)" "driver.sql_jdbc.execute$execute_reducible_query.invoke(execute.clj:463)" "driver.sql_jdbc.execute$execute_reducible_query.invokeStatic(execute.clj:472)" "driver.sql_jdbc.execute$execute_reducible_query.invoke(execute.clj:463)" "driver.sql_jdbc$fn__82291.invokeStatic(sql_jdbc.clj:54)" "driver.sql_jdbc$fn__82291.invoke(sql_jdbc.clj:52)" "driver.h2$fn__81043.invokeStatic(h2.clj:83)" "driver.h2$fn__81043.invoke(h2.clj:80)" "query_processor.context$executef.invokeStatic(context.clj:59)" "query_processor.context$executef.invoke(context.clj:48)" "query_processor.context.default$default_runf.invokeStatic(default.clj:68)" "query_processor.context.default$default_runf.invoke(default.clj:66)" "query_processor.context$runf.invokeStatic(context.clj:45)" "query_processor.context$runf.invoke(context.clj:39)" 
"query_processor.reducible$pivot.invokeStatic(reducible.clj:34)" "query_processor.reducible$pivot.invoke(reducible.clj:31)" "query_processor.middleware.mbql_to_native$mbql__GT_native$fn__47888.invoke(mbql_to_native.clj:25)" "query_processor.middleware.check_features$check_features$fn__47002.invoke(check_features.clj:39)" "query_processor.middleware.limit$limit$fn__47874.invoke(limit.clj:37)" "query_processor.middleware.cache$maybe_return_cached_results$fn__46454.invoke(cache.clj:204)" "query_processor.middleware.optimize_temporal_filters$optimize_temporal_filters$fn__48134.invoke(optimize_temporal_filters.clj:204)" "query_processor.middleware.validate_temporal_bucketing$validate_temporal_bucketing$fn__50066.invoke(validate_temporal_bucketing.clj:50)" "query_processor.middleware.auto_parse_filter_values$auto_parse_filter_values$fn__45573.invoke(auto_parse_filter_values.clj:43)" "query_processor.middleware.wrap_value_literals$wrap_value_literals$fn__41707.invoke(wrap_value_literals.clj:161)" "query_processor.middleware.annotate$add_column_info$fn__41582.invoke(annotate.clj:608)" "query_processor.middleware.permissions$check_query_permissions$fn__46874.invoke(permissions.clj:81)" "query_processor.middleware.pre_alias_aggregations$pre_alias_aggregations$fn__48995.invoke(pre_alias_aggregations.clj:40)" "query_processor.middleware.cumulative_aggregations$handle_cumulative_aggregations$fn__47075.invoke(cumulative_aggregations.clj:60)" "query_processor.middleware.resolve_joined_fields$resolve_joined_fields$fn__49292.invoke(resolve_joined_fields.clj:102)" "query_processor.middleware.resolve_joins$resolve_joins$fn__49605.invoke(resolve_joins.clj:171)" "query_processor.middleware.add_implicit_joins$add_implicit_joins$fn__45149.invoke(add_implicit_joins.clj:190)" "query_processor.middleware.large_int_id$convert_id_to_string$fn__47838.invoke(large_int_id.clj:59)" "query_processor.middleware.format_rows$format_rows$fn__47819.invoke(format_rows.clj:74)" 
"query_processor.middleware.add_default_temporal_unit$add_default_temporal_unit$fn__44443.invoke(add_default_temporal_unit.clj:23)" "query_processor.middleware.desugar$desugar$fn__47141.invoke(desugar.clj:21)" "query_processor.middleware.binning$update_binning_strategy$fn__45960.invoke(binning.clj:229)" "query_processor.middleware.resolve_fields$resolve_fields$fn__46677.invoke(resolve_fields.clj:34)" "query_processor.middleware.add_dimension_projections$add_remapping$fn__44798.invoke(add_dimension_projections.clj:314)" "query_processor.middleware.add_implicit_clauses$add_implicit_clauses$fn__45027.invoke(add_implicit_clauses.clj:147)" "query_processor.middleware.upgrade_field_literals$upgrade_field_literals$fn__50015.invoke(upgrade_field_literals.clj:40)" "query_processor.middleware.add_source_metadata$add_source_metadata_for_source_queries$fn__45312.invoke(add_source_metadata.clj:123)" "query_processor.middleware.reconcile_breakout_and_order_by_bucketing$reconcile_breakout_and_order_by_bucketing$fn__49167.invoke(reconcile_breakout_and_order_by_bucketing.clj:100)" "query_processor.middleware.auto_bucket_datetimes$auto_bucket_datetimes$fn__45520.invoke(auto_bucket_datetimes.clj:147)" "query_processor.middleware.resolve_source_table$resolve_source_tables$fn__46724.invoke(resolve_source_table.clj:45)" "query_processor.middleware.parameters$substitute_parameters$fn__48977.invoke(parameters.clj:111)" "query_processor.middleware.resolve_referenced$resolve_referenced_card_resources$fn__46776.invoke(resolve_referenced.clj:79)" "query_processor.middleware.expand_macros$expand_macros$fn__47525.invoke(expand_macros.clj:184)" "query_processor.middleware.add_timezone_info$add_timezone_info$fn__45321.invoke(add_timezone_info.clj:15)" "query_processor.middleware.splice_params_in_response$splice_params_in_response$fn__49968.invoke(splice_params_in_response.clj:32)" 
"query_processor.middleware.resolve_database_and_driver$resolve_database_and_driver$fn__49178$fn__49182.invoke(resolve_database_and_driver.clj:31)" "driver$do_with_driver.invokeStatic(driver.clj:60)" "driver$do_with_driver.invoke(driver.clj:56)" "query_processor.middleware.resolve_database_and_driver$resolve_database_and_driver$fn__49178.invoke(resolve_database_and_driver.clj:25)" "query_processor.middleware.fetch_source_query$resolve_card_id_source_tables$fn__47765.invoke(fetch_source_query.clj:274)" "query_processor.middleware.store$initialize_store$fn__49977$fn__49978.invoke(store.clj:11)" "query_processor.store$do_with_store.invokeStatic(store.clj:42)" "query_processor.store$do_with_store.invoke(store.clj:38)" "query_processor.middleware.store$initialize_store$fn__49977.invoke(store.clj:10)" "query_processor.middleware.validate$validate_query$fn__50022.invoke(validate.clj:10)" "query_processor.middleware.normalize_query$normalize$fn__47901.invoke(normalize_query.clj:22)" "query_processor.middleware.add_rows_truncated$add_rows_truncated$fn__45167.invoke(add_rows_truncated.clj:35)" "query_processor.middleware.results_metadata$record_and_return_metadata_BANG_$fn__49953.invoke(results_metadata.clj:147)" "query_processor.middleware.constraints$add_default_userland_constraints$fn__47018.invoke(constraints.clj:42)" "query_processor.middleware.process_userland_query$process_userland_query$fn__49064.invoke(process_userland_query.clj:134)" "query_processor.middleware.catch_exceptions$catch_exceptions$fn__46958.invoke(catch_exceptions.clj:173)" "query_processor.reducible$async_qp$qp_STAR___38246$thunk__38247.invoke(reducible.clj:103)" "query_processor.reducible$async_qp$qp_STAR___38246$fn__38249.invoke(reducible.clj:108)"], :context :ad-hoc, :error "Column \"\"\"source\"\".CATEGORY\" must be in the GROUP BY list; SQL statement:\n-- Metabase:: userID: 1 queryType: MBQL queryHash: 6ea39a4dd0b199248a7b337cf22b368a2829f4917bf45b1a2526f624714f9ccc\nSELECT 
\"source\".\"pivot-grouping\" AS \"pivot-grouping\", count(*) AS \"count\" FROM (SELECT abs(1) AS \"pivot-grouping\", \"PUBLIC\".\"PRODUCTS\".\"CATEGORY\" AS \"CATEGORY\" FROM \"PUBLIC\".\"PRODUCTS\") \"source\" GROUP BY \"source\".\"pivot-grouping\" ORDER BY \"source\".\"CATEGORY\" ASC, \"source\".\"pivot-grouping\" ASC [90016-197]", :row_count 0, :running_time 0, :preprocessed {:database 4, :query {:source-table 10, :aggregation [[:aggregation-options [:count] {:name "count"}]], :breakout [[:expression "pivot-grouping"]], :order-by [[:asc [:field 107 nil]] [:asc [:expression "pivot-grouping"]]], :expressions {:pivot-grouping [:abs 1]}}, :type :query, :pivot-rows [0], :async? true, :info {:executed-by 1, :context :ad-hoc, :query-hash [110, -93, -102, 77, -48, -79, -103, 36, -118, 123, 51, 124, -14, 43, 54, -118, 40, 41, -12, -111, 123, -12, 91, 26, 37, 38, -10, 36, 113, 79, -100, -52]}}, :data {:rows [], :cols []}} 2021-07-24 13:08:36,740 DEBUG middleware.log :: POST /api/dataset/pivot 202 [ASYNC: completed] 2.2 s (17 DB calls) App DB connections: 0/7 Jetty threads: 3/50 (4 idle, 0 queued) (68 total active threads) Queries in flight: 0 (0 queued); h2 DB 4 connections: 1/2 (0 threads blocked) ``` </details> **Information about your Metabase Installation:** Tested 0.38.0 thru 0.40.1
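For reference, the failure reduces to the generated SQL ordering by a column that the outer query neither groups by nor selects. The sketch below restates the statement quoted in the error and one way it could be made valid; the corrected form is an illustration of why H2 rejects the query, not the fix Metabase actually ships.

```sql
-- Failing query (as generated): ORDER BY references "source"."CATEGORY",
-- which is neither in the GROUP BY list nor an aggregate, so H2 raises
-- "Column must be in the GROUP BY list" (Postgres raises the analogous error).
SELECT "source"."pivot-grouping" AS "pivot-grouping", count(*) AS "count"
FROM (SELECT abs(1) AS "pivot-grouping",
             "PUBLIC"."PRODUCTS"."CATEGORY" AS "CATEGORY"
      FROM "PUBLIC"."PRODUCTS") "source"
GROUP BY "source"."pivot-grouping"
ORDER BY "source"."CATEGORY" ASC, "source"."pivot-grouping" ASC;

-- One valid form (illustrative only): drop the stray ORDER BY column, since
-- the outer query only produces "pivot-grouping" and "count".
SELECT "source"."pivot-grouping" AS "pivot-grouping", count(*) AS "count"
FROM (SELECT abs(1) AS "pivot-grouping",
             "PUBLIC"."PRODUCTS"."CATEGORY" AS "CATEGORY"
      FROM "PUBLIC"."PRODUCTS") "source"
GROUP BY "source"."pivot-grouping"
ORDER BY "source"."pivot-grouping" ASC;
```

This matches the reported behavior: the pivot subqueries drop `CATEGORY` from the breakout but the sort clause from the original question is carried over, so any sort on a breakout column produces an invalid statement.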
Pivot Table errors if question is sorted by a breakout column - **Describe the bug** Pivot Table errors if question is sorted by a breakout/group-by column. It works if grouping by aggregation/metric. Workaround is to not have sorting on the initial question, but apply sorting in the visualization settings sidebar for Pivot Table. **To Reproduce** 1. Custom question > Sample Dataset > Products 2. Summarize Count by Category, and Sort by Category - visualize ![image](https://user-images.githubusercontent.com/1447303/126866647-da335758-f38c-446e-8a42-ef8e8fa68dfb.png) 3. Visualization > select Pivot Table - query fails with `Column """source"".CATEGORY" must be in the GROUP BY list` (or `ERROR: column "source.name" must appear in the GROUP BY clause or be used in an aggregate function` depending on database type) <details><summary>Full stacktrace</summary> ``` 2021-07-24 13:08:36,734 ERROR middleware.catch-exceptions :: Error processing query: null {:database_id 4, :started_at #t "2021-07-24T13:08:34.653521+02:00[Europe/Copenhagen]", :via [{:status :failed, :class clojure.lang.ExceptionInfo, :error "Error executing query", :stacktrace ["--> driver.sql_jdbc.execute$execute_reducible_query$fn__80802.invoke(execute.clj:480)" "driver.sql_jdbc.execute$execute_reducible_query.invokeStatic(execute.clj:477)" "driver.sql_jdbc.execute$execute_reducible_query.invoke(execute.clj:463)" "driver.sql_jdbc.execute$execute_reducible_query.invokeStatic(execute.clj:472)" "driver.sql_jdbc.execute$execute_reducible_query.invoke(execute.clj:463)" "driver.sql_jdbc$fn__82291.invokeStatic(sql_jdbc.clj:54)" "driver.sql_jdbc$fn__82291.invoke(sql_jdbc.clj:52)" "driver.h2$fn__81043.invokeStatic(h2.clj:83)" "driver.h2$fn__81043.invoke(h2.clj:80)" "query_processor.context$executef.invokeStatic(context.clj:59)" "query_processor.context$executef.invoke(context.clj:48)" "query_processor.context.default$default_runf.invokeStatic(default.clj:68)" 
"query_processor.context.default$default_runf.invoke(default.clj:66)" "query_processor.context$runf.invokeStatic(context.clj:45)" "query_processor.context$runf.invoke(context.clj:39)" "query_processor.reducible$pivot.invokeStatic(reducible.clj:34)" "query_processor.reducible$pivot.invoke(reducible.clj:31)" "query_processor.middleware.mbql_to_native$mbql__GT_native$fn__47888.invoke(mbql_to_native.clj:25)" "query_processor.middleware.check_features$check_features$fn__47002.invoke(check_features.clj:39)" "query_processor.middleware.limit$limit$fn__47874.invoke(limit.clj:37)" "query_processor.middleware.cache$maybe_return_cached_results$fn__46454.invoke(cache.clj:204)" "query_processor.middleware.optimize_temporal_filters$optimize_temporal_filters$fn__48134.invoke(optimize_temporal_filters.clj:204)" "query_processor.middleware.validate_temporal_bucketing$validate_temporal_bucketing$fn__50066.invoke(validate_temporal_bucketing.clj:50)" "query_processor.middleware.auto_parse_filter_values$auto_parse_filter_values$fn__45573.invoke(auto_parse_filter_values.clj:43)" "query_processor.middleware.wrap_value_literals$wrap_value_literals$fn__41707.invoke(wrap_value_literals.clj:161)" "query_processor.middleware.annotate$add_column_info$fn__41582.invoke(annotate.clj:608)" "query_processor.middleware.permissions$check_query_permissions$fn__46874.invoke(permissions.clj:81)" "query_processor.middleware.pre_alias_aggregations$pre_alias_aggregations$fn__48995.invoke(pre_alias_aggregations.clj:40)" "query_processor.middleware.cumulative_aggregations$handle_cumulative_aggregations$fn__47075.invoke(cumulative_aggregations.clj:60)" "query_processor.middleware.resolve_joined_fields$resolve_joined_fields$fn__49292.invoke(resolve_joined_fields.clj:102)" "query_processor.middleware.resolve_joins$resolve_joins$fn__49605.invoke(resolve_joins.clj:171)" "query_processor.middleware.add_implicit_joins$add_implicit_joins$fn__45149.invoke(add_implicit_joins.clj:190)" 
"query_processor.middleware.large_int_id$convert_id_to_string$fn__47838.invoke(large_int_id.clj:59)" "query_processor.middleware.format_rows$format_rows$fn__47819.invoke(format_rows.clj:74)" "query_processor.middleware.add_default_temporal_unit$add_default_temporal_unit$fn__44443.invoke(add_default_temporal_unit.clj:23)" "query_processor.middleware.desugar$desugar$fn__47141.invoke(desugar.clj:21)" "query_processor.middleware.binning$update_binning_strategy$fn__45960.invoke(binning.clj:229)" "query_processor.middleware.resolve_fields$resolve_fields$fn__46677.invoke(resolve_fields.clj:34)" "query_processor.middleware.add_dimension_projections$add_remapping$fn__44798.invoke(add_dimension_projections.clj:314)" "query_processor.middleware.add_implicit_clauses$add_implicit_clauses$fn__45027.invoke(add_implicit_clauses.clj:147)" "query_processor.middleware.upgrade_field_literals$upgrade_field_literals$fn__50015.invoke(upgrade_field_literals.clj:40)" "query_processor.middleware.add_source_metadata$add_source_metadata_for_source_queries$fn__45312.invoke(add_source_metadata.clj:123)" "query_processor.middleware.reconcile_breakout_and_order_by_bucketing$reconcile_breakout_and_order_by_bucketing$fn__49167.invoke(reconcile_breakout_and_order_by_bucketing.clj:100)" "query_processor.middleware.auto_bucket_datetimes$auto_bucket_datetimes$fn__45520.invoke(auto_bucket_datetimes.clj:147)" "query_processor.middleware.resolve_source_table$resolve_source_tables$fn__46724.invoke(resolve_source_table.clj:45)" "query_processor.middleware.parameters$substitute_parameters$fn__48977.invoke(parameters.clj:111)" "query_processor.middleware.resolve_referenced$resolve_referenced_card_resources$fn__46776.invoke(resolve_referenced.clj:79)" "query_processor.middleware.expand_macros$expand_macros$fn__47525.invoke(expand_macros.clj:184)" "query_processor.middleware.add_timezone_info$add_timezone_info$fn__45321.invoke(add_timezone_info.clj:15)" 
"query_processor.middleware.splice_params_in_response$splice_params_in_response$fn__49968.invoke(splice_params_in_response.clj:32)" "query_processor.middleware.resolve_database_and_driver$resolve_database_and_driver$fn__49178$fn__49182.invoke(resolve_database_and_driver.clj:31)" "driver$do_with_driver.invokeStatic(driver.clj:60)" "driver$do_with_driver.invoke(driver.clj:56)" "query_processor.middleware.resolve_database_and_driver$resolve_database_and_driver$fn__49178.invoke(resolve_database_and_driver.clj:25)" "query_processor.middleware.fetch_source_query$resolve_card_id_source_tables$fn__47765.invoke(fetch_source_query.clj:274)" "query_processor.middleware.store$initialize_store$fn__49977$fn__49978.invoke(store.clj:11)" "query_processor.store$do_with_store.invokeStatic(store.clj:42)" "query_processor.store$do_with_store.invoke(store.clj:38)" "query_processor.middleware.store$initialize_store$fn__49977.invoke(store.clj:10)" "query_processor.middleware.validate$validate_query$fn__50022.invoke(validate.clj:10)" "query_processor.middleware.normalize_query$normalize$fn__47901.invoke(normalize_query.clj:22)" "query_processor.middleware.add_rows_truncated$add_rows_truncated$fn__45167.invoke(add_rows_truncated.clj:35)" "query_processor.middleware.results_metadata$record_and_return_metadata_BANG_$fn__49953.invoke(results_metadata.clj:147)" "query_processor.middleware.constraints$add_default_userland_constraints$fn__47018.invoke(constraints.clj:42)" "query_processor.middleware.process_userland_query$process_userland_query$fn__49064.invoke(process_userland_query.clj:134)" "query_processor.middleware.catch_exceptions$catch_exceptions$fn__46958.invoke(catch_exceptions.clj:173)" "query_processor.reducible$async_qp$qp_STAR___38246$thunk__38247.invoke(reducible.clj:103)" "query_processor.reducible$async_qp$qp_STAR___38246.invoke(reducible.clj:109)" "query_processor.reducible$sync_qp$qp_STAR___38255$fn__38258.invoke(reducible.clj:135)" 
"query_processor.reducible$sync_qp$qp_STAR___38255.invoke(reducible.clj:134)" "query_processor.pivot$process_query_append_results.invokeStatic(pivot.clj:131)" "query_processor.pivot$process_query_append_results.invoke(pivot.clj:118)" "query_processor.pivot$process_queries_append_results$fn__55728.invoke(pivot.clj:142)" "query_processor.pivot$process_queries_append_results.invokeStatic(pivot.clj:140)" "query_processor.pivot$process_queries_append_results.invoke(pivot.clj:137)" "query_processor.pivot$append_queries_context$fn__55732$fn__55733$fn__55734.invoke(pivot.clj:157)" "query_processor.middleware.process_userland_query$add_and_save_execution_info_xform_BANG_$execution_info_rf_STAR___49055.invoke(process_userland_query.clj:87)" "query_processor.middleware.results_metadata$insights_xform$combine__49947.invoke(results_metadata.clj:131)" "query_processor.reducible$combine_additional_reducing_fns$fn__38269.invoke(reducible.clj:206)" "query_processor.middleware.add_rows_truncated$add_rows_truncated_xform$fn__45162.invoke(add_rows_truncated.clj:20)" "query_processor.middleware.format_rows$format_rows_xform$fn__47812.invoke(format_rows.clj:65)" "query_processor.middleware.limit$limit_xform$fn__47871.invoke(limit.clj:21)" "query_processor.context.default$default_reducef$fn__37435.invoke(default.clj:58)" "query_processor.context.default$default_reducef.invokeStatic(default.clj:57)" "query_processor.context.default$default_reducef.invoke(default.clj:48)" "query_processor.context$reducef.invokeStatic(context.clj:69)" "query_processor.context$reducef.invoke(context.clj:62)" "query_processor.context.default$default_runf$respond_STAR___37439.invoke(default.clj:69)" "driver.sql_jdbc.execute$execute_reducible_query.invokeStatic(execute.clj:485)" "driver.sql_jdbc.execute$execute_reducible_query.invoke(execute.clj:463)" "driver.sql_jdbc.execute$execute_reducible_query.invokeStatic(execute.clj:472)" "driver.sql_jdbc.execute$execute_reducible_query.invoke(execute.clj:463)" 
"driver.sql_jdbc$fn__82291.invokeStatic(sql_jdbc.clj:54)" "driver.sql_jdbc$fn__82291.invoke(sql_jdbc.clj:52)" "driver.h2$fn__81043.invokeStatic(h2.clj:83)" "driver.h2$fn__81043.invoke(h2.clj:80)" "query_processor.context$executef.invokeStatic(context.clj:59)" "query_processor.context$executef.invoke(context.clj:48)" "query_processor.context.default$default_runf.invokeStatic(default.clj:68)" "query_processor.context.default$default_runf.invoke(default.clj:66)" "query_processor.context$runf.invokeStatic(context.clj:45)" "query_processor.context$runf.invoke(context.clj:39)" "query_processor.reducible$pivot.invokeStatic(reducible.clj:34)" "query_processor.reducible$pivot.invoke(reducible.clj:31)" "query_processor.middleware.mbql_to_native$mbql__GT_native$fn__47888.invoke(mbql_to_native.clj:25)" "query_processor.middleware.check_features$check_features$fn__47002.invoke(check_features.clj:39)" "query_processor.middleware.limit$limit$fn__47874.invoke(limit.clj:37)" "query_processor.middleware.cache$maybe_return_cached_results$fn__46454.invoke(cache.clj:204)" "query_processor.middleware.optimize_temporal_filters$optimize_temporal_filters$fn__48134.invoke(optimize_temporal_filters.clj:204)" "query_processor.middleware.validate_temporal_bucketing$validate_temporal_bucketing$fn__50066.invoke(validate_temporal_bucketing.clj:50)" "query_processor.middleware.auto_parse_filter_values$auto_parse_filter_values$fn__45573.invoke(auto_parse_filter_values.clj:43)" "query_processor.middleware.wrap_value_literals$wrap_value_literals$fn__41707.invoke(wrap_value_literals.clj:161)" "query_processor.middleware.annotate$add_column_info$fn__41582.invoke(annotate.clj:608)" "query_processor.middleware.permissions$check_query_permissions$fn__46874.invoke(permissions.clj:81)" "query_processor.middleware.pre_alias_aggregations$pre_alias_aggregations$fn__48995.invoke(pre_alias_aggregations.clj:40)" 
"query_processor.middleware.cumulative_aggregations$handle_cumulative_aggregations$fn__47075.invoke(cumulative_aggregations.clj:60)" "query_processor.middleware.resolve_joined_fields$resolve_joined_fields$fn__49292.invoke(resolve_joined_fields.clj:102)" "query_processor.middleware.resolve_joins$resolve_joins$fn__49605.invoke(resolve_joins.clj:171)" "query_processor.middleware.add_implicit_joins$add_implicit_joins$fn__45149.invoke(add_implicit_joins.clj:190)" "query_processor.middleware.large_int_id$convert_id_to_string$fn__47838.invoke(large_int_id.clj:59)" "query_processor.middleware.format_rows$format_rows$fn__47819.invoke(format_rows.clj:74)" "query_processor.middleware.add_default_temporal_unit$add_default_temporal_unit$fn__44443.invoke(add_default_temporal_unit.clj:23)" "query_processor.middleware.desugar$desugar$fn__47141.invoke(desugar.clj:21)" "query_processor.middleware.binning$update_binning_strategy$fn__45960.invoke(binning.clj:229)" "query_processor.middleware.resolve_fields$resolve_fields$fn__46677.invoke(resolve_fields.clj:34)" "query_processor.middleware.add_dimension_projections$add_remapping$fn__44798.invoke(add_dimension_projections.clj:314)" "query_processor.middleware.add_implicit_clauses$add_implicit_clauses$fn__45027.invoke(add_implicit_clauses.clj:147)" "query_processor.middleware.upgrade_field_literals$upgrade_field_literals$fn__50015.invoke(upgrade_field_literals.clj:40)" "query_processor.middleware.add_source_metadata$add_source_metadata_for_source_queries$fn__45312.invoke(add_source_metadata.clj:123)" "query_processor.middleware.reconcile_breakout_and_order_by_bucketing$reconcile_breakout_and_order_by_bucketing$fn__49167.invoke(reconcile_breakout_and_order_by_bucketing.clj:100)" "query_processor.middleware.auto_bucket_datetimes$auto_bucket_datetimes$fn__45520.invoke(auto_bucket_datetimes.clj:147)" "query_processor.middleware.resolve_source_table$resolve_source_tables$fn__46724.invoke(resolve_source_table.clj:45)" 
"query_processor.middleware.parameters$substitute_parameters$fn__48977.invoke(parameters.clj:111)" "query_processor.middleware.resolve_referenced$resolve_referenced_card_resources$fn__46776.invoke(resolve_referenced.clj:79)" "query_processor.middleware.expand_macros$expand_macros$fn__47525.invoke(expand_macros.clj:184)" "query_processor.middleware.add_timezone_info$add_timezone_info$fn__45321.invoke(add_timezone_info.clj:15)" "query_processor.middleware.splice_params_in_response$splice_params_in_response$fn__49968.invoke(splice_params_in_response.clj:32)" "query_processor.middleware.resolve_database_and_driver$resolve_database_and_driver$fn__49178$fn__49182.invoke(resolve_database_and_driver.clj:31)" "driver$do_with_driver.invokeStatic(driver.clj:60)" "driver$do_with_driver.invoke(driver.clj:56)" "query_processor.middleware.resolve_database_and_driver$resolve_database_and_driver$fn__49178.invoke(resolve_database_and_driver.clj:25)" "query_processor.middleware.fetch_source_query$resolve_card_id_source_tables$fn__47765.invoke(fetch_source_query.clj:274)" "query_processor.middleware.store$initialize_store$fn__49977$fn__49978.invoke(store.clj:11)" "query_processor.store$do_with_store.invokeStatic(store.clj:42)" "query_processor.store$do_with_store.invoke(store.clj:38)" "query_processor.middleware.store$initialize_store$fn__49977.invoke(store.clj:10)" "query_processor.middleware.validate$validate_query$fn__50022.invoke(validate.clj:10)" "query_processor.middleware.normalize_query$normalize$fn__47901.invoke(normalize_query.clj:22)" "query_processor.middleware.add_rows_truncated$add_rows_truncated$fn__45167.invoke(add_rows_truncated.clj:35)" "query_processor.middleware.results_metadata$record_and_return_metadata_BANG_$fn__49953.invoke(results_metadata.clj:147)" "query_processor.middleware.constraints$add_default_userland_constraints$fn__47018.invoke(constraints.clj:42)" 
"query_processor.middleware.process_userland_query$process_userland_query$fn__49064.invoke(process_userland_query.clj:134)" "query_processor.middleware.catch_exceptions$catch_exceptions$fn__46958.invoke(catch_exceptions.clj:173)" "query_processor.reducible$async_qp$qp_STAR___38246$thunk__38247.invoke(reducible.clj:103)" "query_processor.reducible$async_qp$qp_STAR___38246$fn__38249.invoke(reducible.clj:108)"], :error_type :invalid-query, :ex-data {:sql "-- Metabase:: userID: 1 queryType: MBQL queryHash: 6ea39a4dd0b199248a7b337cf22b368a2829f4917bf45b1a2526f624714f9ccc\nSELECT \"source\".\"pivot-grouping\" AS \"pivot-grouping\", count(*) AS \"count\" FROM (SELECT abs(1) AS \"pivot-grouping\", \"PUBLIC\".\"PRODUCTS\".\"CATEGORY\" AS \"CATEGORY\" FROM \"PUBLIC\".\"PRODUCTS\") \"source\" GROUP BY \"source\".\"pivot-grouping\" ORDER BY \"source\".\"CATEGORY\" ASC, \"source\".\"pivot-grouping\" ASC", :params nil, :type :invalid-query}}], :state "90016", :error_type :invalid-query, :json_query {:database 4, :query {:source-table 10, :aggregation [[:count]], :breakout ([:expression "pivot-grouping"]), :order-by [[:asc [:field 107 nil]]], :expressions {:pivot-grouping [:abs 1]}}, :type :query, :pivot-rows [0], :async? 
true}, :native {:query "SELECT \"source\".\"pivot-grouping\" AS \"pivot-grouping\", count(*) AS \"count\" FROM (SELECT abs(1) AS \"pivot-grouping\", \"PUBLIC\".\"PRODUCTS\".\"CATEGORY\" AS \"CATEGORY\" FROM \"PUBLIC\".\"PRODUCTS\") \"source\" GROUP BY \"source\".\"pivot-grouping\" ORDER BY \"source\".\"CATEGORY\" ASC, \"source\".\"pivot-grouping\" ASC", :params nil}, :status :failed, :class org.h2.jdbc.JdbcSQLException, :stacktrace ["org.h2.message.DbException.getJdbcSQLException(DbException.java:357)" "org.h2.message.DbException.get(DbException.java:179)" "org.h2.message.DbException.get(DbException.java:155)" "org.h2.expression.ExpressionColumn.updateAggregate(ExpressionColumn.java:172)" "org.h2.command.dml.Select.queryGroup(Select.java:350)" "org.h2.command.dml.Select.queryWithoutCache(Select.java:628)" "org.h2.command.dml.Query.queryWithoutCacheLazyCheck(Query.java:114)" "org.h2.command.dml.Query.query(Query.java:371)" "org.h2.command.dml.Query.query(Query.java:333)" "org.h2.command.CommandContainer.query(CommandContainer.java:114)" "org.h2.command.Command.executeQuery(Command.java:202)" "org.h2.jdbc.JdbcStatement.executeInternal(JdbcStatement.java:227)" "org.h2.jdbc.JdbcStatement.execute(JdbcStatement.java:205)" "com.mchange.v2.c3p0.impl.NewProxyStatement.execute(NewProxyStatement.java:75)" "--> driver.sql_jdbc.execute$fn__80722.invokeStatic(execute.clj:344)" "driver.sql_jdbc.execute$fn__80722.invoke(execute.clj:342)" "driver.sql_jdbc.execute$execute_statement_or_prepared_statement_BANG_.invokeStatic(execute.clj:352)" "driver.sql_jdbc.execute$execute_statement_or_prepared_statement_BANG_.invoke(execute.clj:349)" "driver.sql_jdbc.execute$execute_reducible_query$fn__80802.invoke(execute.clj:478)" "driver.sql_jdbc.execute$execute_reducible_query.invokeStatic(execute.clj:477)" "driver.sql_jdbc.execute$execute_reducible_query.invoke(execute.clj:463)" "driver.sql_jdbc.execute$execute_reducible_query.invokeStatic(execute.clj:472)" 
"driver.sql_jdbc.execute$execute_reducible_query.invoke(execute.clj:463)" "driver.sql_jdbc$fn__82291.invokeStatic(sql_jdbc.clj:54)" "driver.sql_jdbc$fn__82291.invoke(sql_jdbc.clj:52)" "driver.h2$fn__81043.invokeStatic(h2.clj:83)" "driver.h2$fn__81043.invoke(h2.clj:80)" "query_processor.context$executef.invokeStatic(context.clj:59)" "query_processor.context$executef.invoke(context.clj:48)" "query_processor.context.default$default_runf.invokeStatic(default.clj:68)" "query_processor.context.default$default_runf.invoke(default.clj:66)" "query_processor.context$runf.invokeStatic(context.clj:45)" "query_processor.context$runf.invoke(context.clj:39)" "query_processor.reducible$pivot.invokeStatic(reducible.clj:34)" "query_processor.reducible$pivot.invoke(reducible.clj:31)" "query_processor.middleware.mbql_to_native$mbql__GT_native$fn__47888.invoke(mbql_to_native.clj:25)" "query_processor.middleware.check_features$check_features$fn__47002.invoke(check_features.clj:39)" "query_processor.middleware.limit$limit$fn__47874.invoke(limit.clj:37)" "query_processor.middleware.cache$maybe_return_cached_results$fn__46454.invoke(cache.clj:204)" "query_processor.middleware.optimize_temporal_filters$optimize_temporal_filters$fn__48134.invoke(optimize_temporal_filters.clj:204)" "query_processor.middleware.validate_temporal_bucketing$validate_temporal_bucketing$fn__50066.invoke(validate_temporal_bucketing.clj:50)" "query_processor.middleware.auto_parse_filter_values$auto_parse_filter_values$fn__45573.invoke(auto_parse_filter_values.clj:43)" "query_processor.middleware.wrap_value_literals$wrap_value_literals$fn__41707.invoke(wrap_value_literals.clj:161)" "query_processor.middleware.annotate$add_column_info$fn__41582.invoke(annotate.clj:608)" "query_processor.middleware.permissions$check_query_permissions$fn__46874.invoke(permissions.clj:81)" "query_processor.middleware.pre_alias_aggregations$pre_alias_aggregations$fn__48995.invoke(pre_alias_aggregations.clj:40)" 
"query_processor.middleware.cumulative_aggregations$handle_cumulative_aggregations$fn__47075.invoke(cumulative_aggregations.clj:60)" "query_processor.middleware.resolve_joined_fields$resolve_joined_fields$fn__49292.invoke(resolve_joined_fields.clj:102)" "query_processor.middleware.resolve_joins$resolve_joins$fn__49605.invoke(resolve_joins.clj:171)" "query_processor.middleware.add_implicit_joins$add_implicit_joins$fn__45149.invoke(add_implicit_joins.clj:190)" "query_processor.middleware.large_int_id$convert_id_to_string$fn__47838.invoke(large_int_id.clj:59)" "query_processor.middleware.format_rows$format_rows$fn__47819.invoke(format_rows.clj:74)" "query_processor.middleware.add_default_temporal_unit$add_default_temporal_unit$fn__44443.invoke(add_default_temporal_unit.clj:23)" "query_processor.middleware.desugar$desugar$fn__47141.invoke(desugar.clj:21)" "query_processor.middleware.binning$update_binning_strategy$fn__45960.invoke(binning.clj:229)" "query_processor.middleware.resolve_fields$resolve_fields$fn__46677.invoke(resolve_fields.clj:34)" "query_processor.middleware.add_dimension_projections$add_remapping$fn__44798.invoke(add_dimension_projections.clj:314)" "query_processor.middleware.add_implicit_clauses$add_implicit_clauses$fn__45027.invoke(add_implicit_clauses.clj:147)" "query_processor.middleware.upgrade_field_literals$upgrade_field_literals$fn__50015.invoke(upgrade_field_literals.clj:40)" "query_processor.middleware.add_source_metadata$add_source_metadata_for_source_queries$fn__45312.invoke(add_source_metadata.clj:123)" "query_processor.middleware.reconcile_breakout_and_order_by_bucketing$reconcile_breakout_and_order_by_bucketing$fn__49167.invoke(reconcile_breakout_and_order_by_bucketing.clj:100)" "query_processor.middleware.auto_bucket_datetimes$auto_bucket_datetimes$fn__45520.invoke(auto_bucket_datetimes.clj:147)" "query_processor.middleware.resolve_source_table$resolve_source_tables$fn__46724.invoke(resolve_source_table.clj:45)" 
"query_processor.middleware.parameters$substitute_parameters$fn__48977.invoke(parameters.clj:111)" "query_processor.middleware.resolve_referenced$resolve_referenced_card_resources$fn__46776.invoke(resolve_referenced.clj:79)" "query_processor.middleware.expand_macros$expand_macros$fn__47525.invoke(expand_macros.clj:184)" "query_processor.middleware.add_timezone_info$add_timezone_info$fn__45321.invoke(add_timezone_info.clj:15)" "query_processor.middleware.splice_params_in_response$splice_params_in_response$fn__49968.invoke(splice_params_in_response.clj:32)" "query_processor.middleware.resolve_database_and_driver$resolve_database_and_driver$fn__49178$fn__49182.invoke(resolve_database_and_driver.clj:31)" "driver$do_with_driver.invokeStatic(driver.clj:60)" "driver$do_with_driver.invoke(driver.clj:56)" "query_processor.middleware.resolve_database_and_driver$resolve_database_and_driver$fn__49178.invoke(resolve_database_and_driver.clj:25)" "query_processor.middleware.fetch_source_query$resolve_card_id_source_tables$fn__47765.invoke(fetch_source_query.clj:274)" "query_processor.middleware.store$initialize_store$fn__49977$fn__49978.invoke(store.clj:11)" "query_processor.store$do_with_store.invokeStatic(store.clj:42)" "query_processor.store$do_with_store.invoke(store.clj:38)" "query_processor.middleware.store$initialize_store$fn__49977.invoke(store.clj:10)" "query_processor.middleware.validate$validate_query$fn__50022.invoke(validate.clj:10)" "query_processor.middleware.normalize_query$normalize$fn__47901.invoke(normalize_query.clj:22)" "query_processor.middleware.add_rows_truncated$add_rows_truncated$fn__45167.invoke(add_rows_truncated.clj:35)" "query_processor.middleware.results_metadata$record_and_return_metadata_BANG_$fn__49953.invoke(results_metadata.clj:147)" "query_processor.middleware.constraints$add_default_userland_constraints$fn__47018.invoke(constraints.clj:42)" 
"query_processor.middleware.process_userland_query$process_userland_query$fn__49064.invoke(process_userland_query.clj:134)" "query_processor.middleware.catch_exceptions$catch_exceptions$fn__46958.invoke(catch_exceptions.clj:173)" "query_processor.reducible$async_qp$qp_STAR___38246$thunk__38247.invoke(reducible.clj:103)" "query_processor.reducible$async_qp$qp_STAR___38246.invoke(reducible.clj:109)" "query_processor.reducible$sync_qp$qp_STAR___38255$fn__38258.invoke(reducible.clj:135)" "query_processor.reducible$sync_qp$qp_STAR___38255.invoke(reducible.clj:134)" "query_processor.pivot$process_query_append_results.invokeStatic(pivot.clj:131)" "query_processor.pivot$process_query_append_results.invoke(pivot.clj:118)" "query_processor.pivot$process_queries_append_results$fn__55728.invoke(pivot.clj:142)" "query_processor.pivot$process_queries_append_results.invokeStatic(pivot.clj:140)" "query_processor.pivot$process_queries_append_results.invoke(pivot.clj:137)" "query_processor.pivot$append_queries_context$fn__55732$fn__55733$fn__55734.invoke(pivot.clj:157)" "query_processor.middleware.process_userland_query$add_and_save_execution_info_xform_BANG_$execution_info_rf_STAR___49055.invoke(process_userland_query.clj:87)" "query_processor.middleware.results_metadata$insights_xform$combine__49947.invoke(results_metadata.clj:131)" "query_processor.reducible$combine_additional_reducing_fns$fn__38269.invoke(reducible.clj:206)" "query_processor.middleware.add_rows_truncated$add_rows_truncated_xform$fn__45162.invoke(add_rows_truncated.clj:20)" "query_processor.middleware.format_rows$format_rows_xform$fn__47812.invoke(format_rows.clj:65)" "query_processor.middleware.limit$limit_xform$fn__47871.invoke(limit.clj:21)" "query_processor.context.default$default_reducef$fn__37435.invoke(default.clj:58)" "query_processor.context.default$default_reducef.invokeStatic(default.clj:57)" "query_processor.context.default$default_reducef.invoke(default.clj:48)" 
"query_processor.context$reducef.invokeStatic(context.clj:69)" "query_processor.context$reducef.invoke(context.clj:62)" "query_processor.context.default$default_runf$respond_STAR___37439.invoke(default.clj:69)" "driver.sql_jdbc.execute$execute_reducible_query.invokeStatic(execute.clj:485)" "driver.sql_jdbc.execute$execute_reducible_query.invoke(execute.clj:463)" "driver.sql_jdbc.execute$execute_reducible_query.invokeStatic(execute.clj:472)" "driver.sql_jdbc.execute$execute_reducible_query.invoke(execute.clj:463)" "driver.sql_jdbc$fn__82291.invokeStatic(sql_jdbc.clj:54)" "driver.sql_jdbc$fn__82291.invoke(sql_jdbc.clj:52)" "driver.h2$fn__81043.invokeStatic(h2.clj:83)" "driver.h2$fn__81043.invoke(h2.clj:80)" "query_processor.context$executef.invokeStatic(context.clj:59)" "query_processor.context$executef.invoke(context.clj:48)" "query_processor.context.default$default_runf.invokeStatic(default.clj:68)" "query_processor.context.default$default_runf.invoke(default.clj:66)" "query_processor.context$runf.invokeStatic(context.clj:45)" "query_processor.context$runf.invoke(context.clj:39)" "query_processor.reducible$pivot.invokeStatic(reducible.clj:34)" "query_processor.reducible$pivot.invoke(reducible.clj:31)" "query_processor.middleware.mbql_to_native$mbql__GT_native$fn__47888.invoke(mbql_to_native.clj:25)" "query_processor.middleware.check_features$check_features$fn__47002.invoke(check_features.clj:39)" "query_processor.middleware.limit$limit$fn__47874.invoke(limit.clj:37)" "query_processor.middleware.cache$maybe_return_cached_results$fn__46454.invoke(cache.clj:204)" "query_processor.middleware.optimize_temporal_filters$optimize_temporal_filters$fn__48134.invoke(optimize_temporal_filters.clj:204)" "query_processor.middleware.validate_temporal_bucketing$validate_temporal_bucketing$fn__50066.invoke(validate_temporal_bucketing.clj:50)" "query_processor.middleware.auto_parse_filter_values$auto_parse_filter_values$fn__45573.invoke(auto_parse_filter_values.clj:43)" 
"query_processor.middleware.wrap_value_literals$wrap_value_literals$fn__41707.invoke(wrap_value_literals.clj:161)" "query_processor.middleware.annotate$add_column_info$fn__41582.invoke(annotate.clj:608)" "query_processor.middleware.permissions$check_query_permissions$fn__46874.invoke(permissions.clj:81)" "query_processor.middleware.pre_alias_aggregations$pre_alias_aggregations$fn__48995.invoke(pre_alias_aggregations.clj:40)" "query_processor.middleware.cumulative_aggregations$handle_cumulative_aggregations$fn__47075.invoke(cumulative_aggregations.clj:60)" "query_processor.middleware.resolve_joined_fields$resolve_joined_fields$fn__49292.invoke(resolve_joined_fields.clj:102)" "query_processor.middleware.resolve_joins$resolve_joins$fn__49605.invoke(resolve_joins.clj:171)" "query_processor.middleware.add_implicit_joins$add_implicit_joins$fn__45149.invoke(add_implicit_joins.clj:190)" "query_processor.middleware.large_int_id$convert_id_to_string$fn__47838.invoke(large_int_id.clj:59)" "query_processor.middleware.format_rows$format_rows$fn__47819.invoke(format_rows.clj:74)" "query_processor.middleware.add_default_temporal_unit$add_default_temporal_unit$fn__44443.invoke(add_default_temporal_unit.clj:23)" "query_processor.middleware.desugar$desugar$fn__47141.invoke(desugar.clj:21)" "query_processor.middleware.binning$update_binning_strategy$fn__45960.invoke(binning.clj:229)" "query_processor.middleware.resolve_fields$resolve_fields$fn__46677.invoke(resolve_fields.clj:34)" "query_processor.middleware.add_dimension_projections$add_remapping$fn__44798.invoke(add_dimension_projections.clj:314)" "query_processor.middleware.add_implicit_clauses$add_implicit_clauses$fn__45027.invoke(add_implicit_clauses.clj:147)" "query_processor.middleware.upgrade_field_literals$upgrade_field_literals$fn__50015.invoke(upgrade_field_literals.clj:40)" "query_processor.middleware.add_source_metadata$add_source_metadata_for_source_queries$fn__45312.invoke(add_source_metadata.clj:123)" 
"query_processor.middleware.reconcile_breakout_and_order_by_bucketing$reconcile_breakout_and_order_by_bucketing$fn__49167.invoke(reconcile_breakout_and_order_by_bucketing.clj:100)" "query_processor.middleware.auto_bucket_datetimes$auto_bucket_datetimes$fn__45520.invoke(auto_bucket_datetimes.clj:147)" "query_processor.middleware.resolve_source_table$resolve_source_tables$fn__46724.invoke(resolve_source_table.clj:45)" "query_processor.middleware.parameters$substitute_parameters$fn__48977.invoke(parameters.clj:111)" "query_processor.middleware.resolve_referenced$resolve_referenced_card_resources$fn__46776.invoke(resolve_referenced.clj:79)" "query_processor.middleware.expand_macros$expand_macros$fn__47525.invoke(expand_macros.clj:184)" "query_processor.middleware.add_timezone_info$add_timezone_info$fn__45321.invoke(add_timezone_info.clj:15)" "query_processor.middleware.splice_params_in_response$splice_params_in_response$fn__49968.invoke(splice_params_in_response.clj:32)" "query_processor.middleware.resolve_database_and_driver$resolve_database_and_driver$fn__49178$fn__49182.invoke(resolve_database_and_driver.clj:31)" "driver$do_with_driver.invokeStatic(driver.clj:60)" "driver$do_with_driver.invoke(driver.clj:56)" "query_processor.middleware.resolve_database_and_driver$resolve_database_and_driver$fn__49178.invoke(resolve_database_and_driver.clj:25)" "query_processor.middleware.fetch_source_query$resolve_card_id_source_tables$fn__47765.invoke(fetch_source_query.clj:274)" "query_processor.middleware.store$initialize_store$fn__49977$fn__49978.invoke(store.clj:11)" "query_processor.store$do_with_store.invokeStatic(store.clj:42)" "query_processor.store$do_with_store.invoke(store.clj:38)" "query_processor.middleware.store$initialize_store$fn__49977.invoke(store.clj:10)" "query_processor.middleware.validate$validate_query$fn__50022.invoke(validate.clj:10)" "query_processor.middleware.normalize_query$normalize$fn__47901.invoke(normalize_query.clj:22)" 
"query_processor.middleware.add_rows_truncated$add_rows_truncated$fn__45167.invoke(add_rows_truncated.clj:35)" "query_processor.middleware.results_metadata$record_and_return_metadata_BANG_$fn__49953.invoke(results_metadata.clj:147)" "query_processor.middleware.constraints$add_default_userland_constraints$fn__47018.invoke(constraints.clj:42)" "query_processor.middleware.process_userland_query$process_userland_query$fn__49064.invoke(process_userland_query.clj:134)" "query_processor.middleware.catch_exceptions$catch_exceptions$fn__46958.invoke(catch_exceptions.clj:173)" "query_processor.reducible$async_qp$qp_STAR___38246$thunk__38247.invoke(reducible.clj:103)" "query_processor.reducible$async_qp$qp_STAR___38246$fn__38249.invoke(reducible.clj:108)"], :context :ad-hoc, :error "Column \"\"\"source\"\".CATEGORY\" must be in the GROUP BY list; SQL statement:\n-- Metabase:: userID: 1 queryType: MBQL queryHash: 6ea39a4dd0b199248a7b337cf22b368a2829f4917bf45b1a2526f624714f9ccc\nSELECT \"source\".\"pivot-grouping\" AS \"pivot-grouping\", count(*) AS \"count\" FROM (SELECT abs(1) AS \"pivot-grouping\", \"PUBLIC\".\"PRODUCTS\".\"CATEGORY\" AS \"CATEGORY\" FROM \"PUBLIC\".\"PRODUCTS\") \"source\" GROUP BY \"source\".\"pivot-grouping\" ORDER BY \"source\".\"CATEGORY\" ASC, \"source\".\"pivot-grouping\" ASC [90016-197]", :row_count 0, :running_time 0, :preprocessed {:database 4, :query {:source-table 10, :aggregation [[:aggregation-options [:count] {:name "count"}]], :breakout [[:expression "pivot-grouping"]], :order-by [[:asc [:field 107 nil]] [:asc [:expression "pivot-grouping"]]], :expressions {:pivot-grouping [:abs 1]}}, :type :query, :pivot-rows [0], :async? 
true, :info {:executed-by 1, :context :ad-hoc, :query-hash [110, -93, -102, 77, -48, -79, -103, 36, -118, 123, 51, 124, -14, 43, 54, -118, 40, 41, -12, -111, 123, -12, 91, 26, 37, 38, -10, 36, 113, 79, -100, -52]}}, :data {:rows [], :cols []}} 2021-07-24 13:08:36,740 DEBUG middleware.log :: POST /api/dataset/pivot 202 [ASYNC: completed] 2.2 s (17 DB calls) App DB connections: 0/7 Jetty threads: 3/50 (4 idle, 0 queued) (68 total active threads) Queries in flight: 0 (0 queued); h2 DB 4 connections: 1/2 (0 threads blocked) ``` </details> **Information about your Metabase Installation:** Tested 0.38.0 thru 0.40.1
process
pivot table errors if question is sorted by a breakout column

Describe the bug: the pivot table errors if the question is sorted by a breakout (group-by) column; it works if sorting by an aggregation metric. A workaround is to leave the initial question unsorted and instead apply sorting in the visualization settings sidebar for the pivot table.

To reproduce: Custom question → Sample Dataset → Products → Summarize: Count by Category, sorted by Category → Visualize → select Pivot Table. The query fails with `Column "source"."CATEGORY" must be in the GROUP BY list`, or `column "source"."NAME" must appear in the GROUP BY clause or be used in an aggregate function`, depending on the database type.

Full stack trace: as quoted in the body above. The generated SQL is `SELECT "source"."pivot-grouping" AS "pivot-grouping", count(*) AS "count" FROM (SELECT abs(1) AS "pivot-grouping", "PUBLIC"."PRODUCTS"."CATEGORY" AS "CATEGORY" FROM "PUBLIC"."PRODUCTS") "source" GROUP BY "source"."pivot-grouping" ORDER BY "source"."CATEGORY" ASC, "source"."pivot-grouping" ASC`.

Information about your Metabase installation: tested 0.38.0 thru 0.40.1.
1
119,993
25,719,976,483
IssuesEvent
2022-12-07 13:03:10
dotnet/runtime
https://api.github.com/repos/dotnet/runtime
closed
Devirtualization issues in non-shared generics
tenet-performance area-CodeGen-coreclr
During benchmarking an optimization involving `EqualityComparer<T>.Default` I found an issue - if the comparer is used in a generic type devirtualization doesn't happen for some cases. In the example below only `decimal` comparison is inlined well in case of generics, but not for `long`. At the same the JIT heavily optimized the code for `long` comparison in a non-generic code. ### Generic ```csharp [SimpleJob(RuntimeMoniker.NetCoreApp50)] [DisassemblyDiagnoser] public abstract class Benchmark<T> { private static IEqualityComparer<T> s_defaultComparer = EqualityComparer<T>.Default; [Benchmark] [ArgumentsSource(nameof(Values))] public bool IsDefaultInlined(T value) => EqualityComparer<T>.Default.Equals(default, value); public IEnumerable<object> Values() { foreach (var value in ValuesCore()) yield return value; } protected abstract IEnumerable<T> ValuesCore(); } public class Int64Benchmark : Benchmark<long> { protected override IEnumerable<long> ValuesCore() => new[] { default, 2L }; } public class DecimalBenchmark : Benchmark<decimal> { protected override IEnumerable<decimal> ValuesCore() => new[] { default, 2.7M }; } ``` #### Int64Benchmark ```assembly ; Json.Benchmark`1[[System.Int64, System.Private.CoreLib]].IsDefaultInlined(Int64) sub rsp,28 mov rcx,14A100059B8 mov rcx,[rcx] mov r8,rdx xor edx,edx cmp [rcx],ecx call qword ptr [7FFC9C55F3D0] nop add rsp,28 ret ; Total bytes of code 36 ; System.Collections.Generic.GenericEqualityComparer`1[[System.Int64, System.Private.CoreLib]].Equals(Int64, Int64) cmp rdx,r8 sete al movzx eax,al ret ; Total bytes of code 10 ``` #### DecimalBenchmark ```assembly ; Json.Benchmark`1[[System.Decimal, System.Private.CoreLib]].IsDefaultInlined(System.Decimal) sub rsp,58 vzeroupper vxorps xmm0,xmm0,xmm0 vmovdqu xmmword ptr [rsp+48],xmm0 vmovdqu xmm0,xmmword ptr [rdx] vmovdqu xmmword ptr [rsp+38],xmm0 vmovupd xmm0,[rsp+38] vmovupd [rsp+28],xmm0 lea rcx,[rsp+48] lea rdx,[rsp+28] call System.Decimal+DecCalc.VarDecCmp(System.Decimal 
ByRef, System.Decimal ByRef) test eax,eax sete al movzx eax,al add rsp,58 ret ; Total bytes of code 67 ; System.Decimal+DecCalc.VarDecCmp(System.Decimal ByRef, System.Decimal ByRef) mov eax,[rdx+8] or eax,[rdx+0C] or eax,[rdx+4] jne short M01_L01 mov eax,[rcx+8] or eax,[rcx+0C] or eax,[rcx+4] jne short M01_L00 xor eax,eax ret M01_L00: mov eax,[rcx] sar eax,1F or eax,1 jmp short M01_L03 M01_L01: mov eax,[rcx+8] or eax,[rcx+0C] or eax,[rcx+4] jne short M01_L02 mov ecx,[rdx] sar ecx,1F or ecx,1 mov eax,ecx neg eax jmp short M01_L03 M01_L02: mov eax,[rcx] sar eax,1F mov r8d,[rdx] sar r8d,1F sub eax,r8d test eax,eax je short M01_L04 M01_L03: ret M01_L04: jmp near ptr System.Decimal+DecCalc.VarDecCmpSub(System.Decimal ByRef, System.Decimal ByRef) ; Total bytes of code 85 ``` ### Non generic ```csharp [SimpleJob(RuntimeMoniker.NetCoreApp50)] [DisassemblyDiagnoser] public class Int64BenchmarkNonGeneric { private static IEqualityComparer<long> s_defaultComparer = EqualityComparer<long>.Default; [Benchmark] [ArgumentsSource(nameof(Values))] public bool IsDefaultInlined(long value) => EqualityComparer<long>.Default.Equals(default, value); [Benchmark] [ArgumentsSource(nameof(Values))] public bool IsDefaultStored(long value) => s_defaultComparer.Equals(default, value); public IEnumerable<object> Values() => new object[] { 0L, 2L }; } [SimpleJob(RuntimeMoniker.NetCoreApp50)] [DisassemblyDiagnoser] public class DecimalBenchmarkNonGeneric { private static IEqualityComparer<decimal> s_defaultComparer = EqualityComparer<decimal>.Default; [Benchmark] [ArgumentsSource(nameof(Values))] public bool IsDefaultInlined(decimal value) => EqualityComparer<decimal>.Default.Equals(default, value); [Benchmark] [ArgumentsSource(nameof(Values))] public bool IsDefaultStored(decimal value) => s_defaultComparer.Equals(default, value); public IEnumerable<object> Values() => new object[] { 0M, 2M }; } ``` #### Int64BenchmarkNonGeneric ```assembly ; Json.Int64BenchmarkNonGeneric.IsDefaultInlined(Int64) 
test rdx,rdx sete al movzx eax,al ret ; Total bytes of code 10 ``` #### DecimalBenchmarkNonGeneric ```assembly ; Json.DecimalBenchmarkNonGeneric.IsDefaultInlined(System.Decimal) sub rsp,58 vzeroupper mov rcx,1F4BD271050 mov rcx,[rcx] vmovdqu xmm0,xmmword ptr [rcx+8] vmovdqu xmmword ptr [rsp+48],xmm0 vmovdqu xmm0,xmmword ptr [rdx] vmovdqu xmmword ptr [rsp+38],xmm0 vmovupd xmm0,[rsp+38] vmovupd [rsp+28],xmm0 lea rcx,[rsp+48] lea rdx,[rsp+28] call System.Decimal+DecCalc.VarDecCmp(System.Decimal ByRef, System.Decimal ByRef) test eax,eax sete al movzx eax,al add rsp,58 ret ; Total bytes of code 81 ; System.Decimal+DecCalc.VarDecCmp(System.Decimal ByRef, System.Decimal ByRef) mov eax,[rdx+8] or eax,[rdx+0C] or eax,[rdx+4] jne short M01_L01 mov eax,[rcx+8] or eax,[rcx+0C] or eax,[rcx+4] jne short M01_L00 xor eax,eax ret M01_L00: mov eax,[rcx] sar eax,1F or eax,1 jmp short M01_L03 M01_L01: mov eax,[rcx+8] or eax,[rcx+0C] or eax,[rcx+4] jne short M01_L02 mov ecx,[rdx] sar ecx,1F or ecx,1 mov eax,ecx neg eax jmp short M01_L03 M01_L02: mov eax,[rcx] sar eax,1F mov r8d,[rdx] sar r8d,1F sub eax,r8d test eax,eax je short M01_L04 M01_L03: ret M01_L04: jmp near ptr System.Decimal+DecCalc.VarDecCmpSub(System.Decimal ByRef, System.Decimal ByRef) ; Total bytes of code 85 ``` /cc @AndyAyersMS category:cq theme:devirtualization skill-level:expert cost:large
1.0
Devirtualization issues in non-shared generics - During benchmarking an optimization involving `EqualityComparer<T>.Default` I found an issue - if the comparer is used in a generic type devirtualization doesn't happen for some cases. In the example below only `decimal` comparison is inlined well in case of generics, but not for `long`. At the same the JIT heavily optimized the code for `long` comparison in a non-generic code. ### Generic ```csharp [SimpleJob(RuntimeMoniker.NetCoreApp50)] [DisassemblyDiagnoser] public abstract class Benchmark<T> { private static IEqualityComparer<T> s_defaultComparer = EqualityComparer<T>.Default; [Benchmark] [ArgumentsSource(nameof(Values))] public bool IsDefaultInlined(T value) => EqualityComparer<T>.Default.Equals(default, value); public IEnumerable<object> Values() { foreach (var value in ValuesCore()) yield return value; } protected abstract IEnumerable<T> ValuesCore(); } public class Int64Benchmark : Benchmark<long> { protected override IEnumerable<long> ValuesCore() => new[] { default, 2L }; } public class DecimalBenchmark : Benchmark<decimal> { protected override IEnumerable<decimal> ValuesCore() => new[] { default, 2.7M }; } ``` #### Int64Benchmark ```assembly ; Json.Benchmark`1[[System.Int64, System.Private.CoreLib]].IsDefaultInlined(Int64) sub rsp,28 mov rcx,14A100059B8 mov rcx,[rcx] mov r8,rdx xor edx,edx cmp [rcx],ecx call qword ptr [7FFC9C55F3D0] nop add rsp,28 ret ; Total bytes of code 36 ; System.Collections.Generic.GenericEqualityComparer`1[[System.Int64, System.Private.CoreLib]].Equals(Int64, Int64) cmp rdx,r8 sete al movzx eax,al ret ; Total bytes of code 10 ``` #### DecimalBenchmark ```assembly ; Json.Benchmark`1[[System.Decimal, System.Private.CoreLib]].IsDefaultInlined(System.Decimal) sub rsp,58 vzeroupper vxorps xmm0,xmm0,xmm0 vmovdqu xmmword ptr [rsp+48],xmm0 vmovdqu xmm0,xmmword ptr [rdx] vmovdqu xmmword ptr [rsp+38],xmm0 vmovupd xmm0,[rsp+38] vmovupd [rsp+28],xmm0 lea rcx,[rsp+48] lea rdx,[rsp+28] call 
System.Decimal+DecCalc.VarDecCmp(System.Decimal ByRef, System.Decimal ByRef) test eax,eax sete al movzx eax,al add rsp,58 ret ; Total bytes of code 67 ; System.Decimal+DecCalc.VarDecCmp(System.Decimal ByRef, System.Decimal ByRef) mov eax,[rdx+8] or eax,[rdx+0C] or eax,[rdx+4] jne short M01_L01 mov eax,[rcx+8] or eax,[rcx+0C] or eax,[rcx+4] jne short M01_L00 xor eax,eax ret M01_L00: mov eax,[rcx] sar eax,1F or eax,1 jmp short M01_L03 M01_L01: mov eax,[rcx+8] or eax,[rcx+0C] or eax,[rcx+4] jne short M01_L02 mov ecx,[rdx] sar ecx,1F or ecx,1 mov eax,ecx neg eax jmp short M01_L03 M01_L02: mov eax,[rcx] sar eax,1F mov r8d,[rdx] sar r8d,1F sub eax,r8d test eax,eax je short M01_L04 M01_L03: ret M01_L04: jmp near ptr System.Decimal+DecCalc.VarDecCmpSub(System.Decimal ByRef, System.Decimal ByRef) ; Total bytes of code 85 ``` ### Non generic ```csharp [SimpleJob(RuntimeMoniker.NetCoreApp50)] [DisassemblyDiagnoser] public class Int64BenchmarkNonGeneric { private static IEqualityComparer<long> s_defaultComparer = EqualityComparer<long>.Default; [Benchmark] [ArgumentsSource(nameof(Values))] public bool IsDefaultInlined(long value) => EqualityComparer<long>.Default.Equals(default, value); [Benchmark] [ArgumentsSource(nameof(Values))] public bool IsDefaultStored(long value) => s_defaultComparer.Equals(default, value); public IEnumerable<object> Values() => new object[] { 0L, 2L }; } [SimpleJob(RuntimeMoniker.NetCoreApp50)] [DisassemblyDiagnoser] public class DecimalBenchmarkNonGeneric { private static IEqualityComparer<decimal> s_defaultComparer = EqualityComparer<decimal>.Default; [Benchmark] [ArgumentsSource(nameof(Values))] public bool IsDefaultInlined(decimal value) => EqualityComparer<decimal>.Default.Equals(default, value); [Benchmark] [ArgumentsSource(nameof(Values))] public bool IsDefaultStored(decimal value) => s_defaultComparer.Equals(default, value); public IEnumerable<object> Values() => new object[] { 0M, 2M }; } ``` #### Int64BenchmarkNonGeneric ```assembly ; 
Json.Int64BenchmarkNonGeneric.IsDefaultInlined(Int64) test rdx,rdx sete al movzx eax,al ret ; Total bytes of code 10 ``` #### DecimalBenchmarkNonGeneric ```assembly ; Json.DecimalBenchmarkNonGeneric.IsDefaultInlined(System.Decimal) sub rsp,58 vzeroupper mov rcx,1F4BD271050 mov rcx,[rcx] vmovdqu xmm0,xmmword ptr [rcx+8] vmovdqu xmmword ptr [rsp+48],xmm0 vmovdqu xmm0,xmmword ptr [rdx] vmovdqu xmmword ptr [rsp+38],xmm0 vmovupd xmm0,[rsp+38] vmovupd [rsp+28],xmm0 lea rcx,[rsp+48] lea rdx,[rsp+28] call System.Decimal+DecCalc.VarDecCmp(System.Decimal ByRef, System.Decimal ByRef) test eax,eax sete al movzx eax,al add rsp,58 ret ; Total bytes of code 81 ; System.Decimal+DecCalc.VarDecCmp(System.Decimal ByRef, System.Decimal ByRef) mov eax,[rdx+8] or eax,[rdx+0C] or eax,[rdx+4] jne short M01_L01 mov eax,[rcx+8] or eax,[rcx+0C] or eax,[rcx+4] jne short M01_L00 xor eax,eax ret M01_L00: mov eax,[rcx] sar eax,1F or eax,1 jmp short M01_L03 M01_L01: mov eax,[rcx+8] or eax,[rcx+0C] or eax,[rcx+4] jne short M01_L02 mov ecx,[rdx] sar ecx,1F or ecx,1 mov eax,ecx neg eax jmp short M01_L03 M01_L02: mov eax,[rcx] sar eax,1F mov r8d,[rdx] sar r8d,1F sub eax,r8d test eax,eax je short M01_L04 M01_L03: ret M01_L04: jmp near ptr System.Decimal+DecCalc.VarDecCmpSub(System.Decimal ByRef, System.Decimal ByRef) ; Total bytes of code 85 ``` /cc @AndyAyersMS category:cq theme:devirtualization skill-level:expert cost:large
non_process
devirtualization issues in non shared generics during benchmarking an optimization involving equalitycomparer default i found an issue if the comparer is used in a generic type devirtualization doesn t happen for some cases in the example below only decimal comparison is inlined well in case of generics but not for long at the same the jit heavily optimized the code for long comparison in a non generic code generic csharp public abstract class benchmark private static iequalitycomparer s defaultcomparer equalitycomparer default public bool isdefaultinlined t value equalitycomparer default equals default value public ienumerable values foreach var value in valuescore yield return value protected abstract ienumerable valuescore public class benchmark protected override ienumerable valuescore new default public class decimalbenchmark benchmark protected override ienumerable valuescore new default assembly json benchmark isdefaultinlined sub rsp mov rcx mov rcx mov rdx xor edx edx cmp ecx call qword ptr nop add rsp ret total bytes of code system collections generic genericequalitycomparer equals cmp rdx sete al movzx eax al ret total bytes of code decimalbenchmark assembly json benchmark isdefaultinlined system decimal sub rsp vzeroupper vxorps vmovdqu xmmword ptr vmovdqu xmmword ptr vmovdqu xmmword ptr vmovupd vmovupd lea rcx lea rdx call system decimal deccalc vardeccmp system decimal byref system decimal byref test eax eax sete al movzx eax al add rsp ret total bytes of code system decimal deccalc vardeccmp system decimal byref system decimal byref mov eax or eax or eax jne short mov eax or eax or eax jne short xor eax eax ret mov eax sar eax or eax jmp short mov eax or eax or eax jne short mov ecx sar ecx or ecx mov eax ecx neg eax jmp short mov eax sar eax mov sar sub eax test eax eax je short ret jmp near ptr system decimal deccalc vardeccmpsub system decimal byref system decimal byref total bytes of code non generic csharp public class private static 
iequalitycomparer s defaultcomparer equalitycomparer default public bool isdefaultinlined long value equalitycomparer default equals default value public bool isdefaultstored long value s defaultcomparer equals default value public ienumerable values new object public class decimalbenchmarknongeneric private static iequalitycomparer s defaultcomparer equalitycomparer default public bool isdefaultinlined decimal value equalitycomparer default equals default value public bool isdefaultstored decimal value s defaultcomparer equals default value public ienumerable values new object assembly json isdefaultinlined test rdx rdx sete al movzx eax al ret total bytes of code decimalbenchmarknongeneric assembly json decimalbenchmarknongeneric isdefaultinlined system decimal sub rsp vzeroupper mov rcx mov rcx vmovdqu xmmword ptr vmovdqu xmmword ptr vmovdqu xmmword ptr vmovdqu xmmword ptr vmovupd vmovupd lea rcx lea rdx call system decimal deccalc vardeccmp system decimal byref system decimal byref test eax eax sete al movzx eax al add rsp ret total bytes of code system decimal deccalc vardeccmp system decimal byref system decimal byref mov eax or eax or eax jne short mov eax or eax or eax jne short xor eax eax ret mov eax sar eax or eax jmp short mov eax or eax or eax jne short mov ecx sar ecx or ecx mov eax ecx neg eax jmp short mov eax sar eax mov sar sub eax test eax eax je short ret jmp near ptr system decimal deccalc vardeccmpsub system decimal byref system decimal byref total bytes of code cc andyayersms category cq theme devirtualization skill level expert cost large
0
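The records in this dump are flattened one field per line. As a small illustration of how one record could be regrouped, here is a Python sketch; the field order in `FIELDS` is an assumption read off the header columns and the records in this file, not a documented schema:

```python
# Sketch: regroup one flattened GitHub-issue-event record into a dict.
# FIELDS is an assumption inferred from this dump, not a documented schema;
# multi-line fields (body, text_combine, text) are expected to be joined
# into a single string beforehand.
FIELDS = [
    "row_index", "unnamed_0", "id", "type", "created_at",
    "repo", "repo_url", "action", "title", "labels",
    "body", "score", "text_combine", "label", "text", "binary_label",
]

def regroup(field_values):
    """Zip a record's values onto the assumed column names."""
    if len(field_values) != len(FIELDS):
        raise ValueError(f"expected {len(FIELDS)} fields, got {len(field_values)}")
    return dict(zip(FIELDS, field_values))

row = regroup([
    "1", "119,993", "25,719,976,483", "IssuesEvent", "2022-12-07 13:03:10",
    "dotnet/runtime", "https://api.github.com/repos/dotnet/runtime", "closed",
    "Devirtualization issues in non-shared generics",
    "tenet-performance area-CodeGen-coreclr",
    "<body text>", "1.0", "<title + body>", "non_process",
    "<normalized text>", "0",
])
```

The `label`/`binary_label` pair at the end is what distinguishes `process` (1) from `non_process` (0) records throughout this dump.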
16,825
22,060,958,766
IssuesEvent
2022-05-30 17:45:30
bitPogo/kmock
https://api.github.com/repos/bitPogo/kmock
closed
Make freezing behaviour globally configurable
enhancement kmock-processor kmock-gradle
## Description <!--- Provide a detailed introduction to the issue itself, and why you consider it to be a bug --> Currently `freeze` is always `true` by default, which can harm the dev experience even if it is not needed. A simple solution is to make it configurable via the Extension of the GradlePlugin. Acceptance Criteria: - The GradleExtension has a new field, which is true by default but can be switched to false - this value gets propagated to the Processor, which uses it as default for the factories.
1.0
Make freezing behaviour globally configurable - ## Description <!--- Provide a detailed introduction to the issue itself, and why you consider it to be a bug --> Currently `freeze` is always `true` by default, which can harm the dev experience even if it is not needed. A simple solution is to make it configurable via the Extension of the GradlePlugin. Acceptance Criteria: - The GradleExtension has a new field, which is true by default but can be switched to false - this value gets propagated to the Processor, which uses it as default for the factories.
process
make freezing behaviour globally configurable description currently freeze is always true by default which can harm the dev experience even if it is not needed simple solution is to make it configurable via the extension of the gradleplugin acceptance criteria the gradleextension has a new field which is true on default but can be switched to false this value gets propagated to the processor which uses it as default for the factories
1
5,559
8,402,790,295
IssuesEvent
2018-10-11 07:55:42
mozilla-tw/ScreenshotGo
https://api.github.com/repos/mozilla-tw/ScreenshotGo
closed
Enable/integrate Adjust
P0 process
To enable Adjust for ScreenshotGo, Dev. will need a token with the right package name from Marketing.
1.0
Enable/integrate Adjust - To enable Adjust for ScreenshotGo, Dev. will need a token with the right package name from Marketing.
process
enable integrate adjust to enable adjust for screenshotgo dev will need a token with the right package name from marketing
1
135,273
10,968,082,696
IssuesEvent
2019-11-28 10:50:43
imixs/imixs-workflow
https://api.github.com/repos/imixs/imixs-workflow
closed
Extend Health Service
enhancement testing
Extend the Health Service and provide the engine version. A model version count of 0 should also indicate UP status.
1.0
Extend Health Service - Extend the Health Service and provide the engine version. A model version count of 0 should also indicate UP status.
non_process
extend health service extend the health service and provide the engine version also a model version count of should also indicate up status
0
14,176
17,088,674,937
IssuesEvent
2021-07-08 14:47:08
prisma/prisma
https://api.github.com/repos/prisma/prisma
closed
Investigate and remove usage of `--forceExit`
kind/tech process/candidate team/client team/migrations tech/typescript topic: node-api topic: tests
- [ ] migrate - [x] E2E - [x] sdk - [x] client - [x] integration-tests - [x] workflows https://sourcegraph.com/search?q=context:global+repo:github.com/prisma/prisma+--forceExit+count:1000&patternType=literal https://sourcegraph.com/search?q=context:global+repo:%5Egithub%5C.com/prisma/e2e-tests%24+forceExit&patternType=literal Especially if introduced for Node-API, this is something we should try to get rid of again before GA.
1.0
Investigate and remove usage of `--forceExit` - - [ ] migrate - [x] E2E - [x] sdk - [x] client - [x] integration-tests - [x] workflows https://sourcegraph.com/search?q=context:global+repo:github.com/prisma/prisma+--forceExit+count:1000&patternType=literal https://sourcegraph.com/search?q=context:global+repo:%5Egithub%5C.com/prisma/e2e-tests%24+forceExit&patternType=literal Especially if introduced for Node-API, this is something we should try to get rid of again before GA.
process
investigate and remove usage of forceexit migrate sdk client integration tests workflows especially if introduced for node api this is something we should try to get rid of again before ga
1
16,238
20,792,829,944
IssuesEvent
2022-03-17 05:20:38
crim-ca/weaver
https://api.github.com/repos/crim-ca/weaver
closed
WPS hostname not replaced in XML response when behind server proxy
triage/bug triage/enhancement process/wps1 process/wps2
For example, when calling `${SERVER_HOSTNAME}/ows/wps?request=GetCapabilities&service=WPS`, the returned XML document will have `https://localhost:4000` in place of `${SERVER_HOSTNAME}`, i.e. instead of the actual server hostname as defined by the `weaver.url` configuration parameter.
2.0
WPS hostname not replaced in XML response when behind server proxy - For example, when calling `${SERVER_HOSTNAME}/ows/wps?request=GetCapabilities&service=WPS`, the returned XML document will have `https://localhost:4000` in place of `${SERVER_HOSTNAME}`, i.e. instead of the actual server hostname as defined by the `weaver.url` configuration parameter.
process
wps hostname not replaced in xml response when behind server proxy for example calling server hostname ows wps request getcapabilities service wps the returned xml document will have in place of server hostname instead of the actual server hostname as defined by weaver url configuration parameter
1
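The fix the report asks for (substituting the configured public URL from `weaver.url` for the internal hostname in the returned XML) can be sketched as a plain text substitution. This is only an illustration of the idea; the function name and both URLs are hypothetical, not Weaver's actual implementation:

```python
def rewrite_capabilities(xml_text: str, internal_url: str, public_url: str) -> str:
    """Replace the internal service URL that the WPS engine embeds in its
    XML responses with the externally visible URL of the proxy.
    (Hypothetical helper, for illustration only.)"""
    return xml_text.replace(internal_url, public_url)

# Toy response fragment using the internal hostname from the report.
xml = '<Operation><Get href="https://localhost:4000/ows/wps"/></Operation>'
fixed = rewrite_capabilities(xml, "https://localhost:4000", "https://example.org")
```

A real implementation would likely apply this at response-serialization time so every generated link picks up the configured public URL.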
5,989
8,805,374,680
IssuesEvent
2018-12-26 19:14:01
dita-ot/dita-ot
https://api.github.com/repos/dita-ot/dita-ot
closed
Preprocessing conref step does not fix up external links
bug preprocess/conref stale
https://dita-ot.github.io/1.8/dev_ref/preprocess-conref.html describes how links internal to the conreffed content are randomised to make them unique within the topic they get included in, and how internal links within the conreffed content are updated to match the new targets. That works fine, but this breaks when content external to the conreffed content links inside the conreffed content. The external links are not fixed up by the process and they are thus broken.
1.0
Preprocessing conref step does not fix up external links - https://dita-ot.github.io/1.8/dev_ref/preprocess-conref.html describes how links internal to the conreffed content are randomised to make them unique within the topic they get included in, and how internal links within the conreffed content are updated to match the new targets. That works fine, but this breaks when content external to the conreffed content links inside the conreffed content. The external links are not fixed up by the process and they are thus broken.
process
preprocessing conref step does not fix up external links describes how links internal to the conreffed content are randomised to make them unique within the topic they get included in and how internal links within the conreffed content are updated to match the new targets that works fine but this breaks when content external to the conreffed content links inside the conreffed content the external links are not fixed up by the process and they are thus broken
1
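The missing fix-up described above (links from outside the conreffed content that point into it) amounts to applying the same original-id to randomized-id mapping to those external references as well. A minimal sketch, with a hypothetical mapping format and function name rather than DITA-OT's actual code:

```python
import re

def fix_external_refs(text: str, id_map: dict) -> str:
    """Rewrite href fragments like '#topic/oldid' whose target id was
    randomized during conref resolution, using an old-id -> new-id map.
    (Illustration only; not the DITA-OT implementation.)"""
    def sub(match):
        target = match.group(1)
        return f'href="#{id_map.get(target, target)}"'
    return re.sub(r'href="#([^"]+)"', sub, text)

# One reference into conreffed content (remapped), one untouched.
doc = '<xref href="#intro/step1"/> <xref href="#other"/>'
fixed_doc = fix_external_refs(doc, {"intro/step1": "intro/step1-r42"})
```

Targets not present in the map are left alone, so the pass is safe to run over the whole topic.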
4,265
7,189,382,722
IssuesEvent
2018-02-02 13:51:15
Great-Hill-Corporation/quickBlocks
https://api.github.com/repos/Great-Hill-Corporation/quickBlocks
closed
blockScrape --check --deep
apps-blockScrape status-inprocess type-enhancement
**Consolidating notes from bug #31:** Currently the deep check only checks that each block has the correct list of transaction hashes. It should actually descend into the transactions and then into the receipts and then into the logs to check all data fields from QuickBlocks to the node. From Great-Hill-Corporation/ethslurp#140 Also--pick up useful code from getBlocks --check. This command should write a report at the end and keep stats such as XX items were marked as empty, but had transactions. YY items were marked as full but had zero transactions. And the --deep check should report things such as the mismatch between the quickBlocks block and the node's block.
1.0
blockScrape --check --deep - **Consolidating notes from bug #31:** Currently the deep check only checks that each block has the correct list of transaction hashes. It should actually descend into the transactions and then into the receipts and then into the logs to check all data fields from QuickBlocks to the node. From Great-Hill-Corporation/ethslurp#140 Also--pick up useful code from getBlocks --check. This command should write a report at the end and keep stats such as XX items were marked as empty, but had transactions. YY items were marked as full but had zero transactions. And the --deep check should report things such as the mismatch between the quickBlocks block and the node's block.
process
blockscrape check deep consolidating notes from bug currently the deep check only checks that each block has the correct list of transaction hashes it should actually descend into the transactions and then into the receipts and then into the logs to check all data fields from quickblocks to the node from great hill corporation ethslurp also pick up useful code from getblocks check this command should write a report at the end and keep stats such as xx items were marked as empty but had transactions yy items were marked as full but had zero transactions and the deep check should report things such as the mismatch between the quickblocks block and the node s block
1
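The --deep check sketched in the notes above, descending from blocks into transactions, receipts, and logs and reporting every field mismatch between the scraped data and the node, can be illustrated with a recursive comparison. Everything here is a toy model in Python; the real tool's data layout is assumed for illustration, not taken from this report:

```python
def deep_diff(scraped, node, path="block"):
    """Recursively compare two nested records, returning dotted paths
    where the scraped data disagrees with the node's data."""
    mismatches = []
    if isinstance(scraped, dict) and isinstance(node, dict):
        for key in sorted(set(scraped) | set(node)):
            mismatches += deep_diff(scraped.get(key), node.get(key), f"{path}.{key}")
    elif isinstance(scraped, list) and isinstance(node, list):
        if len(scraped) != len(node):
            mismatches.append(f"{path} (length {len(scraped)} vs {len(node)})")
        else:
            for i, (a, b) in enumerate(zip(scraped, node)):
                mismatches += deep_diff(a, b, f"{path}[{i}]")
    elif scraped != node:
        mismatches.append(path)
    return mismatches

# Toy records: the scraped copy is missing one log entry.
scraped = {"hash": "0xabc", "transactions": [{"receipt": {"logs": []}}]}
node = {"hash": "0xabc", "transactions": [{"receipt": {"logs": [{"topic": "0x1"}]}}]}
```

Tallying the returned paths would give the end-of-run stats the notes describe (items marked empty but holding transactions, and so on).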
9,333
12,340,717,091
IssuesEvent
2020-05-14 20:27:30
cypress-io/cypress
https://api.github.com/repos/cypress-io/cypress
closed
Click tests regarding `pointer-events:none` are duplicated
pkg/driver process: tests stage: needs review topic: cy.click 🖱
<!-- Is this a question? Questions WILL BE CLOSED. Ask in our chat https://on.cypress.io/chat --> ### Current behavior: Within `click_spec.js`, the tests for `pointer-events:none` are duplicated twice. Lines 885-961: https://github.com/cypress-io/cypress/blob/develop/packages/driver/test/cypress/integration/commands/actions/click_spec.js#L885-L961 Lines 963-1039: https://github.com/cypress-io/cypress/blob/develop/packages/driver/test/cypress/integration/commands/actions/click_spec.js#L963-L1039 ### Desired behavior: There is no need for these tests to be duplicated. ### Versions 4.5.0
1.0
Click tests regarding `pointer-events:none` are duplicated - <!-- Is this a question? Questions WILL BE CLOSED. Ask in our chat https://on.cypress.io/chat --> ### Current behavior: Within `click_spec.js`, the tests for `pointer-events:none` are duplicated twice. Lines 885-961: https://github.com/cypress-io/cypress/blob/develop/packages/driver/test/cypress/integration/commands/actions/click_spec.js#L885-L961 Lines 963-1039: https://github.com/cypress-io/cypress/blob/develop/packages/driver/test/cypress/integration/commands/actions/click_spec.js#L963-L1039 ### Desired behavior: There is no need for these tests to be duplicated. ### Versions 4.5.0
process
click tests regarding pointer events none are duplicated current behavior within click spec js the tests for pointer events none are duplicated twice lines lines desired behavior there is no need for these tests to be duplicated versions
1
13,236
15,706,356,956
IssuesEvent
2021-03-26 17:19:21
prisma/prisma
https://api.github.com/repos/prisma/prisma
closed
Issue when inserting/updating a float which value is 0.1
bug/2-confirmed kind/bug process/candidate team/client topic: floating point types
<!-- Thanks for helping us improve Prisma! 🙏 Please follow the sections in the template and provide as much information as possible about your problem, e.g. by setting the `DEBUG="*"` environment variable and enabling additional logging output in Prisma Client. Learn more about writing proper bug reports here: https://pris.ly/d/bug-reports --> ## Bug description <!-- A clear and concise description of what the bug is. --> It looks like Prisma is not handling the 0.1 value correctly when doing insertion or updates. ## How to reproduce Run a query inserting or updating a float: ```js prisma.myTable.create({ data: { someNumber: 0.1, }, }); ``` It will create a row where someNumber value is set to 0.0999999999999 instead of 0.1. _Note: Using a raw query works:_ ```js prisma.$queryRaw` INSERT INTO "prisma"."MyTable" ("someNumber") VALUES (${0.1}) `; ``` => someNumber will be set to 0.1. ## Expected behavior someNumber should be set to 0.1 instead of 0.0999999999999. ## Prisma information <!-- Your Prisma schema, Prisma Client queries, ... Do not include your database credentials when sharing your Prisma schema! --> ``` model MyTable { id String @id @default(dbgenerated()) someNumber Float? } ``` ## Environment & setup <!-- In which environment does the problem occur --> - OS: Mac OS - Database: PostgreSQL - Node.js version: v12.16.1 - Prisma version: v2.17.0
1.0
Issue when inserting/updating a float which value is 0.1 - <!-- Thanks for helping us improve Prisma! 🙏 Please follow the sections in the template and provide as much information as possible about your problem, e.g. by setting the `DEBUG="*"` environment variable and enabling additional logging output in Prisma Client. Learn more about writing proper bug reports here: https://pris.ly/d/bug-reports --> ## Bug description <!-- A clear and concise description of what the bug is. --> It looks like Prisma is not handling the 0.1 value correctly when doing insertion or updates. ## How to reproduce Run a query inserting or updating a float: ```js prisma.myTable.create({ data: { someNumber: 0.1, }, }); ``` It will create a row where someNumber value is set to 0.0999999999999 instead of 0.1. _Note: Using a raw query works:_ ```js prisma.$queryRaw` INSERT INTO "prisma"."MyTable" ("someNumber") VALUES (${0.1}) `; ``` => someNumber will be set to 0.1. ## Expected behavior someNumber should be set to 0.1 instead of 0.0999999999999. ## Prisma information <!-- Your Prisma schema, Prisma Client queries, ... Do not include your database credentials when sharing your Prisma schema! --> ``` model MyTable { id String @id @default(dbgenerated()) someNumber Float? } ``` ## Environment & setup <!-- In which environment does the problem occur --> - OS: Mac OS - Database: PostgreSQL - Node.js version: v12.16.1 - Prisma version: v2.17.0
process
issue when inserting updating a float which value is thanks for helping us improve prisma 🙏 please follow the sections in the template and provide as much information as possible about your problem e g by setting the debug environment variable and enabling additional logging output in prisma client learn more about writing proper bug reports here bug description it looks like prisma is not handling the value correctly when doing insertion or updates how to reproduce run a query inserting or updating a float js prisma mytable create data somenumber it will create a row where somenumber value is set to instead of note using a raw query works js prisma queryraw insert into prisma mytable somenumber values somenumber will be set to expected behavior somenumber should be set to instead of prisma information your prisma schema prisma client queries do not include your database credentials when sharing your prisma schema model mytable id string id default dbgenerated somenumber float environment setup os mac os database postgresql node js version prisma version
1
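The drift from 0.1 to 0.0999999999999 reported above is consistent with ordinary binary floating-point behavior: 0.1 has no exact base-2 representation, so any layer that round-trips the value through a binary float (unlike the raw query, which passes the literal through) can surface rounding. Prisma's actual serialization path is not shown in the report; the sketch below only demonstrates the representability issue:

```python
from decimal import Decimal

# 0.1 cannot be stored exactly as an IEEE-754 double: the value the
# double actually holds differs from the decimal literal "0.1".
stored = Decimal(0.1)      # exact decimal expansion of the nearest double
assert stored != Decimal("0.1")

# A classic symptom of the same rounding:
total = 0.1 + 0.2          # not exactly 0.3
assert total != 0.3
```

The exact digits that end up stored or displayed (0.0999999999999 versus something slightly above 0.1) depend on the column precision and formatting along the way, which the report does not show.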
167
2,586,290,367
IssuesEvent
2015-02-17 10:21:41
dita-ot/dita-ot
https://api.github.com/repos/dita-ot/dita-ot
closed
Remove keydef processing from XHTML rendering code
feature P2 preprocess
Currently keys are (for the most part) resolved in one preprocessing step. For linking elements, all is good. For phrase elements, it pulls in text when needed. The XHTML step then does extra processing that probably belongs inside the keyref process. Specifically, for non-linking elements that can't take `@href`, the XHTML code returns to the keydef.xml file to look up targets. This means we resolve the key twice - once when figuring out link text and setting up keydef.xml, then again when publishing XHTML (and any other format that wants to make a link). This is done for `<ph>`, `<keyword>`, `<cite>`, `<dt>`, `<term>`, and `<indexterm>` (when `$INDEXSHOW is active). Suggestion: when we evaluate these elements during the keyref preprocess, we have easy access to linking information. We should set something like `@dita-ot:href` that can be easily interpreted by any final rendering step. When I suggested this to @jelovirt he pointed out we would also want `@dita-ot:format` and `@dita-ot:scope` The one element I'm iffy about is the specialized `<abbreviated-form>` element. The source element is EMPTY so if we populate it in the preprocess, later stages that expect normal DITA may be thrown off. Also, we can't know in preprocess whether to use an abbreviation or a long form (that depends on the final rendering context). Potentially, we could put both the long and abbreviated form inside of `<dita-ot:info>` or similar, which is not so risky and more easily ignored by later steps?
1.0
Remove keydef processing from XHTML rendering code - Currently keys are (for the most part) resolved in one preprocessing step. For linking elements, all is good. For phrase elements, it pulls in text when needed. The XHTML step then does extra processing that probably belongs inside the keyref process. Specifically, for non-linking elements that can't take `@href`, the XHTML code returns to the keydef.xml file to look up targets. This means we resolve the key twice - once when figuring out link text and setting up keydef.xml, then again when publishing XHTML (and any other format that wants to make a link). This is done for `<ph>`, `<keyword>`, `<cite>`, `<dt>`, `<term>`, and `<indexterm>` (when `$INDEXSHOW is active). Suggestion: when we evaluate these elements during the keyref preprocess, we have easy access to linking information. We should set something like `@dita-ot:href` that can be easily interpreted by any final rendering step. When I suggested this to @jelovirt he pointed out we would also want `@dita-ot:format` and `@dita-ot:scope` The one element I'm iffy about is the specialized `<abbreviated-form>` element. The source element is EMPTY so if we populate it in the preprocess, later stages that expect normal DITA may be thrown off. Also, we can't know in preprocess whether to use an abbreviation or a long form (that depends on the final rendering context). Potentially, we could put both the long and abbreviated form inside of `<dita-ot:info>` or similar, which is not so risky and more easily ignored by later steps?
process
remove keydef processing from xhtml rendering code currently keys are for the most part resolved in one preprocessing step for linking elements all is good for phrase elements it pulls in text when needed the xhtml step then does extra processing that probably belongs inside the keyref process specifically for non linking elements that can t take href the xhtml code returns to the keydef xml file to look up targets this means we resolve the key twice once when figuring out link text and setting up keydef xml then again when publishing xhtml and any other format that wants to make a link this is done for and when indexshow is active suggestion when we evaluate these elements during the keyref preprocess we have easy access to linking information we should set something like dita ot href that can be easily interpreted by any final rendering step when i suggested this to jelovirt he pointed out we would also want dita ot format and dita ot scope the one element i m iffy about is the specialized element the source element is empty so if we populate it in the preprocess later stages that expect normal dita may be thrown off also we can t know in preprocess whether to use an abbreviation or a long form that depends on the final rendering context potentially we could put both the long and abbreviated form inside of or similar which is not so risky and more easily ignored by later steps
1
62,196
14,656,453,261
IssuesEvent
2020-12-28 13:27:38
fu1771695yongxie/svelte
https://api.github.com/repos/fu1771695yongxie/svelte
opened
CVE-2020-7598 (Medium) detected in minimist-0.0.8.tgz, minimist-1.2.0.tgz
security vulnerability
## CVE-2020-7598 - Medium Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Libraries - <b>minimist-0.0.8.tgz</b>, <b>minimist-1.2.0.tgz</b></p></summary> <p> <details><summary><b>minimist-0.0.8.tgz</b></p></summary> <p>parse argument options</p> <p>Library home page: <a href="https://registry.npmjs.org/minimist/-/minimist-0.0.8.tgz">https://registry.npmjs.org/minimist/-/minimist-0.0.8.tgz</a></p> <p>Path to dependency file: svelte/site/package.json</p> <p>Path to vulnerable library: svelte/site/node_modules/mkdirp/node_modules/minimist/package.json</p> <p> Dependency Hierarchy: - mocha-6.2.0.tgz (Root Library) - mkdirp-0.5.1.tgz - :x: **minimist-0.0.8.tgz** (Vulnerable Library) </details> <details><summary><b>minimist-1.2.0.tgz</b></p></summary> <p>parse argument options</p> <p>Library home page: <a href="https://registry.npmjs.org/minimist/-/minimist-1.2.0.tgz">https://registry.npmjs.org/minimist/-/minimist-1.2.0.tgz</a></p> <p>Path to dependency file: svelte/site/package.json</p> <p>Path to vulnerable library: svelte/site/node_modules/minimist/package.json</p> <p> Dependency Hierarchy: - core-7.6.0.tgz (Root Library) - json5-2.1.0.tgz - :x: **minimist-1.2.0.tgz** (Vulnerable Library) </details> <p>Found in HEAD commit: <a href="https://github.com/fu1771695yongxie/svelte/commit/3dbe53469a432eb9e35998b8093be96f2c64247e">3dbe53469a432eb9e35998b8093be96f2c64247e</a></p> <p>Found in base branch: <b>master</b></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary> <p> minimist before 1.2.2 could be tricked into adding or modifying properties of Object.prototype using a "constructor" or "__proto__" payload. 
<p>Publish Date: 2020-03-11 <p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-7598>CVE-2020-7598</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>5.6</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: High - Privileges Required: None - User Interaction: None - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: Low - Integrity Impact: Low - Availability Impact: Low </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://github.com/substack/minimist/commit/63e7ed05aa4b1889ec2f3b196426db4500cbda94">https://github.com/substack/minimist/commit/63e7ed05aa4b1889ec2f3b196426db4500cbda94</a></p> <p>Release Date: 2020-03-11</p> <p>Fix Resolution: minimist - 0.2.1,1.2.3</p> </p> </details> <p></p> *** Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
True
CVE-2020-7598 (Medium) detected in minimist-0.0.8.tgz, minimist-1.2.0.tgz - ## CVE-2020-7598 - Medium Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Libraries - <b>minimist-0.0.8.tgz</b>, <b>minimist-1.2.0.tgz</b></p></summary> <p> <details><summary><b>minimist-0.0.8.tgz</b></p></summary> <p>parse argument options</p> <p>Library home page: <a href="https://registry.npmjs.org/minimist/-/minimist-0.0.8.tgz">https://registry.npmjs.org/minimist/-/minimist-0.0.8.tgz</a></p> <p>Path to dependency file: svelte/site/package.json</p> <p>Path to vulnerable library: svelte/site/node_modules/mkdirp/node_modules/minimist/package.json</p> <p> Dependency Hierarchy: - mocha-6.2.0.tgz (Root Library) - mkdirp-0.5.1.tgz - :x: **minimist-0.0.8.tgz** (Vulnerable Library) </details> <details><summary><b>minimist-1.2.0.tgz</b></p></summary> <p>parse argument options</p> <p>Library home page: <a href="https://registry.npmjs.org/minimist/-/minimist-1.2.0.tgz">https://registry.npmjs.org/minimist/-/minimist-1.2.0.tgz</a></p> <p>Path to dependency file: svelte/site/package.json</p> <p>Path to vulnerable library: svelte/site/node_modules/minimist/package.json</p> <p> Dependency Hierarchy: - core-7.6.0.tgz (Root Library) - json5-2.1.0.tgz - :x: **minimist-1.2.0.tgz** (Vulnerable Library) </details> <p>Found in HEAD commit: <a href="https://github.com/fu1771695yongxie/svelte/commit/3dbe53469a432eb9e35998b8093be96f2c64247e">3dbe53469a432eb9e35998b8093be96f2c64247e</a></p> <p>Found in base branch: <b>master</b></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary> <p> minimist before 1.2.2 could be tricked into adding or modifying properties of Object.prototype using a "constructor" or "__proto__" payload. 
<p>Publish Date: 2020-03-11 <p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-7598>CVE-2020-7598</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>5.6</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: High - Privileges Required: None - User Interaction: None - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: Low - Integrity Impact: Low - Availability Impact: Low </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://github.com/substack/minimist/commit/63e7ed05aa4b1889ec2f3b196426db4500cbda94">https://github.com/substack/minimist/commit/63e7ed05aa4b1889ec2f3b196426db4500cbda94</a></p> <p>Release Date: 2020-03-11</p> <p>Fix Resolution: minimist - 0.2.1,1.2.3</p> </p> </details> <p></p> *** Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
non_process
cve medium detected in minimist tgz minimist tgz cve medium severity vulnerability vulnerable libraries minimist tgz minimist tgz minimist tgz parse argument options library home page a href path to dependency file svelte site package json path to vulnerable library svelte site node modules mkdirp node modules minimist package json dependency hierarchy mocha tgz root library mkdirp tgz x minimist tgz vulnerable library minimist tgz parse argument options library home page a href path to dependency file svelte site package json path to vulnerable library svelte site node modules minimist package json dependency hierarchy core tgz root library tgz x minimist tgz vulnerable library found in head commit a href found in base branch master vulnerability details minimist before could be tricked into adding or modifying properties of object prototype using a constructor or proto payload publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity high privileges required none user interaction none scope unchanged impact metrics confidentiality impact low integrity impact low availability impact low for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution minimist step up your open source security game with whitesource
0
84,963
15,728,374,429
IssuesEvent
2021-03-29 13:45:17
ssobue/oauth2-provider
https://api.github.com/repos/ssobue/oauth2-provider
closed
CVE-2020-36189 (High) detected in jackson-databind-2.9.9.jar
security vulnerability
## CVE-2020-36189 - High Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>jackson-databind-2.9.9.jar</b></p></summary> <p>General data-binding functionality for Jackson: works on core streaming API</p> <p>Library home page: <a href="http://github.com/FasterXML/jackson">http://github.com/FasterXML/jackson</a></p> <p>Path to dependency file: /oauth2-provider/pom.xml</p> <p>Path to vulnerable library: /root/.m2/repository/com/fasterxml/jackson/core/jackson-databind/2.9.9/jackson-databind-2.9.9.jar</p> <p> Dependency Hierarchy: - spring-boot-starter-web-2.1.5.RELEASE.jar (Root Library) - spring-boot-starter-json-2.1.5.RELEASE.jar - :x: **jackson-databind-2.9.9.jar** (Vulnerable Library) </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary> <p> FasterXML jackson-databind 2.x before 2.9.10.8 mishandles the interaction between serialization gadgets and typing, related to com.newrelic.agent.deps.ch.qos.logback.core.db.DriverManagerConnectionSource. <p>Publish Date: 2021-01-06 <p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-36189>CVE-2020-36189</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>8.1</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: High - Privileges Required: None - User Interaction: None - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: High - Integrity Impact: High - Availability Impact: High </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. 
</p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://github.com/FasterXML/jackson-databind/issues/2996">https://github.com/FasterXML/jackson-databind/issues/2996</a></p> <p>Release Date: 2021-01-06</p> <p>Fix Resolution: com.fasterxml.jackson.core:jackson-databind:2.9.10.8</p> </p> </details> <p></p> *** Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
True
CVE-2020-36189 (High) detected in jackson-databind-2.9.9.jar - ## CVE-2020-36189 - High Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>jackson-databind-2.9.9.jar</b></p></summary> <p>General data-binding functionality for Jackson: works on core streaming API</p> <p>Library home page: <a href="http://github.com/FasterXML/jackson">http://github.com/FasterXML/jackson</a></p> <p>Path to dependency file: /oauth2-provider/pom.xml</p> <p>Path to vulnerable library: /root/.m2/repository/com/fasterxml/jackson/core/jackson-databind/2.9.9/jackson-databind-2.9.9.jar</p> <p> Dependency Hierarchy: - spring-boot-starter-web-2.1.5.RELEASE.jar (Root Library) - spring-boot-starter-json-2.1.5.RELEASE.jar - :x: **jackson-databind-2.9.9.jar** (Vulnerable Library) </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary> <p> FasterXML jackson-databind 2.x before 2.9.10.8 mishandles the interaction between serialization gadgets and typing, related to com.newrelic.agent.deps.ch.qos.logback.core.db.DriverManagerConnectionSource. <p>Publish Date: 2021-01-06 <p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-36189>CVE-2020-36189</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>8.1</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: High - Privileges Required: None - User Interaction: None - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: High - Integrity Impact: High - Availability Impact: High </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. 
</p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://github.com/FasterXML/jackson-databind/issues/2996">https://github.com/FasterXML/jackson-databind/issues/2996</a></p> <p>Release Date: 2021-01-06</p> <p>Fix Resolution: com.fasterxml.jackson.core:jackson-databind:2.9.10.8</p> </p> </details> <p></p> *** Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
non_process
cve high detected in jackson databind jar cve high severity vulnerability vulnerable library jackson databind jar general data binding functionality for jackson works on core streaming api library home page a href path to dependency file provider pom xml path to vulnerable library root repository com fasterxml jackson core jackson databind jackson databind jar dependency hierarchy spring boot starter web release jar root library spring boot starter json release jar x jackson databind jar vulnerable library vulnerability details fasterxml jackson databind x before mishandles the interaction between serialization gadgets and typing related to com newrelic agent deps ch qos logback core db drivermanagerconnectionsource publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity high privileges required none user interaction none scope unchanged impact metrics confidentiality impact high integrity impact high availability impact high for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution com fasterxml jackson core jackson databind step up your open source security game with whitesource
0
122,284
10,218,497,367
IssuesEvent
2019-08-15 16:04:36
microsoft/PTVS
https://api.github.com/repos/microsoft/PTVS
closed
Tests are not removed from test explorer after test is renamed.
area:Legacy Test Adapter area:Test Adapter bug
Repro steps: 1. File, New Project, Python, Python Application 2. Add, New Item, Python Unit Test; make sure test1.py was added to the Python application 3. Open Test Explorer; test1 should be found there 4. Click test1.py and right-click to select "Copy" 5. Paste into the Python application and make sure test1-Copy.py is in the Python application 6. test1-Copy now appears in Test Explorer 7. Right-click test1-Copy and press "F2" to rename; the new test name is test2 Expected: The test in Test Explorer should be test2, and test1-Copy should be replaced by test2 Actual: test1-Copy is still in Test Explorer and test2 also exists in TE. **At this point, clicking "Run All" in TE raises an exception in the output window and "Run All" does not work: Not a valid Python module: C:\Users\vwazho\source\repos\PythonApplication1\PythonApplication1\test1 - Copy.py** ![test explorer](https://user-images.githubusercontent.com/39871640/61112047-8fcf1c00-a4bd-11e9-9c77-c687ae842df6.gif) ![image](https://user-images.githubusercontent.com/39871640/61114252-85fbe780-a4c2-11e9-80f0-15cfadb8dc90.png)
2.0
Tests are not removed from test explorer after test is renamed. - Repro steps: 1. File, New Project, Python, Python Application 2. Add, New Item, Python Unit Test; make sure test1.py was added to the Python application 3. Open Test Explorer; test1 should be found there 4. Click test1.py and right-click to select "Copy" 5. Paste into the Python application and make sure test1-Copy.py is in the Python application 6. test1-Copy now appears in Test Explorer 7. Right-click test1-Copy and press "F2" to rename; the new test name is test2 Expected: The test in Test Explorer should be test2, and test1-Copy should be replaced by test2 Actual: test1-Copy is still in Test Explorer and test2 also exists in TE. **At this point, clicking "Run All" in TE raises an exception in the output window and "Run All" does not work: Not a valid Python module: C:\Users\vwazho\source\repos\PythonApplication1\PythonApplication1\test1 - Copy.py** ![test explorer](https://user-images.githubusercontent.com/39871640/61112047-8fcf1c00-a4bd-11e9-9c77-c687ae842df6.gif) ![image](https://user-images.githubusercontent.com/39871640/61114252-85fbe780-a4c2-11e9-80f0-15cfadb8dc90.png)
non_process
tests are not removed from test explorer after test is renamed repro step file new project python python application add new item python unit test make sure py was added in python application open test explorer should be found in that click py and right click to select copy paste in python application and make sure copy py was in python application we will found copy was found in test explorer right click in copy and press to rename the new test name was expect the test in test explorer should be and copy was recovered by actual copy was still in test explorer and was also exist in te and at this time click run all in te an exception occurred in output window and run all does not work not a valid python module c users vwazho source repos copy py
0
19,016
3,412,221,342
IssuesEvent
2015-12-05 18:29:51
eloipuertas/ES2015A
https://api.github.com/repos/eloipuertas/ES2015A
opened
Implement Toon Shading
design Group A
**Description** * Add toon shading effect to the models **Outcome expected/acceptance criteria** * Integrate characters in the game with toon shading Estimated time effort: 3h
1.0
Implement Toon Shading - **Description** * Add toon shading effect to the models **Outcome expected/acceptance criteria** * Integrate characters in the game with toon shading Estimated time effort: 3h
non_process
implement toon shading description add toon shading effect to the models outcome expected acceptance criteria integrate characters in the game with toon shading estimated time effort
0
481,364
13,884,532,058
IssuesEvent
2020-10-18 16:31:52
yalla-coop/website-redesign
https://api.github.com/repos/yalla-coop/website-redesign
closed
Navbar width
bug priority 1
There is a problem with the navbar height and also the colors it doesn't work all the time https://www.loom.com/share/fc256d2cc5424a4e947e0354efdef0b0
1.0
Navbar width - There is a problem with the navbar height and also the colors it doesn't work all the time https://www.loom.com/share/fc256d2cc5424a4e947e0354efdef0b0
non_process
navbar width there is a problem with the navbar height and also the colors it doesn t work all the time
0
11,586
14,445,315,186
IssuesEvent
2020-12-07 22:44:35
panther-labs/panther
https://api.github.com/repos/panther-labs/panther
opened
Log Processor - Sophos optional field
bug team:data processing
### Describe the bug In a prior version of Sophos logs, the `datastream` field was not present. Panther marks this as a required field, meaning the system will not be able to process this older log format. This is particularly problematic when trying to backfill data, which can be important for security data like this where you might not know there was an incident worth investigating until months or years later. ### Steps to reproduce Steps to reproduce the behavior: 1. Onboard a sophos log source that does not specify the `datastream` field 2. Observe data fails to process ### Expected behavior Since a large amount of historical data may not have this field present, we should parse these logs anyways. ### Environment How are you deploying or using Panther? - Panther version 1.12.1 ### Additional context I think just making this field optional should be sufficient.
1.0
Log Processor - Sophos optional field - ### Describe the bug In a prior version of Sophos logs, the `datastream` field was not present. Panther marks this as a required field, meaning the system will not be able to process this older log format. This is particularly problematic when trying to backfill data, which can be important for security data like this where you might not know there was an incident worth investigating until months or years later. ### Steps to reproduce Steps to reproduce the behavior: 1. Onboard a sophos log source that does not specify the `datastream` field 2. Observe data fails to process ### Expected behavior Since a large amount of historical data may not have this field present, we should parse these logs anyways. ### Environment How are you deploying or using Panther? - Panther version 1.12.1 ### Additional context I think just making this field optional should be sufficient.
process
log processor sophos optional field describe the bug in a prior version of sophos logs the datastream field was not present panther marks this as a required field meaning the system will not be able to process this older log format this is particularly problematic when trying to backfill data which can be important for security data like this where you might not know there was an incident worth investigating until months or years later steps to reproduce steps to reproduce the behavior onboard a sophos log source that does not specify the datastream field observe data fails to process expected behavior since a large amount of historical data may not have this field present we should parse these logs anyways environment how are you deploying or using panther panther version additional context i think just making this field optional should be sufficient
1
17,394
23,210,034,528
IssuesEvent
2022-08-02 09:19:58
bazelbuild/bazel
https://api.github.com/repos/bazelbuild/bazel
closed
Sandboxing is ineffective against dangling symlink action inputs
type: support / not a bug (process) team-Local-Exec
### Description of the bug: Per internal discussion, the intended semantics of a symlink input is that the action should only have access to the symlink itself, not the file/directory it points to (unless explicitly included as an input). However, sandboxing is unable to prevent such an action from accessing the target of the symlink. Note that dangling symlinks are an intentionally supported feature, so I don't think we can fix this by outlawing the inclusion of a symlink in the inputs to an action unless accompanied by its target. ### What's the simplest, easiest way to reproduce this bug? Please provide a minimal example if possible. Minimal repro at https://github.com/tjgq/bazel-dangling-symlink. Issue exists both on Linux and MacOS; `--experimental_use_sandboxfs` doesn't make a difference. ### Which operating system are you running Bazel on? Linux, MacOS ### What is the output of `bazel info release`? development version ### If `bazel info release` returns `development version` or `(@non-git)`, tell us how you built Bazel. Bazel built near head @ d7eaa0b63fee7a2878a8cc78f780bda5f59e1c96. However, issue also exists in 5.2.0 (i.e., it's not a regression). ### What's the output of `git remote get-url origin; git rev-parse master; git rev-parse HEAD` ? ```text git@github.com:tjgq/bazel-dangling-symlink.git 7ff828a3fc348ddaa8114d434c735dc28cff6892 7ff828a3fc348ddaa8114d434c735dc28cff6892 ``` ### Have you found anything relevant by searching the web? _No response_ ### Any other information, logs, or outputs that you want to share? _No response_
1.0
Sandboxing is ineffective against dangling symlink action inputs - ### Description of the bug: Per internal discussion, the intended semantics of a symlink input is that the action should only have access to the symlink itself, not the file/directory it points to (unless explicitly included as an input). However, sandboxing is unable to prevent such an action from accessing the target of the symlink. Note that dangling symlinks are an intentionally supported feature, so I don't think we can fix this by outlawing the inclusion of a symlink in the inputs to an action unless accompanied by its target. ### What's the simplest, easiest way to reproduce this bug? Please provide a minimal example if possible. Minimal repro at https://github.com/tjgq/bazel-dangling-symlink. Issue exists both on Linux and MacOS; `--experimental_use_sandboxfs` doesn't make a difference. ### Which operating system are you running Bazel on? Linux, MacOS ### What is the output of `bazel info release`? development version ### If `bazel info release` returns `development version` or `(@non-git)`, tell us how you built Bazel. Bazel built near head @ d7eaa0b63fee7a2878a8cc78f780bda5f59e1c96. However, issue also exists in 5.2.0 (i.e., it's not a regression). ### What's the output of `git remote get-url origin; git rev-parse master; git rev-parse HEAD` ? ```text git@github.com:tjgq/bazel-dangling-symlink.git 7ff828a3fc348ddaa8114d434c735dc28cff6892 7ff828a3fc348ddaa8114d434c735dc28cff6892 ``` ### Have you found anything relevant by searching the web? _No response_ ### Any other information, logs, or outputs that you want to share? _No response_
process
sandboxing is ineffective against dangling symlink action inputs description of the bug per internal discussion the intended semantics of a symlink input is that the action should only have access to the symlink itself not the file directory it points to unless explicitly included as an input however sandboxing is unable to prevent such an action from accessing the target of the symlink note that dangling symlinks are an intentionally supported feature so i don t think we can fix this by outlawing the inclusion of a symlink in the inputs to an action unless accompanied by its target what s the simplest easiest way to reproduce this bug please provide a minimal example if possible minimal repro at issue exists both on linux and macos experimental use sandboxfs doesn t make a difference which operating system are you running bazel on linux macos what is the output of bazel info release development version if bazel info release returns development version or non git tell us how you built bazel bazel built near head however issue also exists in i e it s not a regression what s the output of git remote get url origin git rev parse master git rev parse head text git github com tjgq bazel dangling symlink git have you found anything relevant by searching the web no response any other information logs or outputs that you want to share no response
1
813,609
30,464,069,731
IssuesEvent
2023-07-17 09:07:15
Avaiga/taipy-gui
https://api.github.com/repos/Avaiga/taipy-gui
closed
Enhance organization of complex applications
✨New feature 🟨 Priority: Medium
## Faced issue Building complex applications requires manipulating a lot of Markdown and code surrounding it to create the logic behind it. The goal is to find an efficient way to organize an application between several pages and several sections of your pages. A page with a lot of functionality will be harder to understand and maintain if represented in a single Markdown file and with a long Python code associated with it. This issue aims to have ways or best practices for creating complex applications in a structured and intuitive manner. This would help further improve the readability, maintainability, and navigation of applications, making the development process even more seamless. ## Desired outcome: - Clear and intuitive guidance for organizing applications into pages and sections - Enhanced readability and maintainability of code and markdown - Improved navigation within the application structure **Acceptance Criteria** - [ ] Ensure new code is unit tested, and check code coverage is at least 90% - [ ] Propagate any change on the demos and run all of them to ensure there is no breaking change - [ ] Ensure any change is well documented
1.0
Enhance organization of complex applications - ## Faced issue Building complex applications requires manipulating a lot of Markdown and code surrounding it to create the logic behind it. The goal is to find an efficient way to organize an application between several pages and several sections of your pages. A page with a lot of functionality will be harder to understand and maintain if represented in a single Markdown file and with a long Python code associated with it. This issue aims to have ways or best practices for creating complex applications in a structured and intuitive manner. This would help further improve the readability, maintainability, and navigation of applications, making the development process even more seamless. ## Desired outcome: - Clear and intuitive guidance for organizing applications into pages and sections - Enhanced readability and maintainability of code and markdown - Improved navigation within the application structure **Acceptance Criteria** - [ ] Ensure new code is unit tested, and check code coverage is at least 90% - [ ] Propagate any change on the demos and run all of them to ensure there is no breaking change - [ ] Ensure any change is well documented
non_process
enhance organization of complex applications faced issue building complex applications requires manipulating a lot of markdown and code surrounding it to create the logic behind it the goal is to find an efficient way to organize an application between several pages and several sections of your pages a page with a lot of functionality will be harder to understand and maintain if represented in a single markdown file and with a long python code associated with it this issue aims to have ways or best practices for creating complex applications in a structured and intuitive manner this would help further improve the readability maintainability and navigation of applications making the development process even more seamless desired outcome clear and intuitive guidance for organizing applications into pages and sections enhanced readability and maintainability of code and markdown improved navigation within the application structure acceptance criteria ensure new code is unit tested and check code coverage is at least propagate any change on the demos and run all of them to ensure there is no breaking change ensure any change is well documented
0
1,587
4,178,174,653
IssuesEvent
2016-06-22 05:06:06
ryankeefe92/Episodes
https://api.github.com/repos/ryankeefe92/Episodes
opened
Fix for fabrice bellard error
download: error handling process:
- A problem file that causes this error is in the "fabrice bellard error" folder in episodes in RK apps, so you can use it for testing - See if it works if you try again later (99% sure trying again later won't make a difference, however) - delete problem file (and its torrent), repeat last rss search (URL still saved in a variable somewhere? If not set it so it is stored as a variable [global?]), and have it filter out the exact torrent url that downloaded the error file (exact torrent url saved as a variable too? If not, set it so it also is stored as a [global?] variable, so it can run the same rss url search, but filter out the url for the torrent that downloaded the problem file), and then from the remaining options, choose the highest quality option -- make sure that this is still a higher quality than anything that has been already added to iTunes for that episode. - When a working file is finally added to iTunes for this episode, make sure a tag is included in the comments that says no-replace or something, but only if the bad file was of a higher quality than the one being added
1.0
Fix for fabrice bellard error - - A problem file that causes this error is in the "fabrice bellard error" folder in episodes in RK apps, so you can use it for testing - See if it works if you try again later (99% sure trying again later won't make a difference, however) - delete problem file (and its torrent), repeat last rss search (URL still saved in a variable somewhere? If not set it so it is stored as a variable [global?]), and have it filter out the exact torrent url that downloaded the error file (exact torrent url saved as a variable too? If not, set it so it also is stored as a [global?] variable, so it can run the same rss url search, but filter out the url for the torrent that downloaded the problem file), and then from the remaining options, choose the highest quality option -- make sure that this is still a higher quality than anything that has been already added to iTunes for that episode. - When a working file is finally added to iTunes for this episode, make sure a tag is included in the comments that says no-replace or something, but only if the bad file was of a higher quality than the one being added
process
fix for fabrice bellard error a problem file that causes this error is in the fabrice bellard error folder in episodes in rk apps so you can use it for testing see if it works if you try again later sure trying again later won t make a difference however delete problem file and its torrent repeat last rss search url still saved in a variable somewhere if not set it so it is stored as a variable and have it filter out the exact torrent url that downloaded the error file exact torrent url saved as a variable too if not set it so it also is stored as a variable so it can run the same rss url search but filter out the url for the torrent that downloaded the problem file and then from the remaining options choose the highest quality option make sure that this is still a higher quality than anything that has been already added to itunes for that episode when a working file is finally added to itunes for this episode make sure a tag is included in the comments that says no replace or something but only if the bad file was of a higher quality than the one being added
1
274,291
20,830,533,063
IssuesEvent
2022-03-19 11:05:06
CalcagnoLoic/landing_page_groupe
https://api.github.com/repos/CalcagnoLoic/landing_page_groupe
opened
Project initialization
documentation
Hello team, So I initialized the working framework on the `developement` branch. In this branch, you have - [ ] the html file at the root already includes the `meta` tags as well as the `script` and `link` references - [ ] the assets folder with: 1. `font` for the typeface 2. `img` which contains the various mockup images 3. `css` which contains the SCSS file (don't forget to watch sass in vscode ;) ) 4. `js` which will contain our events for the hamburger menu of the mobile version Finally, you will find in the readme the link to the figma board as well as the 3 parts I selected. That leaves 6 parts :)
1.0
Project initialization - Hello team, So I initialized the working framework on the `developement` branch. In this branch, you have - [ ] the html file at the root already includes the `meta` tags as well as the `script` and `link` references - [ ] the assets folder with: 1. `font` for the typeface 2. `img` which contains the various mockup images 3. `css` which contains the SCSS file (don't forget to watch sass in vscode ;) ) 4. `js` which will contain our events for the hamburger menu of the mobile version Finally, you will find in the readme the link to the figma board as well as the 3 parts I selected. That leaves 6 parts :)
non_process
project initialization hello team so i initialized the working framework on the developement branch in this branch you have the html file at the root already includes the meta tags as well as the script and link references the assets folder with font for the typeface img which contains the various mockup images css which contains the scss file don t forget to watch sass in vscode js which will contain our events for the hamburger menu of the mobile version finally you will find in the readme the link to the figma board as well as the parts i selected that leaves parts
0
51,156
6,496,238,502
IssuesEvent
2017-08-22 09:17:29
python-trio/trio
https://api.github.com/repos/python-trio/trio
closed
High-level networking interface
design discussion missing piece
We should have something that's just a smidge higher level than `trio.socket`. This is something that could go into an external library, but: * I'd like to do TLS at this layer (so it's not bound to socket objects specifically), and in 2017 it really feels like TLS should be an included battery * Getting started with a simple echo server shouldn't require boilerplate or third-party extensions * I'm feeling the need for a basic stream abstraction in downstream libraries, and there are interoperability benefits to defining that in `trio` itself The goal wouldn't be to compete with like, all of netty or twisted or something, but just to provide some minimal extensible abstractions for endpoints, connecting, listening. Maybe UDP too, though that can wait. Things that should be easy: * Connecting to a given hostname+port, with ipv4/ipv6 taken care of (ideally with happy-eyeballs) * ...same as above, but with TLS enabled * Making a simple server that listens on both ipv4 and ipv6 * Making a simple server that listens on a unix domain port Maybe: * Plugging in some code to handle systemd socket activation * Plugging in some code to do something sensible with the [haproxy PROXY protocol](http://www.haproxy.org/download/1.5/doc/proxy-protocol.txt) A source of inspiration is twisted's "endpoints": [tutorial](https://twistedmatrix.com/documents/current/core/howto/endpoints.html), [list of client endpoints](https://twistedmatrix.com/documents/current/api/twisted.internet.interfaces.IStreamClientEndpoint.html), [list of server endpoints](https://twistedmatrix.com/documents/current/api/twisted.internet.interfaces.IStreamServerEndpoint.html), [a much fancier server endpoint](https://txacme.readthedocs.io/en/stable/). I'm not a big fan of the plugins + string parsing thing, just because of the whole injecting-arbitrary-code-into-people's-processes thing that plugins do ... 
though I see the advantage when it comes to exposing this stuff as part of a public interface and wanting it to be extensible there. We probably want: * A basic Stream API. There's a sketch in `trio._streams`. This is useful in its own right; the idea would be to implement TLS as a stream wrapper e.g., and we can use a `SendStream` and `RecvStream` for subprocess pipes, etc. * Some sort of connect and listen factories for streams. Lots more TBD; for now this is a placeholder to think more about. Related: #9 (tls support), #13 (more ergonomic server quick-start), #14 (batched accept), #72 (socket option defaults – some might make more sense to set as part of the high-level interface)
1.0
High-level networking interface - We should have something that's just a smidge higher level than `trio.socket`. This is something that could go into an external library, but: * I'd like to do TLS at this layer (so it's not bound to socket objects specifically), and in 2017 it really feels like TLS should be an included battery * Getting started with a simple echo server shouldn't require boilerplate or third-party extensions * I'm feeling the need for a basic stream abstraction in downstream libraries, and there are interoperability benefits to defining that in `trio` itself The goal wouldn't be to compete with like, all of netty or twisted or something, but just to provide some minimal extensible abstractions for endpoints, connecting, listening. Maybe UDP too, though that can wait. Things that should be easy: * Connecting to a given hostname+port, with ipv4/ipv6 taken care of (ideally with happy-eyeballs) * ...same as above, but with TLS enabled * Making a simple server that listens on both ipv4 and ipv6 * Making a simple server that listens on a unix domain port Maybe: * Plugging in some code to handle systemd socket activation * Plugging in some code to do something sensible with the [haproxy PROXY protocol](http://www.haproxy.org/download/1.5/doc/proxy-protocol.txt) A source of inspiration is twisted's "endpoints": [tutorial](https://twistedmatrix.com/documents/current/core/howto/endpoints.html), [list of client endpoints](https://twistedmatrix.com/documents/current/api/twisted.internet.interfaces.IStreamClientEndpoint.html), [list of server endpoints](https://twistedmatrix.com/documents/current/api/twisted.internet.interfaces.IStreamServerEndpoint.html), [a much fancier server endpoint](https://txacme.readthedocs.io/en/stable/). I'm not a big fan of the plugins + string parsing thing, just because of the whole injecting-arbitrary-code-into-people's-processes thing that plugins do ... 
though I see the advantage when it comes to exposing this stuff as part of a public interface and wanting it to be extensible there. We probably want: * A basic Stream API. There's a sketch in `trio._streams`. This is useful in its own right; the idea would be to implement TLS as a stream wrapper e.g., and we can use a `SendStream` and `RecvStream` for subprocess pipes, etc. * Some sort of connect and listen factories for streams. Lots more TBD; for now this is a placeholder to think more about. Related: #9 (tls support), #13 (more ergonomic server quick-start), #14 (batched accept), #72 (socket option defaults – some might make more sense to set as part of the high-level interface)
non_process
high level networking interface we should have something that s just a smidge higher level than trio socket this is something that could go into an external library but i d like to do tls at this layer so it s not bound to socket objects specifically and in it really feels like tls should be an included battery getting started with a simple echo server shouldn t require boilerplate or third party extensions i m feeling the need for a basic stream abstraction in downstream libraries and there are interoperability benefits to defining that in trio itself the goal wouldn t be to compete with like all of netty or twisted or something but just to provide some minimal extensible abstractions for endpoints connecting listening maybe udp too though that can wait things that should be easy connecting to a given hostname port with taken care of ideally with happy eyeballs same as above but with tls enabled making a simple server that listens on both and making a simple server that listens on a unix domain port maybe plugging in some code to handle systemd socket activation plugging in some code to do something sensible with the a source of inspiration is twisted s endpoints i m not a big fan of the plugins string parsing thing just because of the whole injecting arbitrary code into people s processes thing that plugins do though i see the advantage when it comes to exposing this stuff as part of a public interface and wanting it to be extensible there we probably want a basic stream api there s a sketch in trio streams this is useful in its own right the idea would be to implement tls as a stream wrapper e g and we can use a sendstream and recvstream for subprocess pipes etc some sort of connect and listen factories for streams lots more tbd for now this is a placeholder to think more about related tls support more ergonomic server quick start batched accept socket option defaults – some might make more sense to set as part of the high level interface
0
6,253
2,610,224,142
IssuesEvent
2015-02-26 19:11:04
chrsmith/somefinders
https://api.github.com/repos/chrsmith/somefinders
opened
Anvar G`aniev - Peshonamga sig`magan yorim
auto-migrated Priority-Medium Type-Defect
``` '''Arsen Zhdanov''' Hi everyone, can anyone tell me where to find .Anvar G`aniev - Peshonamga sig`magan yorim. it was posted here before '''Vintsent Chernov''' Download it here http://bit.ly/19rdaLI '''Aleksandr Sharov''' It asks you to enter a mobile number! Isn't that dangerous? '''Alvian Fomichyov''' No, it doesn't affect your balance '''Vsevolod Nosov''' Nah, everything's fine, nothing was charged to me File information: Anvar G`aniev - Peshonamga sig`magan yorim Uploaded: This month Downloads: 1154 Rating: 464 Average download speed: 444 Similar files: 32 ``` ----- Original issue reported on code.google.com by `kondense...@gmail.com` on 18 Dec 2013 at 9:12
1.0
Anvar G`aniev - Peshonamga sig`magan yorim - ``` '''Arsen Zhdanov''' Hi everyone, can anyone tell me where to find .Anvar G`aniev - Peshonamga sig`magan yorim. it was posted here before '''Vintsent Chernov''' Download it here http://bit.ly/19rdaLI '''Aleksandr Sharov''' It asks you to enter a mobile number! Isn't that dangerous? '''Alvian Fomichyov''' No, it doesn't affect your balance '''Vsevolod Nosov''' Nah, everything's fine, nothing was charged to me File information: Anvar G`aniev - Peshonamga sig`magan yorim Uploaded: This month Downloads: 1154 Rating: 464 Average download speed: 444 Similar files: 32 ``` ----- Original issue reported on code.google.com by `kondense...@gmail.com` on 18 Dec 2013 at 9:12
non_process
anvar g aniev peshonamga sig magan yorim arsen zhdanov hi everyone can anyone tell me where to find anvar g aniev peshonamga sig magan yorim it was posted here before vintsent chernov download it here aleksandr sharov it asks you to enter a mobile number isn t that dangerous alvian fomichyov no it doesn t affect your balance vsevolod nosov nah everything s fine nothing was charged to me file information anvar g aniev peshonamga sig magan yorim uploaded this month downloads rating average download speed similar files original issue reported on code google com by kondense gmail com on dec at
0
22,724
32,043,332,697
IssuesEvent
2023-09-22 21:35:08
hashgraph/hedera-mirror-node
https://api.github.com/repos/hashgraph/hedera-mirror-node
closed
Release Checklist 0.88
enhancement process
### Problem We need a checklist to verify the release is rolled out successfully. ### Solution ## Preparation - [x] Milestone field populated on relevant [issues](https://github.com/hashgraph/hedera-mirror-node/issues?q=is%3Aclosed+no%3Amilestone+sort%3Aupdated-desc) - [x] Nothing open for [milestone](https://github.com/hashgraph/hedera-mirror-node/issues?q=is%3Aopen+sort%3Aupdated-desc+milestone%3A0.88.0) - [x] GitHub checks for branch are passing - [x] No pre-release or snapshot dependencies present in build files - [x] Automated Kubernetes deployment successful - [x] Tag release - [x] Upload release artifacts - [x] Manual Submission for GCP Marketplace verification by google - [x] Publish marketplace release - [x] Publish release ## Performance - [x] Deployed - [x] gRPC API performance tests - [x] Importer performance tests - [x] REST API performance tests ## Previewnet - [x] Deployed ## Staging - [x] Deployed ## Testnet - [x] Deployed ## Mainnet - [x] Deployed to private - [x] Deployed to public ### Alternatives _No response_
1.0
Release Checklist 0.88 - ### Problem We need a checklist to verify the release is rolled out successfully. ### Solution ## Preparation - [x] Milestone field populated on relevant [issues](https://github.com/hashgraph/hedera-mirror-node/issues?q=is%3Aclosed+no%3Amilestone+sort%3Aupdated-desc) - [x] Nothing open for [milestone](https://github.com/hashgraph/hedera-mirror-node/issues?q=is%3Aopen+sort%3Aupdated-desc+milestone%3A0.88.0) - [x] GitHub checks for branch are passing - [x] No pre-release or snapshot dependencies present in build files - [x] Automated Kubernetes deployment successful - [x] Tag release - [x] Upload release artifacts - [x] Manual Submission for GCP Marketplace verification by google - [x] Publish marketplace release - [x] Publish release ## Performance - [x] Deployed - [x] gRPC API performance tests - [x] Importer performance tests - [x] REST API performance tests ## Previewnet - [x] Deployed ## Staging - [x] Deployed ## Testnet - [x] Deployed ## Mainnet - [x] Deployed to private - [x] Deployed to public ### Alternatives _No response_
process
release checklist problem we need a checklist to verify the release is rolled out successfully solution preparation milestone field populated on relevant nothing open for github checks for branch are passing no pre release or snapshot dependencies present in build files automated kubernetes deployment successful tag release upload release artifacts manual submission for gcp marketplace verification by google publish marketplace release publish release performance deployed grpc api performance tests importer performance tests rest api performance tests previewnet deployed staging deployed testnet deployed mainnet deployed to private deployed to public alternatives no response
1
11,482
14,354,049,560
IssuesEvent
2020-11-30 08:02:11
Ghost-chu/QuickShop-Reremake
https://api.github.com/repos/Ghost-chu/QuickShop-Reremake
closed
[BUG]
Bug In Process Need Investigate Priority:Major
**Describe the bug** So, in this plugin when you make a chest shop, the sign and anything about the shop just says "INVALED MESSAGE:" **To Reproduce** Steps to reproduce the behavior: Just create a chest shop **Expected behavior** Not to see a warning sign. **Screenshots** **Paste link:** Execute command /qs paste, you will get a link contains your server information, paste it under this text. You must create a paste, except plugin completely won't work. If you create failed, you should find a paste file under the plugin/QuickShop folder. - 【 ![image](https://user-images.githubusercontent.com/75197904/100536030-b422ba00-31d2-11eb-932b-80d4d3afea4a.png) 】 **Additional context** Add any other context about the problem here.
1.0
[BUG] - **Describe the bug** So, in this plugin when you make a chest shop, the sign and anything about the shop just says "INVALED MESSAGE:" **To Reproduce** Steps to reproduce the behavior: Just create a chest shop **Expected behavior** Not to see a warning sign. **Screenshots** **Paste link:** Execute command /qs paste, you will get a link contains your server information, paste it under this text. You must create a paste, except plugin completely won't work. If you create failed, you should find a paste file under the plugin/QuickShop folder. - 【 ![image](https://user-images.githubusercontent.com/75197904/100536030-b422ba00-31d2-11eb-932b-80d4d3afea4a.png) 】 **Additional context** Add any other context about the problem here.
process
describe the bug so in this plugin when you make a chest shop the sign and anything about the shop just says invaled message to reproduce steps to reproduce the behavior just create a chest shop expected behavior not to see a warning sign screenshots paste link execute command qs paste you will get a link contains your server information paste it under this text you must create a paste except plugin completely won t work if you create failed you should find a paste file under the plugin quickshop folder 【 】 additional context add any other context about the problem here
1
17,333
23,151,762,216
IssuesEvent
2022-07-29 09:01:05
Tencent/tdesign-miniprogram
https://api.github.com/repos/Tencent/tdesign-miniprogram
closed
[avatar-group] class syntax is incorrect
processing
### tdesign-miniprogram version ^0.17.0 ### Reproduction link _No response_ ### Reproduction steps <view class="{{classPrefix}}__collapse--slot {{collapseAvatar ? '{{prefix}}-is-hidden' : ''}}"> <slot name="collapseAvatar" /> </view> prefix is not recognized ### Expected result t-avatar-group__collapse--slot prefix t-is-hidden ### Actual result t-avatar-group__collapse--slot prefix -is-hidden ### Framework version Mini Program ### Browser version _No response_ ### OS version _No response_ ### Node version _No response_ ### Additional notes _No response_
1.0
[avatar-group] class syntax is incorrect - ### tdesign-miniprogram version ^0.17.0 ### Reproduction link _No response_ ### Reproduction steps <view class="{{classPrefix}}__collapse--slot {{collapseAvatar ? '{{prefix}}-is-hidden' : ''}}"> <slot name="collapseAvatar" /> </view> prefix is not recognized ### Expected result t-avatar-group__collapse--slot prefix t-is-hidden ### Actual result t-avatar-group__collapse--slot prefix -is-hidden ### Framework version Mini Program ### Browser version _No response_ ### OS version _No response_ ### Node version _No response_ ### Additional notes _No response_
process
class syntax is incorrect tdesign miniprogram version reproduction link no response reproduction steps prefix is not recognized expected result t avatar group collapse slot prefix t is hidden actual result t avatar group collapse slot prefix is hidden framework version mini program browser version no response os version no response node version no response additional notes no response
1
694,962
23,838,052,771
IssuesEvent
2022-09-06 07:57:37
ballerina-platform/ballerina-dev-website
https://api.github.com/repos/ballerina-platform/ballerina-dev-website
closed
Fix the rendering issue in tips of BBEs
Priority/Highest Type/Bug Area/BBEs Reason/EngineeringMistake
## Description Need to fix the rendering issue in tips of BBEs as shown below. For example, see [1]. <img width="1680" alt="Screenshot 2022-09-05 at 22 13 46" src="https://user-images.githubusercontent.com/11707273/188491103-bc27081e-6ad0-46c8-a0fc-84408bb58baa.png"> [1] https://pre-prod.ballerina.io/learn/by-example/grpc-bidirectional-streaming/ ## Steps to reproduce > The steps to be followed to reproduce the issue. ## Affected version(s) > The versions that are affected by the issue. ## Related website/documentation area > Add/Uncomment the relevant area label out of the following. Area/BBEs <!--Area/HomePageSamples--> <!--Area/LearnPages--> <!--Area/CommonPages--> <!--Area/Backend--> <!--Area/UIUX--> <!--Area/Workflows--> <!--Area/Blog--> ## Related issue(s) (optional) > Any related issues such as sub tasks and issues reported in other repositories (e.g., component repositories), similar problems, etc. ## Suggested label(s) (optional) > Optional comma-separated list of suggested labels. Non committers can’t assign labels to issues, and thereby, this will help issue creators who are not a committer to suggest possible labels. ## Suggested assignee(s) (optional) > Optional comma-separated list of suggested team members who should attend the issue. Non committers can’t assign issues to assignees, and thereby, this will help issue creators who are not a committer to suggest possible assignees.
1.0
Fix the rendering issue in tips of BBEs - ## Description Need to fix the rendering issue in tips of BBEs as shown below. For example, see [1]. <img width="1680" alt="Screenshot 2022-09-05 at 22 13 46" src="https://user-images.githubusercontent.com/11707273/188491103-bc27081e-6ad0-46c8-a0fc-84408bb58baa.png"> [1] https://pre-prod.ballerina.io/learn/by-example/grpc-bidirectional-streaming/ ## Steps to reproduce > The steps to be followed to reproduce the issue. ## Affected version(s) > The versions that are affected by the issue. ## Related website/documentation area > Add/Uncomment the relevant area label out of the following. Area/BBEs <!--Area/HomePageSamples--> <!--Area/LearnPages--> <!--Area/CommonPages--> <!--Area/Backend--> <!--Area/UIUX--> <!--Area/Workflows--> <!--Area/Blog--> ## Related issue(s) (optional) > Any related issues such as sub tasks and issues reported in other repositories (e.g., component repositories), similar problems, etc. ## Suggested label(s) (optional) > Optional comma-separated list of suggested labels. Non committers can’t assign labels to issues, and thereby, this will help issue creators who are not a committer to suggest possible labels. ## Suggested assignee(s) (optional) > Optional comma-separated list of suggested team members who should attend the issue. Non committers can’t assign issues to assignees, and thereby, this will help issue creators who are not a committer to suggest possible assignees.
non_process
fix the rendering issue in tips of bbes description need to fix the rendering issue in tips of bbes as shown below for example see img width alt screenshot at src steps to reproduce the steps to be followed to reproduce the issue affected version s the versions that are affected by the issue related website documentation area add uncomment the relevant area label out of the following area bbes related issue s optional any related issues such as sub tasks and issues reported in other repositories e g component repositories similar problems etc suggested label s optional optional comma separated list of suggested labels non committers can’t assign labels to issues and thereby this will help issue creators who are not a committer to suggest possible labels suggested assignee s optional optional comma separated list of suggested team members who should attend the issue non committers can’t assign issues to assignees and thereby this will help issue creators who are not a committer to suggest possible assignees
0
534,671
15,646,327,578
IssuesEvent
2021-03-23 00:38:48
vatSys/new-zealand-dataset
https://api.github.com/repos/vatSys/new-zealand-dataset
closed
New ADS-B Coverage Sites to be added
Priority: Next AIRAC airac
[Changes promulgated in AIP Supp. Bulletin 25 FEB 21](http://www.aip.net.nz/pdf/supplements/bulletin_25feb21.pdf) - [ ] **Rua o te Whenua** - [ ] **Te Weraiti** - [ ] **Ballance** - [ ] **Hawkings Hill** - [ ] **Mt Robertson** - [ ] **Christchurch Tower** - [ ] **Cass Peak** - [ ] **Cardrona** - [ ] **Coronet Peak** - [ ] **Mt Nicholas** - [ ] **Obelisk** [ENR 1.6 refers](http://www.aip.net.nz/pdf/ENR_1.6.pdf)
1.0
New ADS-B Coverage Sites to be added - [Changes promulgated in AIP Supp. Bulletin 25 FEB 21](http://www.aip.net.nz/pdf/supplements/bulletin_25feb21.pdf) - [ ] **Rua o te Whenua** - [ ] **Te Weraiti** - [ ] **Ballance** - [ ] **Hawkings Hill** - [ ] **Mt Robertson** - [ ] **Christchurch Tower** - [ ] **Cass Peak** - [ ] **Cardrona** - [ ] **Coronet Peak** - [ ] **Mt Nicholas** - [ ] **Obelisk** [ENR 1.6 refers](http://www.aip.net.nz/pdf/ENR_1.6.pdf)
non_process
new ads b coverage sites to be added rua o te whenua te weraiti ballance hawkings hill mt robertson christchurch tower cass peak cardrona coronet peak mt nicholas obelisk
0
8,927
12,034,864,210
IssuesEvent
2020-04-13 16:50:06
nanoframework/Home
https://api.github.com/repos/nanoframework/Home
closed
Array are only partially initialized
Area: Metadata Processor Status: FIXED Type: Bug
Arrays are only partially initialized if too many constants are specified. Sample Project: ```using System; using System.Threading; namespace NFApp1 { public class Program { public static void Main() { PixelFont7X9 _font1; Console.WriteLine("Hello world!"); _font1 = new PixelFont7X9(); // font 1 if (_font1._charmap[80] == 0) Console.WriteLine("Wrong, initialization incorrect!!!"); Thread.Sleep(Timeout.Infinite); } /// <summary> /// Represents a 7 X 9 (W x H) pixel font object to be used with nanoFramework display device drivers. /// </summary> public class PixelFont7X9 { public UInt32[] _charmap; public PixelFont7X9() { _charmap = new uint[] { 0x00000000, 0x00000000, 0x00000000, 0x00000000, 0x00000000, 0x00000000, 0x00000000, 0x00000000, 0x00000000, // ! 0x10000000, 0x10000000, 0x10000000, 0x10000000, 0x10000000, 0x10000000, 0x00000000, 0x10000000, 0x10000000, // " 0x28000000, 0x28000000, 0x28000000, 0x28000000, 0x00000000, 0x00000000, 0x00000000, 0x00000000, 0x00000000, // # 0x28000000, 0x28000000, 0x28000000, 0xFE000000, 0x28000000, 0xFE000000, 0x28000000, 0x28000000, 0x28000000, // $ 0x10000000, 0x38000000, 0x54000000, 0x50000000, 0x38000000, 0x14000000, 0x54000000, 0x38000000, 0x10000000, // % 0xC2000000, 0xC6000000, 0x0C000000, 0x18000000, 0x30000000, 0x60000000, 0xC0000000, 0x86000000, 0x06000000, // & 0x30000000, 0x48000000, 0x28000000, 0x10000000, 0x28000000, 0x46000000, 0x86000000, 0x88000000, 0x70000000, // ' 0x10000000, 0x10000000, 0x10000000, 0x00000000, 0x00000000, 0x00000000, 0x00000000, 0x00000000, 0x00000000, // ( 0x18000000, 0x20000000, 0x20000000, 0x20000000, 0x20000000, 0x20000000, 0x20000000, 0x20000000, 0x18000000 }; } } } } ```
1.0
Array are only partially initialized - Arrays are only partially initialized if too many constants are specified. Sample Project: ```using System; using System.Threading; namespace NFApp1 { public class Program { public static void Main() { PixelFont7X9 _font1; Console.WriteLine("Hello world!"); _font1 = new PixelFont7X9(); // font 1 if (_font1._charmap[80] == 0) Console.WriteLine("Wrong, initialization incorrect!!!"); Thread.Sleep(Timeout.Infinite); } /// <summary> /// Represents a 7 X 9 (W x H) pixel font object to be used with nanoFramework display device drivers. /// </summary> public class PixelFont7X9 { public UInt32[] _charmap; public PixelFont7X9() { _charmap = new uint[] { 0x00000000, 0x00000000, 0x00000000, 0x00000000, 0x00000000, 0x00000000, 0x00000000, 0x00000000, 0x00000000, // ! 0x10000000, 0x10000000, 0x10000000, 0x10000000, 0x10000000, 0x10000000, 0x00000000, 0x10000000, 0x10000000, // " 0x28000000, 0x28000000, 0x28000000, 0x28000000, 0x00000000, 0x00000000, 0x00000000, 0x00000000, 0x00000000, // # 0x28000000, 0x28000000, 0x28000000, 0xFE000000, 0x28000000, 0xFE000000, 0x28000000, 0x28000000, 0x28000000, // $ 0x10000000, 0x38000000, 0x54000000, 0x50000000, 0x38000000, 0x14000000, 0x54000000, 0x38000000, 0x10000000, // % 0xC2000000, 0xC6000000, 0x0C000000, 0x18000000, 0x30000000, 0x60000000, 0xC0000000, 0x86000000, 0x06000000, // & 0x30000000, 0x48000000, 0x28000000, 0x10000000, 0x28000000, 0x46000000, 0x86000000, 0x88000000, 0x70000000, // ' 0x10000000, 0x10000000, 0x10000000, 0x00000000, 0x00000000, 0x00000000, 0x00000000, 0x00000000, 0x00000000, // ( 0x18000000, 0x20000000, 0x20000000, 0x20000000, 0x20000000, 0x20000000, 0x20000000, 0x20000000, 0x18000000 }; } } } } ```
process
array are only partially initialized arrays are only partially initialized if too many constants are specified sample project using system using system threading namespace public class program public static void main console writeline hello world new font if charmap console writeline wrong initialization incorrect thread sleep timeout infinite represents a x w x h pixel font object to be used with nanoframework display device drivers public class public charmap public charmap new uint
1
36,182
12,403,296,396
IssuesEvent
2020-05-21 13:40:31
jgeraigery/FHIR
https://api.github.com/repos/jgeraigery/FHIR
opened
CVE-2020-7598 (Medium) detected in minimist-1.2.0.tgz, minimist-0.0.8.tgz
security vulnerability
## CVE-2020-7598 - Medium Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Libraries - <b>minimist-1.2.0.tgz</b>, <b>minimist-0.0.8.tgz</b></p></summary> <p> <details><summary><b>minimist-1.2.0.tgz</b></p></summary> <p>parse argument options</p> <p>Library home page: <a href="https://registry.npmjs.org/minimist/-/minimist-1.2.0.tgz">https://registry.npmjs.org/minimist/-/minimist-1.2.0.tgz</a></p> <p>Path to dependency file: /tmp/ws-scm/FHIR/docs/package.json</p> <p>Path to vulnerable library: /tmp/ws-scm/FHIR/docs/node_modules/minimist/package.json</p> <p> Dependency Hierarchy: - gatsby-2.13.39.tgz (Root Library) - chokidar-2.1.2.tgz - fsevents-1.2.9.tgz - node-pre-gyp-0.12.0.tgz - rc-1.2.8.tgz - :x: **minimist-1.2.0.tgz** (Vulnerable Library) </details> <details><summary><b>minimist-0.0.8.tgz</b></p></summary> <p>parse argument options</p> <p>Library home page: <a href="https://registry.npmjs.org/minimist/-/minimist-0.0.8.tgz">https://registry.npmjs.org/minimist/-/minimist-0.0.8.tgz</a></p> <p>Path to dependency file: /tmp/ws-scm/FHIR/docs/package.json</p> <p>Path to vulnerable library: /tmp/ws-scm/FHIR/docs/node_modules/mkdirp/node_modules/minimist/package.json</p> <p> Dependency Hierarchy: - gatsby-2.13.39.tgz (Root Library) - mkdirp-0.5.1.tgz - :x: **minimist-0.0.8.tgz** (Vulnerable Library) </details> <p>Found in HEAD commit: <a href="https://github.com/jgeraigery/FHIR/commit/3cf9071d6b901e3ec4560a587a8813f5218a7fa4">3cf9071d6b901e3ec4560a587a8813f5218a7fa4</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary> <p> minimist before 1.2.2 could be tricked into adding or modifying properties of Object.prototype using a "constructor" or "__proto__" payload. 
<p>Publish Date: 2020-03-11 <p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-7598>CVE-2020-7598</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>5.6</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: High - Privileges Required: None - User Interaction: None - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: Low - Integrity Impact: Low - Availability Impact: Low </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://github.com/substack/minimist/commit/63e7ed05aa4b1889ec2f3b196426db4500cbda94">https://github.com/substack/minimist/commit/63e7ed05aa4b1889ec2f3b196426db4500cbda94</a></p> <p>Release Date: 2020-03-11</p> <p>Fix Resolution: minimist - 0.2.1,1.2.3</p> </p> </details> <p></p> <!-- <REMEDIATE>{"isOpenPROnVulnerability":false,"isPackageBased":true,"isDefaultBranch":true,"packages":[{"packageType":"javascript/Node.js","packageName":"minimist","packageVersion":"1.2.0","isTransitiveDependency":true,"dependencyTree":"gatsby:2.13.39;chokidar:2.1.2;fsevents:1.2.9;node-pre-gyp:0.12.0;rc:1.2.8;minimist:1.2.0","isMinimumFixVersionAvailable":true,"minimumFixVersion":"minimist - 0.2.1,1.2.3"},{"packageType":"javascript/Node.js","packageName":"minimist","packageVersion":"0.0.8","isTransitiveDependency":true,"dependencyTree":"gatsby:2.13.39;mkdirp:0.5.1;minimist:0.0.8","isMinimumFixVersionAvailable":true,"minimumFixVersion":"minimist - 0.2.1,1.2.3"}],"vulnerabilityIdentifier":"CVE-2020-7598","vulnerabilityDetails":"minimist before 1.2.2 could be tricked into adding or 
modifying properties of Object.prototype using a \"constructor\" or \"__proto__\" payload.","vulnerabilityUrl":"https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-7598","cvss3Severity":"medium","cvss3Score":"5.6","cvss3Metrics":{"A":"Low","AC":"High","PR":"None","S":"Unchanged","C":"Low","UI":"None","AV":"Network","I":"Low"},"extraData":{}}</REMEDIATE> -->
True
CVE-2020-7598 (Medium) detected in minimist-1.2.0.tgz, minimist-0.0.8.tgz - ## CVE-2020-7598 - Medium Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Libraries - <b>minimist-1.2.0.tgz</b>, <b>minimist-0.0.8.tgz</b></p></summary> <p> <details><summary><b>minimist-1.2.0.tgz</b></p></summary> <p>parse argument options</p> <p>Library home page: <a href="https://registry.npmjs.org/minimist/-/minimist-1.2.0.tgz">https://registry.npmjs.org/minimist/-/minimist-1.2.0.tgz</a></p> <p>Path to dependency file: /tmp/ws-scm/FHIR/docs/package.json</p> <p>Path to vulnerable library: /tmp/ws-scm/FHIR/docs/node_modules/minimist/package.json</p> <p> Dependency Hierarchy: - gatsby-2.13.39.tgz (Root Library) - chokidar-2.1.2.tgz - fsevents-1.2.9.tgz - node-pre-gyp-0.12.0.tgz - rc-1.2.8.tgz - :x: **minimist-1.2.0.tgz** (Vulnerable Library) </details> <details><summary><b>minimist-0.0.8.tgz</b></p></summary> <p>parse argument options</p> <p>Library home page: <a href="https://registry.npmjs.org/minimist/-/minimist-0.0.8.tgz">https://registry.npmjs.org/minimist/-/minimist-0.0.8.tgz</a></p> <p>Path to dependency file: /tmp/ws-scm/FHIR/docs/package.json</p> <p>Path to vulnerable library: /tmp/ws-scm/FHIR/docs/node_modules/mkdirp/node_modules/minimist/package.json</p> <p> Dependency Hierarchy: - gatsby-2.13.39.tgz (Root Library) - mkdirp-0.5.1.tgz - :x: **minimist-0.0.8.tgz** (Vulnerable Library) </details> <p>Found in HEAD commit: <a href="https://github.com/jgeraigery/FHIR/commit/3cf9071d6b901e3ec4560a587a8813f5218a7fa4">3cf9071d6b901e3ec4560a587a8813f5218a7fa4</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary> <p> minimist before 1.2.2 could be tricked into adding or modifying properties of Object.prototype using a "constructor" or "__proto__" 
payload. <p>Publish Date: 2020-03-11 <p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-7598>CVE-2020-7598</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>5.6</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: High - Privileges Required: None - User Interaction: None - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: Low - Integrity Impact: Low - Availability Impact: Low </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://github.com/substack/minimist/commit/63e7ed05aa4b1889ec2f3b196426db4500cbda94">https://github.com/substack/minimist/commit/63e7ed05aa4b1889ec2f3b196426db4500cbda94</a></p> <p>Release Date: 2020-03-11</p> <p>Fix Resolution: minimist - 0.2.1,1.2.3</p> </p> </details> <p></p> <!-- <REMEDIATE>{"isOpenPROnVulnerability":false,"isPackageBased":true,"isDefaultBranch":true,"packages":[{"packageType":"javascript/Node.js","packageName":"minimist","packageVersion":"1.2.0","isTransitiveDependency":true,"dependencyTree":"gatsby:2.13.39;chokidar:2.1.2;fsevents:1.2.9;node-pre-gyp:0.12.0;rc:1.2.8;minimist:1.2.0","isMinimumFixVersionAvailable":true,"minimumFixVersion":"minimist - 0.2.1,1.2.3"},{"packageType":"javascript/Node.js","packageName":"minimist","packageVersion":"0.0.8","isTransitiveDependency":true,"dependencyTree":"gatsby:2.13.39;mkdirp:0.5.1;minimist:0.0.8","isMinimumFixVersionAvailable":true,"minimumFixVersion":"minimist - 0.2.1,1.2.3"}],"vulnerabilityIdentifier":"CVE-2020-7598","vulnerabilityDetails":"minimist before 1.2.2 could be tricked into adding or 
modifying properties of Object.prototype using a \"constructor\" or \"__proto__\" payload.","vulnerabilityUrl":"https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-7598","cvss3Severity":"medium","cvss3Score":"5.6","cvss3Metrics":{"A":"Low","AC":"High","PR":"None","S":"Unchanged","C":"Low","UI":"None","AV":"Network","I":"Low"},"extraData":{}}</REMEDIATE> -->
non_process
cve medium detected in minimist tgz minimist tgz cve medium severity vulnerability vulnerable libraries minimist tgz minimist tgz minimist tgz parse argument options library home page a href path to dependency file tmp ws scm fhir docs package json path to vulnerable library tmp ws scm fhir docs node modules minimist package json dependency hierarchy gatsby tgz root library chokidar tgz fsevents tgz node pre gyp tgz rc tgz x minimist tgz vulnerable library minimist tgz parse argument options library home page a href path to dependency file tmp ws scm fhir docs package json path to vulnerable library tmp ws scm fhir docs node modules mkdirp node modules minimist package json dependency hierarchy gatsby tgz root library mkdirp tgz x minimist tgz vulnerable library found in head commit a href vulnerability details minimist before could be tricked into adding or modifying properties of object prototype using a constructor or proto payload publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity high privileges required none user interaction none scope unchanged impact metrics confidentiality impact low integrity impact low availability impact low for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution minimist isopenpronvulnerability false ispackagebased true isdefaultbranch true packages vulnerabilityidentifier cve vulnerabilitydetails minimist before could be tricked into adding or modifying properties of object prototype using a constructor or proto payload vulnerabilityurl
0
329,349
10,015,048,054
IssuesEvent
2019-07-15 19:04:30
firelab/windninja
https://api.github.com/repos/firelab/windninja
opened
Segfault in point initialization with custom wx station file
component:point priority:high severity:segfault
Necessary files to reproduce are available [here](weather.firelab.org/ninja/windninja). From the user: I write because I am failing in running an own ‘station’ example (attatched .tif and .csv file). I am running WindNinja under ‘wine’ and I have excecuted all the Tutorial examples without any problem. You will notice that the attatched files have a ficticious name. I decided to use the ‘missoula_valley’ case as a template to avoid files/folders naming troubles with WindNinja. I put the .csv file inside a folder called ‘WXSTATIONS-2018-06-25-1237-missoula_valley’. Can you run this example? What I am doing wrong? [station_error.txt](https://github.com/firelab/windninja/files/3393774/station_error.txt)
1.0
Segfault in point initialization with custom wx station file - Necessary files to reproduce are available [here](weather.firelab.org/ninja/windninja). From the user: I write because I am failing in running an own ‘station’ example (attatched .tif and .csv file). I am running WindNinja under ‘wine’ and I have excecuted all the Tutorial examples without any problem. You will notice that the attatched files have a ficticious name. I decided to use the ‘missoula_valley’ case as a template to avoid files/folders naming troubles with WindNinja. I put the .csv file inside a folder called ‘WXSTATIONS-2018-06-25-1237-missoula_valley’. Can you run this example? What I am doing wrong? [station_error.txt](https://github.com/firelab/windninja/files/3393774/station_error.txt)
non_process
segfault in point initialization with custom wx station file necessary files to reproduce are available weather firelab org ninja windninja from the user i write because i am failing in running an own ‘station’ example attatched tif and csv file i am running windninja under ‘wine’ and i have excecuted all the tutorial examples without any problem you will notice that the attatched files have a ficticious name i decided to use the ‘missoula valley’ case as a template to avoid files folders naming troubles with windninja i put the csv file inside a folder called ‘wxstations missoula valley’ can you run this example what i am doing wrong
0
3,470
2,765,577,794
IssuesEvent
2015-04-29 21:18:52
OpenInternet/SAFETAG
https://api.github.com/repos/OpenInternet/SAFETAG
closed
drop materials needed, merge any existing into prep
documentation
Materials needed is essentially the intro to prep anyway
1.0
drop materials needed, merge any existing into prep - Materials needed is essentially the intro to prep anyway
non_process
drop materials needed merge any existing into prep materials needed is essentially the intro to prep anyway
0
235,860
19,432,388,033
IssuesEvent
2021-12-21 13:30:22
getsentry/sentry-cocoa
https://api.github.com/repos/getsentry/sentry-cocoa
closed
Failing Web Integration Tests
Type: Testing
`AuthenticationInterceptorTestCase.testThatInterceptorThrowsMissingCredentialErrorWhenCredentialIsNilAndRequestShouldBeRetried()` fails around 30% of the time. ``` Alamofire_iOS_Tests.AuthenticationInterceptorTestCase testThatInterceptorThrowsMissingCredentialErrorWhenCredentialIsNilAndRequestShouldBeRetried, XCTAssertEqual failed: ("Optional("***")") is not equal to ("Optional("***")") /Users/runner/work/sentry-cocoa/sentry-cocoa/Tests/AuthenticationInterceptorTests.swift:387 // Then XCTAssertEqual(response?.request?.headers["Authorization"], "***") testThatInterceptorThrowsMissingCredentialErrorWhenCredentialIsNilAndRequestShouldBeRetried, XCTAssertEqual failed: ("2") is not equal to ("1") /Users/runner/work/sentry-cocoa/sentry-cocoa/Tests/AuthenticationInterceptorTests.swift:398 XCTAssertEqual(authenticator.applyCount, 1) XCTAssertEqual(authenticator.refreshCount, 0) testThatInterceptorThrowsMissingCredentialErrorWhenCredentialIsNilAndRequestShouldBeRetried, XCTAssertEqual failed: ("1") is not equal to ("0") /Users/runner/work/sentry-cocoa/sentry-cocoa/Tests/AuthenticationInterceptorTests.swift:399 XCTAssertEqual(authenticator.applyCount, 1) XCTAssertEqual(authenticator.refreshCount, 0) XCTAssertEqual(authenticator.didRequestFailDueToAuthErrorCount, 1) testThatInterceptorThrowsMissingCredentialErrorWhenCredentialIsNilAndRequestShouldBeRetried, XCTAssertEqual failed: ("2") is not equal to ("1") /Users/runner/work/sentry-cocoa/sentry-cocoa/Tests/AuthenticationInterceptorTests.swift:400 XCTAssertEqual(authenticator.refreshCount, 0) XCTAssertEqual(authenticator.didRequestFailDueToAuthErrorCount, 1) XCTAssertEqual(authenticator.isRequestAuthenticatedWithCredentialCount, 0) testThatInterceptorThrowsMissingCredentialErrorWhenCredentialIsNilAndRequestShouldBeRetried, XCTAssertEqual failed: ("1") is not equal to ("0") /Users/runner/work/sentry-cocoa/sentry-cocoa/Tests/AuthenticationInterceptorTests.swift:401 XCTAssertEqual(authenticator.didRequestFailDueToAuthErrorCount, 
1) XCTAssertEqual(authenticator.isRequestAuthenticatedWithCredentialCount, 0) testThatInterceptorThrowsMissingCredentialErrorWhenCredentialIsNilAndRequestShouldBeRetried, XCTAssertEqual failed: ("1") is not equal to ("0") /Users/runner/work/sentry-cocoa/sentry-cocoa/Tests/AuthenticationInterceptorTests.swift:403 XCTAssertEqual(request.retryCount, 0) } ```
1.0
Failing Web Integration Tests - `AuthenticationInterceptorTestCase.testThatInterceptorThrowsMissingCredentialErrorWhenCredentialIsNilAndRequestShouldBeRetried()` fails around 30% of the time. ``` Alamofire_iOS_Tests.AuthenticationInterceptorTestCase testThatInterceptorThrowsMissingCredentialErrorWhenCredentialIsNilAndRequestShouldBeRetried, XCTAssertEqual failed: ("Optional("***")") is not equal to ("Optional("***")") /Users/runner/work/sentry-cocoa/sentry-cocoa/Tests/AuthenticationInterceptorTests.swift:387 // Then XCTAssertEqual(response?.request?.headers["Authorization"], "***") testThatInterceptorThrowsMissingCredentialErrorWhenCredentialIsNilAndRequestShouldBeRetried, XCTAssertEqual failed: ("2") is not equal to ("1") /Users/runner/work/sentry-cocoa/sentry-cocoa/Tests/AuthenticationInterceptorTests.swift:398 XCTAssertEqual(authenticator.applyCount, 1) XCTAssertEqual(authenticator.refreshCount, 0) testThatInterceptorThrowsMissingCredentialErrorWhenCredentialIsNilAndRequestShouldBeRetried, XCTAssertEqual failed: ("1") is not equal to ("0") /Users/runner/work/sentry-cocoa/sentry-cocoa/Tests/AuthenticationInterceptorTests.swift:399 XCTAssertEqual(authenticator.applyCount, 1) XCTAssertEqual(authenticator.refreshCount, 0) XCTAssertEqual(authenticator.didRequestFailDueToAuthErrorCount, 1) testThatInterceptorThrowsMissingCredentialErrorWhenCredentialIsNilAndRequestShouldBeRetried, XCTAssertEqual failed: ("2") is not equal to ("1") /Users/runner/work/sentry-cocoa/sentry-cocoa/Tests/AuthenticationInterceptorTests.swift:400 XCTAssertEqual(authenticator.refreshCount, 0) XCTAssertEqual(authenticator.didRequestFailDueToAuthErrorCount, 1) XCTAssertEqual(authenticator.isRequestAuthenticatedWithCredentialCount, 0) testThatInterceptorThrowsMissingCredentialErrorWhenCredentialIsNilAndRequestShouldBeRetried, XCTAssertEqual failed: ("1") is not equal to ("0") /Users/runner/work/sentry-cocoa/sentry-cocoa/Tests/AuthenticationInterceptorTests.swift:401 
XCTAssertEqual(authenticator.didRequestFailDueToAuthErrorCount, 1) XCTAssertEqual(authenticator.isRequestAuthenticatedWithCredentialCount, 0) testThatInterceptorThrowsMissingCredentialErrorWhenCredentialIsNilAndRequestShouldBeRetried, XCTAssertEqual failed: ("1") is not equal to ("0") /Users/runner/work/sentry-cocoa/sentry-cocoa/Tests/AuthenticationInterceptorTests.swift:403 XCTAssertEqual(request.retryCount, 0) } ```
non_process
failing web integration tests authenticationinterceptortestcase testthatinterceptorthrowsmissingcredentialerrorwhencredentialisnilandrequestshouldberetried fails around of the time alamofire ios tests authenticationinterceptortestcase testthatinterceptorthrowsmissingcredentialerrorwhencredentialisnilandrequestshouldberetried xctassertequal failed optional is not equal to optional users runner work sentry cocoa sentry cocoa tests authenticationinterceptortests swift then xctassertequal response request headers testthatinterceptorthrowsmissingcredentialerrorwhencredentialisnilandrequestshouldberetried xctassertequal failed is not equal to users runner work sentry cocoa sentry cocoa tests authenticationinterceptortests swift xctassertequal authenticator applycount xctassertequal authenticator refreshcount testthatinterceptorthrowsmissingcredentialerrorwhencredentialisnilandrequestshouldberetried xctassertequal failed is not equal to users runner work sentry cocoa sentry cocoa tests authenticationinterceptortests swift xctassertequal authenticator applycount xctassertequal authenticator refreshcount xctassertequal authenticator didrequestfailduetoautherrorcount testthatinterceptorthrowsmissingcredentialerrorwhencredentialisnilandrequestshouldberetried xctassertequal failed is not equal to users runner work sentry cocoa sentry cocoa tests authenticationinterceptortests swift xctassertequal authenticator refreshcount xctassertequal authenticator didrequestfailduetoautherrorcount xctassertequal authenticator isrequestauthenticatedwithcredentialcount testthatinterceptorthrowsmissingcredentialerrorwhencredentialisnilandrequestshouldberetried xctassertequal failed is not equal to users runner work sentry cocoa sentry cocoa tests authenticationinterceptortests swift xctassertequal authenticator didrequestfailduetoautherrorcount xctassertequal authenticator isrequestauthenticatedwithcredentialcount 
testthatinterceptorthrowsmissingcredentialerrorwhencredentialisnilandrequestshouldberetried xctassertequal failed is not equal to users runner work sentry cocoa sentry cocoa tests authenticationinterceptortests swift xctassertequal request retrycount
0
125,310
4,955,398,903
IssuesEvent
2016-12-01 20:17:56
yairodriguez/AngularJS
https://api.github.com/repos/yairodriguez/AngularJS
opened
Bootstrapping
[priority] low [status] accepted [type] feature
### Description Finish the implementation of AngularJS, which at this point, should have all the core features the Angular itself has. Everything about _change detection_, _expressions_, _dependency injection_, _directives_, _controllers_, _transclusion_, _templates_, _interpolation_, _promises_, and `$http`. --- ### Issue Checklist - [ ] How the `ngClick` directive works. - [ ] What the manual bootstrap process entails: setting up an injector, compiling and linking the DOM, and running a digest. - [ ] How automatic bootstrapping finds the `ng-app` DOM element automatically and then runs the bootstrap function automatically. - [ ] How production bundles of the framework can be generated. - [ ] How everything ties together and how all the code we’ve written supports actual “Angular lite” applications. --- ### Assignees - [ ] Final assign @yairodriguez
1.0
Bootstrapping - ### Description Finish the implementation of AngularJS, which at this point, should have all the core features the Angular itself has. Everything about _change detection_, _expressions_, _dependency injection_, _directives_, _controllers_, _transclusion_, _templates_, _interpolation_, _promises_, and `$http`. --- ### Issue Checklist - [ ] How the `ngClick` directive works. - [ ] What the manual bootstrap process entails: setting up an injector, compiling and linking the DOM, and running a digest. - [ ] How automatic bootstrapping finds the `ng-app` DOM element automatically and then runs the bootstrap function automatically. - [ ] How production bundles of the framework can be generated. - [ ] How everything ties together and how all the code we’ve written supports actual “Angular lite” applications. --- ### Assignees - [ ] Final assign @yairodriguez
non_process
bootstrapping description finish the implementation of angularjs which at this point should have all the core features the angular itself has everything about change detection expressions dependency injection directives controllers transclusion templates interpolation promises and http issue checklist how the ngclick directive works what the manual bootstrap process entails setting up an injector compiling and linking the dom and running a digest how automatic bootstrapping finds the ng app dom element automatically and then runs the bootstrap function automatically how production bundles of the framework can be generated how everything ties together and how all the code we’ve written supports actual “angular lite” applications assignees final assign yairodriguez
0
159,373
24,983,463,474
IssuesEvent
2022-11-02 13:30:55
ta-mu-aa/workout-share
https://api.github.com/repos/ta-mu-aa/workout-share
closed
投稿用APIとの繋ぎ込みを行い投稿ができるようにする
feature Front design-layout
## 概要 投稿用APIとの繋ぎ込みを行い投稿ができるようにする。 投稿用のフォームは作成しているため投稿ボタンを押下することで投稿できるようにする 削除と更新のボタンに関してはまだないためこのイシューで対応する。
1.0
投稿用APIとの繋ぎ込みを行い投稿ができるようにする - ## 概要 投稿用APIとの繋ぎ込みを行い投稿ができるようにする。 投稿用のフォームは作成しているため投稿ボタンを押下することで投稿できるようにする 削除と更新のボタンに関してはまだないためこのイシューで対応する。
non_process
投稿用apiとの繋ぎ込みを行い投稿ができるようにする 概要 投稿用apiとの繋ぎ込みを行い投稿ができるようにする。 投稿用のフォームは作成しているため投稿ボタンを押下することで投稿できるようにする 削除と更新のボタンに関してはまだないためこのイシューで対応する。
0
7,715
10,820,914,754
IssuesEvent
2019-11-08 17:24:43
ContaoMonitoring/monitoring-response-time-graph
https://api.github.com/repos/ContaoMonitoring/monitoring-response-time-graph
closed
Show min, max and average of the response times
Feature ⚙ - Processed
Add option to show the min, max and average of the response times.
1.0
Show min, max and average of the response times - Add option to show the min, max and average of the response times.
process
show min max and average of the response times add option to show the min max and average of the response times
1
21,335
14,528,741,492
IssuesEvent
2020-12-14 16:54:51
dart-lang/dartdoc
https://api.github.com/repos/dart-lang/dartdoc
opened
sdk-analyzer test is broken
Infrastructure P0 bug
Looks like migration of _fe_analyzer_shared broke our analysis test?
1.0
sdk-analyzer test is broken - Looks like migration of _fe_analyzer_shared broke our analysis test?
non_process
sdk analyzer test is broken looks like migration of fe analyzer shared broke our analysis test
0
129,946
5,106,993,167
IssuesEvent
2017-01-05 13:34:06
cul-2016/quiz
https://api.github.com/repos/cul-2016/quiz
opened
Email autoverification issue
priority-4
In #248 we found that when emails are sent to certain addresses (specifically *@city.ac.uk), any links in the email are automatically followed before the recipient gets the email (in the city.ac.uk case it seems to be from IP address 70.39.157.195). This means that verification links are automatically clicked, and users can verify an email address without having access to the email account (or, potentially, without the email account existing). This means: - Users can sign up with an mistyped email address but not realise until they try to reset their password - Users who are aware of the issue can sign up for a lecturer account without the appropriate license. E.g., if City paid for a license, and UCL did not, someone at UCL could set up a lecturer account using a fictitious city.ac.uk email address and get full permission to set up and run quizzes. This is a relatively minor issue, but longer term it could be patched by, among other things: - Using a two-page verification process, where the link sent in the verification email pointed to an interstitial page that then required a user to press a button to verify, which then directed them to the verification page - Using a filter on the verification page based on IP address, User Agent or similar, to differentiate between human and bot link clicks.
1.0
Email autoverification issue - In #248 we found that when emails are sent to certain addresses (specifically *@city.ac.uk), any links in the email are automatically followed before the recipient gets the email (in the city.ac.uk case it seems to be from IP address 70.39.157.195). This means that verification links are automatically clicked, and users can verify an email address without having access to the email account (or, potentially, without the email account existing). This means: - Users can sign up with an mistyped email address but not realise until they try to reset their password - Users who are aware of the issue can sign up for a lecturer account without the appropriate license. E.g., if City paid for a license, and UCL did not, someone at UCL could set up a lecturer account using a fictitious city.ac.uk email address and get full permission to set up and run quizzes. This is a relatively minor issue, but longer term it could be patched by, among other things: - Using a two-page verification process, where the link sent in the verification email pointed to an interstitial page that then required a user to press a button to verify, which then directed them to the verification page - Using a filter on the verification page based on IP address, User Agent or similar, to differentiate between human and bot link clicks.
non_process
email autoverification issue in we found that when emails are sent to certain addresses specifically city ac uk any links in the email are automatically followed before the recipient gets the email in the city ac uk case it seems to be from ip address this means that verification links are automatically clicked and users can verify an email address without having access to the email account or potentially without the email account existing this means users can sign up with an mistyped email address but not realise until they try to reset their password users who are aware of the issue can sign up for a lecturer account without the appropriate license e g if city paid for a license and ucl did not someone at ucl could set up a lecturer account using a fictitious city ac uk email address and get full permission to set up and run quizzes this is a relatively minor issue but longer term it could be patched by among other things using a two page verification process where the link sent in the verification email pointed to an interstitial page that then required a user to press a button to verify which then directed them to the verification page using a filter on the verification page based on ip address user agent or similar to differentiate between human and bot link clicks
0
30,304
7,183,039,632
IssuesEvent
2018-02-01 11:56:35
teotidev/gstudio
https://api.github.com/repos/teotidev/gstudio
opened
Display - Global scale/key changes
code work
Need to figure out how to make the transpose optional and apply scales to individual patterns that will override the global scale.
1.0
Display - Global scale/key changes - Need to figure out how to make the transpose optional and apply scales to individual patterns that will override the global scale.
non_process
display global scale key changes need to figure out how to make the transpose optional and apply scales to individual patterns that will override the global scale
0
5,589
8,443,538,304
IssuesEvent
2018-10-18 15:50:12
aspnet/IISIntegration
https://api.github.com/repos/aspnet/IISIntegration
closed
Test failure: HttpsClientCert_GetCertInformation
Branch:2.2 Branch:master in-process test-failure
``` Failed Microsoft.AspNetCore.Server.IIS.FunctionalTests.ClientCertificateTests.HttpsClientCert_GetCertInformation(variant: Server: IISExpress, TFM: netcoreapp2.2, Type: Portable, Arch: x64, ANCM: V2, Host: InProcess) Error Message: Assert.Equal() Failure  (pos 0) Expected: Enabled;BAF8529DF729239A8D4E871846FD4AA08··· Actual: <!DOCTYPE html PUBLIC "-//W3C//DTD XHTML ···  (pos 0) Stack Trace: at Microsoft.AspNetCore.Server.IIS.FunctionalTests.ClientCertificateTests.ClientCertTest(TestVariant variant, Boolean sendClientCert) in /_/test/Common.FunctionalTests/ClientCertificateTests.cs:line 77 --- End of stack trace from previous location where exception was thrown --- Standard Output Messages: | [0.001s] TestLifetime Information: Starting test HttpsClientCert_GetCertInformation-Server: IISExpress, TFM: netcoreapp2.2, Type: Portable, Arch: x64, ANCM: V2, Host: InProcess at 2018-09-20T19:04:44 | [0.002s] Microsoft.AspNetCore.Server.IntegrationTesting.IIS.IISExpressDeployer Information: Deploying [Variation] :: ServerType=IISExpress, Runtime=CoreClr, Arch=x64, BaseUrlHint=https://localhost:44300/, Publish=False | [0.002s] Microsoft.AspNetCore.Server.IntegrationTesting.IIS.IISExpressDeployer Information: Executing: dotnet C:\projects\iisintegration\test\WebSites\InProcessWebSite\bin\x64\Release\netcoreapp2.2\InProcessWebSite.dll | [0.031s] Microsoft.AspNetCore.Server.IntegrationTesting.IIS.IISExpressDeployer Information: ContentRoot: C:\projects\iisintegration\test\WebSites\InProcessWebSite | [0.032s] Microsoft.AspNetCore.Server.IntegrationTesting.IIS.IISExpressDeployer Information: Attempting to start IIS Express on port: 44300 | [0.034s] Microsoft.AspNetCore.Server.IntegrationTesting.IIS.IISExpressDeployer Debug: Saving Config to C:\Users\appveyor\AppData\Local\Temp\1\tmp831.tmp | [0.034s] Microsoft.AspNetCore.Server.IntegrationTesting.IIS.IISExpressDeployer Information: Executing command : C:\Program Files\IIS Express\iisexpress.exe /site:HttpTestSite 
/config:C:\Users\appveyor\AppData\Local\Temp\1\tmp831.tmp /trace:error /systray:false | [0.035s] Microsoft.AspNetCore.Server.IntegrationTesting.IIS.IISExpressDeployer Information: Removing environment variable ASPNETCORE_ENVIRONMENT | [0.035s] Microsoft.AspNetCore.Server.IntegrationTesting.IIS.IISExpressDeployer Information: SET ASPNETCORE_DETAILEDERRORS=true | [0.035s] Microsoft.AspNetCore.Server.IntegrationTesting.IIS.IISExpressDeployer Information: SET ASPNETCORE_MODULE_DEBUG=console | [0.035s] Microsoft.AspNetCore.Server.IntegrationTesting.IIS.IISExpressDeployer Information: SET LAUNCHER_PATH=dotnet | [0.035s] Microsoft.AspNetCore.Server.IntegrationTesting.IIS.IISExpressDeployer Information: SET LAUNCHER_ARGS=C:\projects\iisintegration\test\WebSites\InProcessWebSite\bin\x64\Release\netcoreapp2.2\InProcessWebSite.dll | [0.035s] Microsoft.AspNetCore.Server.IntegrationTesting.IIS.IISExpressDeployer Information: SET ASPNETCORE_CONTENTROOT=C:\projects\iisintegration\test\WebSites\InProcessWebSite | [0.036s] Microsoft.AspNetCore.Server.IntegrationTesting.IIS.IISExpressDeployer Information: iisexpress Process 6052 started | [0.104s] Microsoft.AspNetCore.Server.IntegrationTesting.IIS.IISExpressDeployer Information: iisexpress stdout: Starting IIS Express ... | [0.118s] Microsoft.AspNetCore.Server.IntegrationTesting.IIS.IISExpressDeployer Information: iisexpress stdout: Initializing the W3 Server Started CTC = 919656 | [0.122s] Microsoft.AspNetCore.Server.IntegrationTesting.IIS.IISExpressDeployer Information: iisexpress stdout: W3 Server initializing WinSock. CTC = 919671 | [0.122s] Microsoft.AspNetCore.Server.IntegrationTesting.IIS.IISExpressDeployer Information: iisexpress stdout: W3 Server WinSock initialized. CTC = 919671 | [0.122s] Microsoft.AspNetCore.Server.IntegrationTesting.IIS.IISExpressDeployer Information: iisexpress stdout: W3 Server ThreadPool initialized (ipm has signalled). 
CTC = 919671 | [0.140s] Microsoft.AspNetCore.Server.IntegrationTesting.IIS.IISExpressDeployer Information: iisexpress stdout: [aspnetcorev2.dll] Initializing logs for 'C:\projects\iisintegration\test\IISExpress.FunctionalTests\bin\Release\netcoreapp2.2\x64\aspnetcorev2.dll'. Process Id: 6052.. File Version: 12.2.18263.0. Description: IIS ASP.NET Core Module V2. Commit: 2f4ebec7b5c08a307c934d3a53abf2efc78a2d1b. | [0.140s] Microsoft.AspNetCore.Server.IntegrationTesting.IIS.IISExpressDeployer Information: iisexpress stdout: Start listenerChannel http:0 | [0.142s] Microsoft.AspNetCore.Server.IntegrationTesting.IIS.IISExpressDeployer Information: iisexpress stdout: Successfully registered URL "https://localhost:44300/" for site "HttpTestSite" application "/" | [0.142s] Microsoft.AspNetCore.Server.IntegrationTesting.IIS.IISExpressDeployer Information: iisexpress stdout: Registration completed for site "HttpTestSite" | [0.143s] Microsoft.AspNetCore.Server.IntegrationTesting.IIS.IISExpressDeployer Information: iisexpress stdout: AppPool 'IISExpressAppPool' initialized | [0.143s] Microsoft.AspNetCore.Server.IntegrationTesting.IIS.IISExpressDeployer Information: iisexpress stdout: InitComplete event signalled | [0.144s] Microsoft.AspNetCore.Server.IntegrationTesting.IIS.IISExpressDeployer Information: iisexpress stdout: IIS Express is running. | [0.144s] Microsoft.AspNetCore.Server.IntegrationTesting.IIS.IISExpressDeployer Information: Started iisexpress successfully. 
Process Id : 6052, Port: 44300 | [0.146s] Microsoft.AspNetCore.Server.IntegrationTesting.IIS.IISExpressDeployer Information: Application ready at URL: https://localhost:44300/ | [1.717s] HttpTestSite Debug: Method: GET, RequestUri: 'https://localhost:44300/GetClientCert', Version: 2.0, Content: <null>, Headers: | { | } | [1.729s] Microsoft.AspNetCore.Server.IntegrationTesting.IIS.IISExpressDeployer Information: iisexpress stdout: IncrementMessages called | [1.837s] Microsoft.AspNetCore.Server.IntegrationTesting.IIS.IISExpressDeployer Information: iisexpress stdout: Request ended: https://localhost:44300/GetClientCert with HTTP status 403.16 | [1.850s] HttpTestSite Warning: StatusCode: 403, ReasonPhrase: 'Forbidden', Version: 1.1, Content: System.Net.Http.HttpConnection+HttpConnectionResponseContent, Headers: | { | Cache-Control: private | Server: Microsoft-IIS/10.0 | X-SourceFiles: =?UTF-8?B?QzpccHJvamVjdHNcaWlzaW50ZWdyYXRpb25cdGVzdFxXZWJTaXRlc1xJblByb2Nlc3NXZWJTaXRlXEdldENsaWVudENlcnQ=?= | X-Powered-By: ASP.NET | Date: Thu, 20 Sep 2018 19:04:46 GMT | Content-Type: text/html; charset=utf-8 | Content-Length: 5218 | } | [1.850s] HttpTestSite Warning: <!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Strict//EN" "http://www.w3.org/TR/xhtml1/DTD/xhtml1-strict.dtd"> <html xmlns="http://www.w3.org/1999/xhtml"> <head> <title>IIS 10.0 Detailed Error - 403.16 - Forbidden</title> <style type="text/css"> <!-- body{margin:0;font-size:.7em;font-family:Verdana,Arial,Helvetica,sans-serif;} code{margin:0;color:#006600;font-size:1.1em;font-weight:bold;} .config_source code{font-size:.8em;color:#000000;} pre{margin:0;font-size:1.4em;word-wrap:break-word;} ul,ol{margin:10px 0 10px 5px;} ul.first,ol.first{margin-top:5px;} fieldset{padding:0 15px 10px 15px;word-break:break-all;} .summary-container fieldset{padding-bottom:5px;margin-top:4px;} legend.no-expand-all{padding:2px 15px 4px 10px;margin:0 0 0 -12px;} legend{color:#333333;;margin:4px 0 8px -12px;_margin-top:0px; 
font-weight:bold;font-size:1em;} a:link,a:visited{color:#007EFF;font-weight:bold;} a:hover{text-decoration:none;} h1{font-size:2.4em;margin:0;color:#FFF;} h2{font-size:1.7em;margin:0;color:#CC0000;} h3{font-size:1.4em;margin:10px 0 0 0;color:#CC0000;} h4{font-size:1.2em;margin:10px 0 5px 0; }#header{width:96%;margin:0 0 0 0;padding:6px 2% 6px 2%;font-family:"trebuchet MS",Verdana,sans-serif; color:#FFF;background-color:#5C87B2; }#content{margin:0 0 0 2%;position:relative;} .summary-container,.content-container{background:#FFF;width:96%;margin-top:8px;padding:10px;position:relative;} .content-container p{margin:0 0 10px 0; }#details-left{width:35%;float:left;margin-right:2%; }#details-right{width:63%;float:left;overflow:hidden; }#server_version{width:96%;_height:1px;min-height:1px;margin:0 0 5px 0;padding:11px 2% 8px 2%;color:#FFFFFF; background-color:#5A7FA5;border-bottom:1px solid #C1CFDD;border-top:1px solid #4A6C8E;font-weight:normal; font-size:1em;color:#FFF;text-align:right; }#server_version p{margin:5px 0;} table{margin:4px 0 4px 0;width:100%;border:none;} td,th{vertical-align:top;padding:3px 0;text-align:left;font-weight:normal;border:none;} th{width:30%;text-align:right;padding-right:2%;font-weight:bold;} thead th{background-color:#ebebeb;width:25%; }#details-right th{width:20%;} table tr.alt td,table tr.alt th{} .highlight-code{color:#CC0000;font-weight:bold;font-style:italic;} .clear{clear:both;} .preferred{padding:0 5px 2px 5px;font-weight:normal;background:#006633;color:#FFF;font-size:.8em;} --> </style> </head> <body> <div id="content"> <div class="content-container"> <h3>HTTP Error 403.16 - Forbidden</h3> <h4>Your client certificate is either not trusted or is invalid.</h4> </div> <div class="content-container"> <fieldset><h4>Most likely causes:</h4> <ul> <li>The client certificate used for this request is not trusted by the Web server.</li> </ul> </fieldset> </div> <div class="content-container"> <fieldset><h4>Things you can try:</h4> <ul> <li>The 
client may have an old certificate selected for client authentication to this Web site. Close all open client windows, open a new browser window, and then select a valid certificate for client authentication.</li> <li>Verify that the client certificate is trusted by the Web server.</li> <li>Verify that the root certificate is properly installed and trusted on the Web server.</li> <li>Check the failed request tracing logs for additional information about this error. For more information, click <a href="http://go.microsoft.com/fwlink/?LinkID=66439">here</a>. </li> </ul> </fieldset> </div> <div class="content-container"> <fieldset><h4>Detailed Error Information:</h4> <div id="details-left"> <table border="0" cellpadding="0" cellspacing="0"> <tr class="alt"><th>Module</th><td> IIS Web Core</td></tr> <tr><th>Notification</th><td> BeginRequest</td></tr> <tr class="alt"><th>Handler</th><td> aspNetCore</td></tr> <tr><th>Error Code</th><td> 0x800b0109</td></tr> </table> </div> <div id="details-right"> <table border="0" cellpadding="0" cellspacing="0"> <tr class="alt"><th>Requested URL</th><td> https://localhost:44300/GetClientCert</td></tr> <tr><th>Physical Path</th><td> C:\projects\iisintegration\test\WebSites\InProcessWebSite\GetClientCert</td></tr> <tr class="alt"><th>Logon Method</th><td> Not yet determined</td></tr> <tr><th>Logon User</th><td> Not yet determined</td></tr> <tr class="alt"><th>Request Tracing Directory</th><td> C:\Users\appveyor\Documents\IISExpress\TraceLogFiles\HTTPTESTSITE</td></tr> </table> <div class="clear"></div> </div> </fieldset> </div> <div class="content-container"> <fieldset><h4>More Information:</h4> A Secure Sockets Layer (SSL) client certificate identifies you as a valid user of the resource. This error can occur if you choose a client certificate created by a Certificate Authority (CA) that is not trusted by the Web server. 
<p><a href="https://go.microsoft.com/fwlink/?LinkID=62293&IIS70Error=403,16,0x800b0109,14393">View more information »</a></p> </fieldset> </div> </div> </body> </html> | [1.853s] Microsoft.AspNetCore.Server.IntegrationTesting.IIS.IISExpressDeployer Information: Attempting to cancel process 6052 | [1.915s] Microsoft.AspNetCore.Server.IntegrationTesting.IIS.IISExpressDeployer Information: iisexpress Process 6052 shut down | [1.915s] Microsoft.AspNetCore.Server.IntegrationTesting.IIS.IISExpressDeployer Information: Host process shutting down. | [1.917s] Microsoft.AspNetCore.Server.IntegrationTesting.IIS.IISExpressDeployer Information: Successfully terminated host process with process Id '6052' | [1.917s] Microsoft.AspNetCore.Server.IntegrationTesting.IIS.IISExpressDeployer Debug: Deleting applicationHost.config file from C:\Users\appveyor\AppData\Local\Temp\1\tmp831.tmp | [1.918s] Microsoft.AspNetCore.Server.IntegrationTesting.IIS.IISExpressDeployer Information: [Time]: Total time taken for this test variation '1.9160615' seconds ``` cc @jkotalik
1.0
Test failure: HttpsClientCert_GetCertInformation - ``` Failed Microsoft.AspNetCore.Server.IIS.FunctionalTests.ClientCertificateTests.HttpsClientCert_GetCertInformation(variant: Server: IISExpress, TFM: netcoreapp2.2, Type: Portable, Arch: x64, ANCM: V2, Host: InProcess) Error Message: Assert.Equal() Failure  (pos 0) Expected: Enabled;BAF8529DF729239A8D4E871846FD4AA08··· Actual: <!DOCTYPE html PUBLIC "-//W3C//DTD XHTML ···  (pos 0) Stack Trace: at Microsoft.AspNetCore.Server.IIS.FunctionalTests.ClientCertificateTests.ClientCertTest(TestVariant variant, Boolean sendClientCert) in /_/test/Common.FunctionalTests/ClientCertificateTests.cs:line 77 --- End of stack trace from previous location where exception was thrown --- Standard Output Messages: | [0.001s] TestLifetime Information: Starting test HttpsClientCert_GetCertInformation-Server: IISExpress, TFM: netcoreapp2.2, Type: Portable, Arch: x64, ANCM: V2, Host: InProcess at 2018-09-20T19:04:44 | [0.002s] Microsoft.AspNetCore.Server.IntegrationTesting.IIS.IISExpressDeployer Information: Deploying [Variation] :: ServerType=IISExpress, Runtime=CoreClr, Arch=x64, BaseUrlHint=https://localhost:44300/, Publish=False | [0.002s] Microsoft.AspNetCore.Server.IntegrationTesting.IIS.IISExpressDeployer Information: Executing: dotnet C:\projects\iisintegration\test\WebSites\InProcessWebSite\bin\x64\Release\netcoreapp2.2\InProcessWebSite.dll | [0.031s] Microsoft.AspNetCore.Server.IntegrationTesting.IIS.IISExpressDeployer Information: ContentRoot: C:\projects\iisintegration\test\WebSites\InProcessWebSite | [0.032s] Microsoft.AspNetCore.Server.IntegrationTesting.IIS.IISExpressDeployer Information: Attempting to start IIS Express on port: 44300 | [0.034s] Microsoft.AspNetCore.Server.IntegrationTesting.IIS.IISExpressDeployer Debug: Saving Config to C:\Users\appveyor\AppData\Local\Temp\1\tmp831.tmp | [0.034s] Microsoft.AspNetCore.Server.IntegrationTesting.IIS.IISExpressDeployer Information: Executing command : C:\Program Files\IIS 
Express\iisexpress.exe /site:HttpTestSite /config:C:\Users\appveyor\AppData\Local\Temp\1\tmp831.tmp /trace:error /systray:false | [0.035s] Microsoft.AspNetCore.Server.IntegrationTesting.IIS.IISExpressDeployer Information: Removing environment variable ASPNETCORE_ENVIRONMENT | [0.035s] Microsoft.AspNetCore.Server.IntegrationTesting.IIS.IISExpressDeployer Information: SET ASPNETCORE_DETAILEDERRORS=true | [0.035s] Microsoft.AspNetCore.Server.IntegrationTesting.IIS.IISExpressDeployer Information: SET ASPNETCORE_MODULE_DEBUG=console | [0.035s] Microsoft.AspNetCore.Server.IntegrationTesting.IIS.IISExpressDeployer Information: SET LAUNCHER_PATH=dotnet | [0.035s] Microsoft.AspNetCore.Server.IntegrationTesting.IIS.IISExpressDeployer Information: SET LAUNCHER_ARGS=C:\projects\iisintegration\test\WebSites\InProcessWebSite\bin\x64\Release\netcoreapp2.2\InProcessWebSite.dll | [0.035s] Microsoft.AspNetCore.Server.IntegrationTesting.IIS.IISExpressDeployer Information: SET ASPNETCORE_CONTENTROOT=C:\projects\iisintegration\test\WebSites\InProcessWebSite | [0.036s] Microsoft.AspNetCore.Server.IntegrationTesting.IIS.IISExpressDeployer Information: iisexpress Process 6052 started | [0.104s] Microsoft.AspNetCore.Server.IntegrationTesting.IIS.IISExpressDeployer Information: iisexpress stdout: Starting IIS Express ... | [0.118s] Microsoft.AspNetCore.Server.IntegrationTesting.IIS.IISExpressDeployer Information: iisexpress stdout: Initializing the W3 Server Started CTC = 919656 | [0.122s] Microsoft.AspNetCore.Server.IntegrationTesting.IIS.IISExpressDeployer Information: iisexpress stdout: W3 Server initializing WinSock. CTC = 919671 | [0.122s] Microsoft.AspNetCore.Server.IntegrationTesting.IIS.IISExpressDeployer Information: iisexpress stdout: W3 Server WinSock initialized. CTC = 919671 | [0.122s] Microsoft.AspNetCore.Server.IntegrationTesting.IIS.IISExpressDeployer Information: iisexpress stdout: W3 Server ThreadPool initialized (ipm has signalled). 
CTC = 919671 | [0.140s] Microsoft.AspNetCore.Server.IntegrationTesting.IIS.IISExpressDeployer Information: iisexpress stdout: [aspnetcorev2.dll] Initializing logs for 'C:\projects\iisintegration\test\IISExpress.FunctionalTests\bin\Release\netcoreapp2.2\x64\aspnetcorev2.dll'. Process Id: 6052.. File Version: 12.2.18263.0. Description: IIS ASP.NET Core Module V2. Commit: 2f4ebec7b5c08a307c934d3a53abf2efc78a2d1b. | [0.140s] Microsoft.AspNetCore.Server.IntegrationTesting.IIS.IISExpressDeployer Information: iisexpress stdout: Start listenerChannel http:0 | [0.142s] Microsoft.AspNetCore.Server.IntegrationTesting.IIS.IISExpressDeployer Information: iisexpress stdout: Successfully registered URL "https://localhost:44300/" for site "HttpTestSite" application "/" | [0.142s] Microsoft.AspNetCore.Server.IntegrationTesting.IIS.IISExpressDeployer Information: iisexpress stdout: Registration completed for site "HttpTestSite" | [0.143s] Microsoft.AspNetCore.Server.IntegrationTesting.IIS.IISExpressDeployer Information: iisexpress stdout: AppPool 'IISExpressAppPool' initialized | [0.143s] Microsoft.AspNetCore.Server.IntegrationTesting.IIS.IISExpressDeployer Information: iisexpress stdout: InitComplete event signalled | [0.144s] Microsoft.AspNetCore.Server.IntegrationTesting.IIS.IISExpressDeployer Information: iisexpress stdout: IIS Express is running. | [0.144s] Microsoft.AspNetCore.Server.IntegrationTesting.IIS.IISExpressDeployer Information: Started iisexpress successfully. 
Process Id : 6052, Port: 44300 | [0.146s] Microsoft.AspNetCore.Server.IntegrationTesting.IIS.IISExpressDeployer Information: Application ready at URL: https://localhost:44300/ | [1.717s] HttpTestSite Debug: Method: GET, RequestUri: 'https://localhost:44300/GetClientCert', Version: 2.0, Content: <null>, Headers: | { | } | [1.729s] Microsoft.AspNetCore.Server.IntegrationTesting.IIS.IISExpressDeployer Information: iisexpress stdout: IncrementMessages called | [1.837s] Microsoft.AspNetCore.Server.IntegrationTesting.IIS.IISExpressDeployer Information: iisexpress stdout: Request ended: https://localhost:44300/GetClientCert with HTTP status 403.16 | [1.850s] HttpTestSite Warning: StatusCode: 403, ReasonPhrase: 'Forbidden', Version: 1.1, Content: System.Net.Http.HttpConnection+HttpConnectionResponseContent, Headers: | { | Cache-Control: private | Server: Microsoft-IIS/10.0 | X-SourceFiles: =?UTF-8?B?QzpccHJvamVjdHNcaWlzaW50ZWdyYXRpb25cdGVzdFxXZWJTaXRlc1xJblByb2Nlc3NXZWJTaXRlXEdldENsaWVudENlcnQ=?= | X-Powered-By: ASP.NET | Date: Thu, 20 Sep 2018 19:04:46 GMT | Content-Type: text/html; charset=utf-8 | Content-Length: 5218 | } | [1.850s] HttpTestSite Warning: <!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Strict//EN" "http://www.w3.org/TR/xhtml1/DTD/xhtml1-strict.dtd"> <html xmlns="http://www.w3.org/1999/xhtml"> <head> <title>IIS 10.0 Detailed Error - 403.16 - Forbidden</title> <style type="text/css"> <!-- body{margin:0;font-size:.7em;font-family:Verdana,Arial,Helvetica,sans-serif;} code{margin:0;color:#006600;font-size:1.1em;font-weight:bold;} .config_source code{font-size:.8em;color:#000000;} pre{margin:0;font-size:1.4em;word-wrap:break-word;} ul,ol{margin:10px 0 10px 5px;} ul.first,ol.first{margin-top:5px;} fieldset{padding:0 15px 10px 15px;word-break:break-all;} .summary-container fieldset{padding-bottom:5px;margin-top:4px;} legend.no-expand-all{padding:2px 15px 4px 10px;margin:0 0 0 -12px;} legend{color:#333333;;margin:4px 0 8px -12px;_margin-top:0px; 
font-weight:bold;font-size:1em;} a:link,a:visited{color:#007EFF;font-weight:bold;} a:hover{text-decoration:none;} h1{font-size:2.4em;margin:0;color:#FFF;} h2{font-size:1.7em;margin:0;color:#CC0000;} h3{font-size:1.4em;margin:10px 0 0 0;color:#CC0000;} h4{font-size:1.2em;margin:10px 0 5px 0; }#header{width:96%;margin:0 0 0 0;padding:6px 2% 6px 2%;font-family:"trebuchet MS",Verdana,sans-serif; color:#FFF;background-color:#5C87B2; }#content{margin:0 0 0 2%;position:relative;} .summary-container,.content-container{background:#FFF;width:96%;margin-top:8px;padding:10px;position:relative;} .content-container p{margin:0 0 10px 0; }#details-left{width:35%;float:left;margin-right:2%; }#details-right{width:63%;float:left;overflow:hidden; }#server_version{width:96%;_height:1px;min-height:1px;margin:0 0 5px 0;padding:11px 2% 8px 2%;color:#FFFFFF; background-color:#5A7FA5;border-bottom:1px solid #C1CFDD;border-top:1px solid #4A6C8E;font-weight:normal; font-size:1em;color:#FFF;text-align:right; }#server_version p{margin:5px 0;} table{margin:4px 0 4px 0;width:100%;border:none;} td,th{vertical-align:top;padding:3px 0;text-align:left;font-weight:normal;border:none;} th{width:30%;text-align:right;padding-right:2%;font-weight:bold;} thead th{background-color:#ebebeb;width:25%; }#details-right th{width:20%;} table tr.alt td,table tr.alt th{} .highlight-code{color:#CC0000;font-weight:bold;font-style:italic;} .clear{clear:both;} .preferred{padding:0 5px 2px 5px;font-weight:normal;background:#006633;color:#FFF;font-size:.8em;} --> </style> </head> <body> <div id="content"> <div class="content-container"> <h3>HTTP Error 403.16 - Forbidden</h3> <h4>Your client certificate is either not trusted or is invalid.</h4> </div> <div class="content-container"> <fieldset><h4>Most likely causes:</h4> <ul> <li>The client certificate used for this request is not trusted by the Web server.</li> </ul> </fieldset> </div> <div class="content-container"> <fieldset><h4>Things you can try:</h4> <ul> <li>The 
client may have an old certificate selected for client authentication to this Web site. Close all open client windows, open a new browser window, and then select a valid certificate for client authentication.</li> <li>Verify that the client certificate is trusted by the Web server.</li> <li>Verify that the root certificate is properly installed and trusted on the Web server.</li> <li>Check the failed request tracing logs for additional information about this error. For more information, click <a href="http://go.microsoft.com/fwlink/?LinkID=66439">here</a>. </li> </ul> </fieldset> </div> <div class="content-container"> <fieldset><h4>Detailed Error Information:</h4> <div id="details-left"> <table border="0" cellpadding="0" cellspacing="0"> <tr class="alt"><th>Module</th><td> IIS Web Core</td></tr> <tr><th>Notification</th><td> BeginRequest</td></tr> <tr class="alt"><th>Handler</th><td> aspNetCore</td></tr> <tr><th>Error Code</th><td> 0x800b0109</td></tr> </table> </div> <div id="details-right"> <table border="0" cellpadding="0" cellspacing="0"> <tr class="alt"><th>Requested URL</th><td> https://localhost:44300/GetClientCert</td></tr> <tr><th>Physical Path</th><td> C:\projects\iisintegration\test\WebSites\InProcessWebSite\GetClientCert</td></tr> <tr class="alt"><th>Logon Method</th><td> Not yet determined</td></tr> <tr><th>Logon User</th><td> Not yet determined</td></tr> <tr class="alt"><th>Request Tracing Directory</th><td> C:\Users\appveyor\Documents\IISExpress\TraceLogFiles\HTTPTESTSITE</td></tr> </table> <div class="clear"></div> </div> </fieldset> </div> <div class="content-container"> <fieldset><h4>More Information:</h4> A Secure Sockets Layer (SSL) client certificate identifies you as a valid user of the resource. This error can occur if you choose a client certificate created by a Certificate Authority (CA) that is not trusted by the Web server. 
<p><a href="https://go.microsoft.com/fwlink/?LinkID=62293&IIS70Error=403,16,0x800b0109,14393">View more information »</a></p> </fieldset> </div> </div> </body> </html> | [1.853s] Microsoft.AspNetCore.Server.IntegrationTesting.IIS.IISExpressDeployer Information: Attempting to cancel process 6052 | [1.915s] Microsoft.AspNetCore.Server.IntegrationTesting.IIS.IISExpressDeployer Information: iisexpress Process 6052 shut down | [1.915s] Microsoft.AspNetCore.Server.IntegrationTesting.IIS.IISExpressDeployer Information: Host process shutting down. | [1.917s] Microsoft.AspNetCore.Server.IntegrationTesting.IIS.IISExpressDeployer Information: Successfully terminated host process with process Id '6052' | [1.917s] Microsoft.AspNetCore.Server.IntegrationTesting.IIS.IISExpressDeployer Debug: Deleting applicationHost.config file from C:\Users\appveyor\AppData\Local\Temp\1\tmp831.tmp | [1.918s] Microsoft.AspNetCore.Server.IntegrationTesting.IIS.IISExpressDeployer Information: [Time]: Total time taken for this test variation '1.9160615' seconds ``` cc @jkotalik
process
test failure httpsclientcert getcertinformation failed microsoft aspnetcore server iis functionaltests clientcertificatetests httpsclientcert getcertinformation variant server iisexpress tfm type portable arch ancm host inprocess error message assert equal failure  pos expected enabled ··· actual doctype html public dtd xhtml ···  pos stack trace at microsoft aspnetcore server iis functionaltests clientcertificatetests clientcerttest testvariant variant boolean sendclientcert in test common functionaltests clientcertificatetests cs line end of stack trace from previous location where exception was thrown standard output messages testlifetime information starting test httpsclientcert getcertinformation server iisexpress tfm type portable arch ancm host inprocess at microsoft aspnetcore server integrationtesting iis iisexpressdeployer information deploying servertype iisexpress runtime coreclr arch baseurlhint publish false microsoft aspnetcore server integrationtesting iis iisexpressdeployer information executing dotnet c projects iisintegration test websites inprocesswebsite bin release inprocesswebsite dll microsoft aspnetcore server integrationtesting iis iisexpressdeployer information contentroot c projects iisintegration test websites inprocesswebsite microsoft aspnetcore server integrationtesting iis iisexpressdeployer information attempting to start iis express on port microsoft aspnetcore server integrationtesting iis iisexpressdeployer debug saving config to c users appveyor appdata local temp tmp microsoft aspnetcore server integrationtesting iis iisexpressdeployer information executing command c program files iis express iisexpress exe site httptestsite config c users appveyor appdata local temp tmp trace error systray false microsoft aspnetcore server integrationtesting iis iisexpressdeployer information removing environment variable aspnetcore environment microsoft aspnetcore server integrationtesting iis iisexpressdeployer information set aspnetcore 
detailederrors true microsoft aspnetcore server integrationtesting iis iisexpressdeployer information set aspnetcore module debug console microsoft aspnetcore server integrationtesting iis iisexpressdeployer information set launcher path dotnet microsoft aspnetcore server integrationtesting iis iisexpressdeployer information set launcher args c projects iisintegration test websites inprocesswebsite bin release inprocesswebsite dll microsoft aspnetcore server integrationtesting iis iisexpressdeployer information set aspnetcore contentroot c projects iisintegration test websites inprocesswebsite microsoft aspnetcore server integrationtesting iis iisexpressdeployer information iisexpress process started microsoft aspnetcore server integrationtesting iis iisexpressdeployer information iisexpress stdout starting iis express microsoft aspnetcore server integrationtesting iis iisexpressdeployer information iisexpress stdout initializing the server started ctc microsoft aspnetcore server integrationtesting iis iisexpressdeployer information iisexpress stdout server initializing winsock ctc microsoft aspnetcore server integrationtesting iis iisexpressdeployer information iisexpress stdout server winsock initialized ctc microsoft aspnetcore server integrationtesting iis iisexpressdeployer information iisexpress stdout server threadpool initialized ipm has signalled ctc microsoft aspnetcore server integrationtesting iis iisexpressdeployer information iisexpress stdout initializing logs for c projects iisintegration test iisexpress functionaltests bin release dll process id file version description iis asp net core module commit microsoft aspnetcore server integrationtesting iis iisexpressdeployer information iisexpress stdout start listenerchannel http microsoft aspnetcore server integrationtesting iis iisexpressdeployer information iisexpress stdout successfully registered url for site httptestsite application microsoft aspnetcore server integrationtesting iis 
iisexpressdeployer information iisexpress stdout registration completed for site httptestsite microsoft aspnetcore server integrationtesting iis iisexpressdeployer information iisexpress stdout apppool iisexpressapppool initialized microsoft aspnetcore server integrationtesting iis iisexpressdeployer information iisexpress stdout initcomplete event signalled microsoft aspnetcore server integrationtesting iis iisexpressdeployer information iisexpress stdout iis express is running microsoft aspnetcore server integrationtesting iis iisexpressdeployer information started iisexpress successfully process id port microsoft aspnetcore server integrationtesting iis iisexpressdeployer information application ready at url httptestsite debug method get requesturi version content headers microsoft aspnetcore server integrationtesting iis iisexpressdeployer information iisexpress stdout incrementmessages called microsoft aspnetcore server integrationtesting iis iisexpressdeployer information iisexpress stdout request ended with http status httptestsite warning statuscode reasonphrase forbidden version content system net http httpconnection httpconnectionresponsecontent headers cache control private server microsoft iis x sourcefiles utf b x powered by asp net date thu sep gmt content type text html charset utf content length httptestsite warning doctype html public dtd xhtml strict en html xmlns iis detailed error forbidden body margin font size font family verdana arial helvetica sans serif code margin color font size font weight bold config source code font size color pre margin font size word wrap break word ul ol margin ul first ol first margin top fieldset padding word break break all summary container fieldset padding bottom margin top legend no expand all padding margin legend color margin margin top font weight bold font size a link a visited color font weight bold a hover text decoration none font size margin color fff font size margin color font size margin color font 
size margin header width margin padding font family trebuchet ms verdana sans serif color fff background color content margin position relative summary container content container background fff width margin top padding position relative content container p margin details left width float left margin right details right width float left overflow hidden server version width height min height margin padding color ffffff background color border bottom solid border top solid font weight normal font size color fff text align right server version p margin table margin width border none td th vertical align top padding text align left font weight normal border none th width text align right padding right font weight bold thead th background color ebebeb width details right th width table tr alt td table tr alt th highlight code color font weight bold font style italic clear clear both preferred padding font weight normal background color fff font size http error forbidden your client certificate is either not trusted or is invalid most likely causes the client certificate used for this request is not trusted by the web server things you can try the client may have an old certificate selected for client authentication to this web site close all open client windows open a new browser window and then select a valid certificate for client authentication verify that the client certificate is trusted by the web server verify that the root certificate is properly installed and trusted on the web server check the failed request tracing logs for additional information about this error for more information click detailed error information module iis web core notification beginrequest handler aspnetcore error code requested url physical path c projects iisintegration test websites inprocesswebsite getclientcert logon method not yet determined logon user not yet determined request tracing directory c users appveyor documents iisexpress tracelogfiles httptestsite more information a 
secure sockets layer ssl client certificate identifies you as a valid user of the resource this error can occur if you choose a client certificate created by a certificate authority ca that is not trusted by the web server microsoft aspnetcore server integrationtesting iis iisexpressdeployer information attempting to cancel process microsoft aspnetcore server integrationtesting iis iisexpressdeployer information iisexpress process shut down microsoft aspnetcore server integrationtesting iis iisexpressdeployer information host process shutting down microsoft aspnetcore server integrationtesting iis iisexpressdeployer information successfully terminated host process with process id microsoft aspnetcore server integrationtesting iis iisexpressdeployer debug deleting applicationhost config file from c users appveyor appdata local temp tmp microsoft aspnetcore server integrationtesting iis iisexpressdeployer information total time taken for this test variation seconds cc jkotalik
1
18,295
24,402,858,055
IssuesEvent
2022-10-05 04:08:01
swig/swig
https://api.github.com/repos/swig/swig
closed
Warn about invalid preprocessor expressions by default?
preprocessor
It seems SWIG quietly accepts `#if define XXX` (where `define` is an easy typo to make for `defined`). Testing with current git master (3.0.12 behaves the same way): ``` $ cat test.i #if define SWIGJAVA #error bang #endif $ ./preinst-swig -module foo -java test.i $ ./preinst-swig -module foo -python test.i $ ``` I would expect that an invalid expression in `#if` would give an error. GCC gives an error: ``` $ g++ -xc++ -E test.i # 1 "test.i" # 1 "<built-in>" # 1 "<command-line>" # 1 "/usr/include/stdc-predef.h" 1 3 4 # 1 "<command-line>" 2 # 1 "test.i" test.i:1:12: error: missing binary operator before token "SWIGJAVA" #if define SWIGJAVA ^~~~~~~~ ```
1.0
Warn about invalid preprocessor expressions by default? - It seems SWIG quietly accepts `#if define XXX` (where `define` is an easy typo to make for `defined`). Testing with current git master (3.0.12 behaves the same way): ``` $ cat test.i #if define SWIGJAVA #error bang #endif $ ./preinst-swig -module foo -java test.i $ ./preinst-swig -module foo -python test.i $ ``` I would expect that an invalid expression in `#if` would give an error. GCC gives an error: ``` $ g++ -xc++ -E test.i # 1 "test.i" # 1 "<built-in>" # 1 "<command-line>" # 1 "/usr/include/stdc-predef.h" 1 3 4 # 1 "<command-line>" 2 # 1 "test.i" test.i:1:12: error: missing binary operator before token "SWIGJAVA" #if define SWIGJAVA ^~~~~~~~ ```
process
warn about invalid preprocessor expressions by default it seems swig quietly accepts if define xxx where define is an easy typo to make for defined testing with current git master behaves the same way cat test i if define swigjava error bang endif preinst swig module foo java test i preinst swig module foo python test i i would expect that an invalid expression in if would give an error gcc gives an error g xc e test i test i usr include stdc predef h test i test i error missing binary operator before token swigjava if define swigjava
1
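The `#if define` / `#if defined` typo in the SWIG record above is easy to flag mechanically. A minimal sketch of such a check (the regex, function name, and messages are illustrative only — SWIG's actual preprocessor is written in C and works on a token stream, not on raw lines):

```python
import re

# Matches "#if define X" or "#elif define X" where "defined" was almost
# certainly intended. Note "#ifdef" and "#if defined(X)" do NOT match:
# "ifdef" fails the word boundary after "if", and "defined" is never
# followed by whitespace right after the letters "define".
_TYPO = re.compile(r"^\s*#\s*(?:el)?if\b.*\bdefine\s+(\w+)")

def find_define_typos(source):
    """Return (line_number, macro) pairs for likely '#if define X' typos."""
    hits = []
    for lineno, line in enumerate(source.splitlines(), start=1):
        m = _TYPO.match(line)
        if m:
            hits.append((lineno, m.group(1)))
    return hits
```

Run against the reproducer from the issue, this flags line 1 of `test.i` while leaving the correct `#if defined(SWIGJAVA)` and `#ifdef SWIGJAVA` spellings alone.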
228,003
18,149,757,244
IssuesEvent
2021-09-26 04:05:45
Dj-Viking/vue3-typescript-bare-starter
https://api.github.com/repos/Dj-Viking/vue3-typescript-bare-starter
closed
FrontEnd: look up cypress local storage commands
Frontend Testing
because cypress trashes local storage a lot while running tests. and makes tests flaky that depend on local storage!! https://www.npmjs.com/package/cypress-localstorage-commands
1.0
FrontEnd: look up cypress local storage commands - because cypress trashes local storage a lot while running tests. and makes tests flaky that depend on local storage!! https://www.npmjs.com/package/cypress-localstorage-commands
non_process
frontend look up cypress local storage commands because cypress trashes local storage a lot while running tests and makes tests flaky that depend on local storage
0
328,873
28,139,024,654
IssuesEvent
2023-04-01 18:25:16
opentibiabr/canary
https://api.github.com/repos/opentibiabr/canary
closed
Depo crashed server
Type: Bug Priority: High Status: Pending Test
### Priority High ### Area - [ ] Datapack - [ ] Source - [ ] Map - [ ] Other ### What happened? This error crashed server and duplicated item from depo box to inbox container ### What OS are you seeing the problem on? Windows ### Code of Conduct - [X] I agree to follow this project's Code of Conduct
1.0
Depo crashed server - ### Priority High ### Area - [ ] Datapack - [ ] Source - [ ] Map - [ ] Other ### What happened? This error crashed server and duplicated item from depo box to inbox container ### What OS are you seeing the problem on? Windows ### Code of Conduct - [X] I agree to follow this project's Code of Conduct
non_process
depo crashed server priority high area datapack source map other what happened this error crashed server and duplicated item from depo box to inbox container what os are you seeing the problem on windows code of conduct i agree to follow this project s code of conduct
0
811,267
30,281,480,074
IssuesEvent
2023-07-08 05:34:23
Velocimeter/frontend
https://api.github.com/repos/Velocimeter/frontend
closed
Feat request - fallback router
wontfix least priority
Users and dunks wants to be able to fallback to default router when firebird is not connected so that they are able to do a trade.
1.0
Feat request - fallback router - Users and dunks wants to be able to fallback to default router when firebird is not connected so that they are able to do a trade.
non_process
feat request fallback router users and dunks wants to be able to fallback to default router when firebird is not connected so that they are able to do a trade
0
1,001
3,469,808,852
IssuesEvent
2015-12-23 00:51:57
DoSomething/MessageBroker-PHP
https://api.github.com/repos/DoSomething/MessageBroker-PHP
closed
Add itok to image request
mbc-image-processor
HTTP requests for reportback images need to include the `itok=<random string>` parameter .
1.0
Add itok to image request - HTTP requests for reportback images need to include the `itok=<random string>` parameter .
process
add itok to image request http requests for reportback images need to include the itok parameter
1
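The record above asks that reportback image requests carry an `itok=<random string>` query parameter. One way to append it, sketched with the standard library (only the parameter name comes from the issue; the helper name and token length are assumptions):

```python
import secrets
from urllib.parse import parse_qsl, urlencode, urlparse, urlunparse

def add_itok(url, token=None):
    """Append an itok=<random string> parameter to an image URL."""
    # Token length of 8 url-safe bytes is an arbitrary illustrative choice.
    token = token or secrets.token_urlsafe(8)
    parts = urlparse(url)
    query = parse_qsl(parts.query)   # keep any existing parameters
    query.append(("itok", token))
    return urlunparse(parts._replace(query=urlencode(query)))
```

Existing query strings are preserved, so `?w=100` becomes `?w=100&itok=...` rather than being overwritten.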
13,704
16,460,650,274
IssuesEvent
2021-05-21 18:26:53
graphql/graphql-spec
https://api.github.com/repos/graphql/graphql-spec
closed
Update legal & contribution text in spec
✏️ Editorial 🐝 Process
https://spec.graphql.org/draft/#sec-Introduction and https://spec.graphql.org/draft/#sec-Copyright-notice need to be updated with latest rules for governance and highlight contributors
1.0
Update legal & contribution text in spec - https://spec.graphql.org/draft/#sec-Introduction and https://spec.graphql.org/draft/#sec-Copyright-notice need to be updated with latest rules for governance and highlight contributors
process
update legal contribution text in spec and need to be updated with latest rules for governance and highlight contributors
1
55,442
14,008,921,273
IssuesEvent
2020-10-29 00:58:30
mwilliams7197/bootstrap
https://api.github.com/repos/mwilliams7197/bootstrap
closed
CVE-2018-20821 (Medium) detected in multiple libraries - autoclosed
security vulnerability
## CVE-2018-20821 - Medium Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Libraries - <b>node-sass-4.11.0.tgz</b>, <b>opennmsopennms-source-25.1.0-1</b>, <b>opennmsopennms-source-24.1.2-1</b></p></summary> <p> <details><summary><b>node-sass-4.11.0.tgz</b></p></summary> <p>Wrapper around libsass</p> <p>Library home page: <a href="https://registry.npmjs.org/node-sass/-/node-sass-4.11.0.tgz">https://registry.npmjs.org/node-sass/-/node-sass-4.11.0.tgz</a></p> <p>Path to dependency file: bootstrap/package.json</p> <p>Path to vulnerable library: bootstrap/node_modules/node-sass/package.json</p> <p> Dependency Hierarchy: - :x: **node-sass-4.11.0.tgz** (Vulnerable Library) </details> <p>Found in base branch: <b>v4-dev</b></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary> <p> The parsing component in LibSass through 3.5.5 allows attackers to cause a denial-of-service (uncontrolled recursion in Sass::Parser::parse_css_variable_value in parser.cpp). <p>Publish Date: 2019-04-23 <p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2018-20821>CVE-2018-20821</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>6.5</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: Low - Privileges Required: None - User Interaction: Required - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: None - Integrity Impact: None - Availability Impact: High </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. 
</p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2018-20821">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2018-20821</a></p> <p>Release Date: 2019-04-23</p> <p>Fix Resolution: LibSass - 3.6.0</p> </p> </details> <p></p> *** <!-- REMEDIATE-OPEN-PR-START --> - [ ] Check this box to open an automated fix PR <!-- REMEDIATE-OPEN-PR-END --> <!-- <REMEDIATE>{"isOpenPROnVulnerability":false,"isPackageBased":true,"isDefaultBranch":true,"packages":[{"packageType":"javascript/Node.js","packageName":"node-sass","packageVersion":"4.11.0","isTransitiveDependency":false,"dependencyTree":"node-sass:4.11.0","isMinimumFixVersionAvailable":true,"minimumFixVersion":"LibSass - 3.6.0"}],"vulnerabilityIdentifier":"CVE-2018-20821","vulnerabilityDetails":"The parsing component in LibSass through 3.5.5 allows attackers to cause a denial-of-service (uncontrolled recursion in Sass::Parser::parse_css_variable_value in parser.cpp).","vulnerabilityUrl":"https://vuln.whitesourcesoftware.com/vulnerability/CVE-2018-20821","cvss3Severity":"medium","cvss3Score":"6.5","cvss3Metrics":{"A":"High","AC":"Low","PR":"None","S":"Unchanged","C":"None","UI":"Required","AV":"Network","I":"None"},"extraData":{}}</REMEDIATE> -->
True
CVE-2018-20821 (Medium) detected in multiple libraries - autoclosed - ## CVE-2018-20821 - Medium Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Libraries - <b>node-sass-4.11.0.tgz</b>, <b>opennmsopennms-source-25.1.0-1</b>, <b>opennmsopennms-source-24.1.2-1</b></p></summary> <p> <details><summary><b>node-sass-4.11.0.tgz</b></p></summary> <p>Wrapper around libsass</p> <p>Library home page: <a href="https://registry.npmjs.org/node-sass/-/node-sass-4.11.0.tgz">https://registry.npmjs.org/node-sass/-/node-sass-4.11.0.tgz</a></p> <p>Path to dependency file: bootstrap/package.json</p> <p>Path to vulnerable library: bootstrap/node_modules/node-sass/package.json</p> <p> Dependency Hierarchy: - :x: **node-sass-4.11.0.tgz** (Vulnerable Library) </details> <p>Found in base branch: <b>v4-dev</b></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary> <p> The parsing component in LibSass through 3.5.5 allows attackers to cause a denial-of-service (uncontrolled recursion in Sass::Parser::parse_css_variable_value in parser.cpp). <p>Publish Date: 2019-04-23 <p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2018-20821>CVE-2018-20821</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>6.5</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: Low - Privileges Required: None - User Interaction: Required - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: None - Integrity Impact: None - Availability Impact: High </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. 
</p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2018-20821">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2018-20821</a></p> <p>Release Date: 2019-04-23</p> <p>Fix Resolution: LibSass - 3.6.0</p> </p> </details> <p></p> *** <!-- REMEDIATE-OPEN-PR-START --> - [ ] Check this box to open an automated fix PR <!-- REMEDIATE-OPEN-PR-END --> <!-- <REMEDIATE>{"isOpenPROnVulnerability":false,"isPackageBased":true,"isDefaultBranch":true,"packages":[{"packageType":"javascript/Node.js","packageName":"node-sass","packageVersion":"4.11.0","isTransitiveDependency":false,"dependencyTree":"node-sass:4.11.0","isMinimumFixVersionAvailable":true,"minimumFixVersion":"LibSass - 3.6.0"}],"vulnerabilityIdentifier":"CVE-2018-20821","vulnerabilityDetails":"The parsing component in LibSass through 3.5.5 allows attackers to cause a denial-of-service (uncontrolled recursion in Sass::Parser::parse_css_variable_value in parser.cpp).","vulnerabilityUrl":"https://vuln.whitesourcesoftware.com/vulnerability/CVE-2018-20821","cvss3Severity":"medium","cvss3Score":"6.5","cvss3Metrics":{"A":"High","AC":"Low","PR":"None","S":"Unchanged","C":"None","UI":"Required","AV":"Network","I":"None"},"extraData":{}}</REMEDIATE> -->
non_process
cve medium detected in multiple libraries autoclosed cve medium severity vulnerability vulnerable libraries node sass tgz opennmsopennms source opennmsopennms source node sass tgz wrapper around libsass library home page a href path to dependency file bootstrap package json path to vulnerable library bootstrap node modules node sass package json dependency hierarchy x node sass tgz vulnerable library found in base branch dev vulnerability details the parsing component in libsass through allows attackers to cause a denial of service uncontrolled recursion in sass parser parse css variable value in parser cpp publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction required scope unchanged impact metrics confidentiality impact none integrity impact none availability impact high for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution libsass check this box to open an automated fix pr isopenpronvulnerability false ispackagebased true isdefaultbranch true packages vulnerabilityidentifier cve vulnerabilitydetails the parsing component in libsass through allows attackers to cause a denial of service uncontrolled recursion in sass parser parse css variable value in parser cpp vulnerabilityurl
0
18,131
24,168,453,076
IssuesEvent
2022-09-22 16:57:26
jgraley/inferno-cpp2v
https://api.github.com/repos/jgraley/inferno-cpp2v
opened
ValueSelector to jump ahead if suggestion set is empty
Constraint Processing
It's confusing to see ``` T002 )}>CV Intersection of suggstions is {} - make queue from this T002 )}>CV OK ``` It would be better to skip to whatever happens _after_ ``` T002 )}>CN No (more) values found ``` And maybe never create a ValueSelector instance at all (which would probably mean ValueSelector would get a static factory method in preference to constructing a NULL object or throwing from the constructor)
1.0
ValueSelector to jump ahead if suggestion set is empty - It's confusing to see ``` T002 )}>CV Intersection of suggstions is {} - make queue from this T002 )}>CV OK ``` It would be better to skip to whatever happens _after_ ``` T002 )}>CN No (more) values found ``` And maybe never create a ValueSelector instance at all (which would probably mean ValueSelector would get a static factory method in preference to constructing a NULL object or throwing from the constructor)
process
valueselector to jump ahead if suggestion set is empty it s confusing to see cv intersection of suggstions is make queue from this cv ok it would be better to skip to whatever happens after cn no more values found and maybe never create a valueselector instance at all which would probably mean valueselector would get a static factory method in preference to constructing a null object or throwing from the constructor
1
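The closing idea in the issue above — a static factory method in preference to constructing a null object or throwing from the constructor — can be sketched like this (names are hypothetical; the real ValueSelector lives in C++):

```python
class ValueSelector:
    """Iterates candidate values; only exists when there is something to iterate."""

    def __init__(self, suggestions):
        self._suggestions = list(suggestions)

    @classmethod
    def try_create(cls, suggestions):
        """Static factory: return None instead of constructing an empty selector."""
        suggestions = list(suggestions)
        if not suggestions:
            # Caller skips straight to the "No (more) values found" path,
            # never printing the confusing "make queue from this" trace.
            return None
        return cls(suggestions)
```

Callers then test the factory result once, instead of constructing an object whose first act is to discover it has nothing to do.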
773,378
27,156,106,109
IssuesEvent
2023-02-17 07:58:51
hack4impact-calpoly/habitat-for-humanity
https://api.github.com/repos/hack4impact-calpoly/habitat-for-humanity
opened
Storing Images in S3
backend donor view priority
Store images in S3 and create and test backend methods to retrieve them. Images should only be stored once the "Submit" button is clicked on the Make a Donation - Review and Submit Page.
1.0
Storing Images in S3 - Store images in S3 and create and test backend methods to retrieve them. Images should only be stored once the "Submit" button is clicked on the Make a Donation - Review and Submit Page.
non_process
storing images in store images in and create and test backend methods to retrieve them images should only be stored once the submit button is clicked on the make a donation review and submit page
0
21,986
30,482,727,698
IssuesEvent
2023-07-17 21:52:40
h4sh5/pypi-auto-scanner
https://api.github.com/repos/h4sh5/pypi-auto-scanner
opened
roblox-pyc 1.16.34 has 3 GuardDog issues
guarddog silent-process-execution
https://pypi.org/project/roblox-pyc https://inspector.pypi.io/project/roblox-pyc ```{ "dependency": "roblox-pyc", "version": "1.16.34", "result": { "issues": 3, "errors": {}, "results": { "silent-process-execution": [ { "location": "roblox-pyc-1.16.34/src/robloxpy.py:86", "code": " subprocess.call([\"llvm-config\", \"--version\"], stdout=subprocess.DEVNULL, stderr=subprocess.DEVNULL, stdin=subprocess.DEVNULL)", "message": "This package is silently executing an external binary, redirecting stdout, stderr and stdin to /dev/null" }, { "location": "roblox-pyc-1.16.34/src/robloxpy.py:110", "code": " subprocess.call([\"luarocks\", \"--version\"], stdout=subprocess.DEVNULL, stderr=subprocess.DEVNULL, stdin=subprocess.DEVNULL)", "message": "This package is silently executing an external binary, redirecting stdout, stderr and stdin to /dev/null" }, { "location": "roblox-pyc-1.16.34/src/robloxpy.py:117", "code": " subprocess.call([\"moonc\", \"--version\"], stdout=subprocess.DEVNULL, stderr=subprocess.DEVNULL, stdin=subprocess.DEVNULL)", "message": "This package is silently executing an external binary, redirecting stdout, stderr and stdin to /dev/null" } ] }, "path": "/tmp/tmp3dxw55sr/roblox-pyc" } }```
1.0
roblox-pyc 1.16.34 has 3 GuardDog issues - https://pypi.org/project/roblox-pyc https://inspector.pypi.io/project/roblox-pyc ```{ "dependency": "roblox-pyc", "version": "1.16.34", "result": { "issues": 3, "errors": {}, "results": { "silent-process-execution": [ { "location": "roblox-pyc-1.16.34/src/robloxpy.py:86", "code": " subprocess.call([\"llvm-config\", \"--version\"], stdout=subprocess.DEVNULL, stderr=subprocess.DEVNULL, stdin=subprocess.DEVNULL)", "message": "This package is silently executing an external binary, redirecting stdout, stderr and stdin to /dev/null" }, { "location": "roblox-pyc-1.16.34/src/robloxpy.py:110", "code": " subprocess.call([\"luarocks\", \"--version\"], stdout=subprocess.DEVNULL, stderr=subprocess.DEVNULL, stdin=subprocess.DEVNULL)", "message": "This package is silently executing an external binary, redirecting stdout, stderr and stdin to /dev/null" }, { "location": "roblox-pyc-1.16.34/src/robloxpy.py:117", "code": " subprocess.call([\"moonc\", \"--version\"], stdout=subprocess.DEVNULL, stderr=subprocess.DEVNULL, stdin=subprocess.DEVNULL)", "message": "This package is silently executing an external binary, redirecting stdout, stderr and stdin to /dev/null" } ] }, "path": "/tmp/tmp3dxw55sr/roblox-pyc" } }```
process
roblox pyc has guarddog issues dependency roblox pyc version result issues errors results silent process execution location roblox pyc src robloxpy py code subprocess call stdout subprocess devnull stderr subprocess devnull stdin subprocess devnull message this package is silently executing an external binary redirecting stdout stderr and stdin to dev null location roblox pyc src robloxpy py code subprocess call stdout subprocess devnull stderr subprocess devnull stdin subprocess devnull message this package is silently executing an external binary redirecting stdout stderr and stdin to dev null location roblox pyc src robloxpy py code subprocess call stdout subprocess devnull stderr subprocess devnull stdin subprocess devnull message this package is silently executing an external binary redirecting stdout stderr and stdin to dev null path tmp roblox pyc
1
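The pattern GuardDog flags in the report above — calling an external binary with stdout, stderr, and stdin all redirected to the null device — looks like this in isolation. The flagged package uses it to probe `--version` flags of optional tools, which is the benign reading of the finding:

```python
import subprocess
import sys

def runs_quietly(cmd):
    """Return True if cmd exits 0, discarding all of its I/O (the flagged pattern)."""
    try:
        rc = subprocess.call(
            cmd,
            stdout=subprocess.DEVNULL,  # silence standard output
            stderr=subprocess.DEVNULL,  # silence errors too
            stdin=subprocess.DEVNULL,   # and detach standard input
        )
    except OSError:
        return False  # the binary is not present at all
    return rc == 0
```

The same three `DEVNULL` keyword arguments are exactly what the scanner keys on, which is why version probes and genuinely stealthy execution look identical to it.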
7,256
10,419,428,739
IssuesEvent
2019-09-15 16:27:21
nion-software/nionswift
https://api.github.com/repos/nion-software/nionswift
opened
Review cropped_display_xdata scaling
f - line-plot f - processing
In email from BP/ND on 2019-04-08, they divide a line plot by itself and end up with non 1.0 value.
1.0
Review cropped_display_xdata scaling - In email from BP/ND on 2019-04-08, they divide a line plot by itself and end up with non 1.0 value.
process
review cropped display xdata scaling in email from bp nd on they divide a line plot by itself and end up with non value
1
497,549
14,379,992,031
IssuesEvent
2020-12-02 01:42:37
jGaboardi/tigernet
https://api.github.com/repos/jGaboardi/tigernet
closed
upload+update+streamline code base
enhancement priority—high
- from [`jGaboardi_dissertation`](https://github.com/jGaboardi/jGaboardi_dissertation) structure ([`at1866_Master`](https://github.com/jGaboardi/jGaboardi_dissertation/tree/master/code/at1866_Master)): - [x] `tigernet/__init__.py` - [x] `tigernet/spaghetti.py` - [x] `tigernet/sauce.py` - [x] `tigernet/utils.py` - [x] `tigernet/dissertation_workflows.py` - to `jGaboardi/tigernet` structure: - [x] `tigernet/__init__.py` - [x] ~~`tigernet/spaghetti.py`~~ --> `tigernet/tigernet.py` - [ ] ~~`tigernet/sauce.py`~~ * `tigernet/dataprep.py`, `tigernet/netbuild.py`, `tigernet/point_funcs.py` - [x] ~~`tigernet/utils.py`~~ * `tigernet/utils.py` (utils+sauce+dissertation_workflows)? * integrate with others? - [x] ~~`tigernet/dissertation_workflows.py`~~
1.0
upload+update+streamline code base - - from [`jGaboardi_dissertation`](https://github.com/jGaboardi/jGaboardi_dissertation) structure ([`at1866_Master`](https://github.com/jGaboardi/jGaboardi_dissertation/tree/master/code/at1866_Master)): - [x] `tigernet/__init__.py` - [x] `tigernet/spaghetti.py` - [x] `tigernet/sauce.py` - [x] `tigernet/utils.py` - [x] `tigernet/dissertation_workflows.py` - to `jGaboardi/tigernet` structure: - [x] `tigernet/__init__.py` - [x] ~~`tigernet/spaghetti.py`~~ --> `tigernet/tigernet.py` - [ ] ~~`tigernet/sauce.py`~~ * `tigernet/dataprep.py`, `tigernet/netbuild.py`, `tigernet/point_funcs.py` - [x] ~~`tigernet/utils.py`~~ * `tigernet/utils.py` (utils+sauce+dissertation_workflows)? * integrate with others? - [x] ~~`tigernet/dissertation_workflows.py`~~
non_process
upload update streamline code base from structure tigernet init py tigernet spaghetti py tigernet sauce py tigernet utils py tigernet dissertation workflows py to jgaboardi tigernet structure tigernet init py tigernet spaghetti py tigernet tigernet py tigernet sauce py tigernet dataprep py tigernet netbuild py tigernet point funcs py tigernet utils py tigernet utils py utils sauce dissertation workflows integrate with others tigernet dissertation workflows py
0
19,367
25,496,228,024
IssuesEvent
2022-11-27 18:04:17
david-palm/DeepSudoku
https://api.github.com/repos/david-palm/DeepSudoku
closed
Identify lines
feature integral feature image processing
To cut the sudoku we first need to identify all lines of the sudoku. This will be achieved by using the Hough transform. The lines will be stored in an array written in the Hesse notation. The following step must be completed to achieve this: - [x] #5 - [x] #6
1.0
Identify lines - To cut the sudoku we first need to identify all lines of the sudoku. This will be achieved by using the Hough transform. The lines will be stored in an array written in the Hesse notation. The following step must be completed to achieve this: - [x] #5 - [x] #6
process
identify lines to cut the sudoku we first need to identify all lines of the sudoku this will be achieved by using the hough transform the lines will be stored in an array written in the hesse notation the following step must be completed to achieve this
1
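Storing Hough lines in Hesse normal form (rho, theta), as the issue above plans, makes point-to-line distance a one-liner — useful later when snapping grid intersections. A minimal sketch (the tuple layout is an assumption about the planned array):

```python
import math

def hesse_distance(rho, theta, x, y):
    """Distance from point (x, y) to the line x*cos(theta) + y*sin(theta) = rho."""
    return abs(x * math.cos(theta) + y * math.sin(theta) - rho)
```

For example, the vertical line x = 5 is (rho=5, theta=0), and the horizontal line y = 2 is (rho=2, theta=pi/2).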
15,340
19,488,258,023
IssuesEvent
2021-12-26 20:40:43
varabyte/kotter
https://api.github.com/repos/varabyte/kotter
closed
MAJOR CHANGE: Rename "konsole" to something else
enhancement process
The name Konsole sits too closely to KDE Konsole, which is confusing. So, this bug is a few parts... 1. Remove all references of the name "konsole" from class, method, and variable names, e.g. `konsoleApp` -> `session` and `konsoleVarOf` to `liveVarOf` * Once this is done, the code itself will no longer care about large refactorings of name changes in the future. 2. Come up with a new name for the library. I'm leaning on "Kotter" for "KOTlin TERminal" but I'm open to other ideas 3. Update the package / artifacts / README / screencasts / this repo name / our Discord channel etc. to match This is a pretty big change for this library. This may inconvenience users who are using it, so I'll give this bug a day or two just in case anyone wants to respond. But so far, I haven't seen much of an indication that people are actually using it yet, so it seems like a good time to make such a drastic change.
1.0
MAJOR CHANGE: Rename "konsole" to something else - The name Konsole sits too closely to KDE Konsole, which is confusing. So, this bug is a few parts... 1. Remove all references of the name "konsole" from class, method, and variable names, e.g. `konsoleApp` -> `session` and `konsoleVarOf` to `liveVarOf` * Once this is done, the code itself will no longer care about large refactorings of name changes in the future. 2. Come up with a new name for the library. I'm leaning on "Kotter" for "KOTlin TERminal" but I'm open to other ideas 3. Update the package / artifacts / README / screencasts / this repo name / our Discord channel etc. to match This is a pretty big change for this library. This may inconvenience users who are using it, so I'll give this bug a day or two just in case anyone wants to respond. But so far, I haven't seen much of an indication that people are actually using it yet, so it seems like a good time to make such a drastic change.
process
major change rename konsole to something else the name konsole sits too closely to kde konsole which is confusing so this bug is a few parts remove all references of the name konsole from class method and variable names e g konsoleapp session and konsolevarof to livevarof once this is done the code itself will no longer care about large refactorings of name changes in the future come up with a new name for the library i m leaning on kotter for kotlin terminal but i m open to other ideas update the package artifacts readme screencasts this repo name our discord channel etc to match this is a pretty big change for this library this may inconvenience users who are using it so i ll give this bug a day or two just in case anyone wants to respond but so far i haven t seen much of an indication that people are actually using it yet so it seems like a good time to make such a drastic change
1
9,892
12,890,404,661
IssuesEvent
2020-07-13 15:57:00
endlessm/azafea
https://api.github.com/repos/endlessm/azafea
closed
Optimize some queries
endless event processors
Here is a dump of queries I've been asked to run on activation/ping data. Real-world queries provide good insight into what should be optimized and how (adding indexes, splitting columns, adding pre-processed tables, …) Let's make good use of PostgreSQL's `EXPLAIN` on those! --- * Countries with the most new OEM activations over the past 365 days: ``` SELECT count(id) AS count, country FROM activation_v1 WHERE image LIKE 'eosoem-%' AND created_at >= NOW() - INTERVAL '365 DAYS' GROUP BY country ORDER BY count DESC LIMIT 10; ``` * Countries with the most active OEM installations over the past 365 days, limited to machines which have sent 8 pings or more: ``` SELECT count(ping_v1.id) AS count, ping_v1.country FROM ping_v1 JOIN ping_configuration_v1 ON ping_v1.config_id = ping_configuration_v1.id WHERE ping_configuration_v1.image LIKE 'eosoem-%' AND ping_v1.count >= 8 AND ping_v1.created_at >= NOW() - INTERVAL '365 DAYS' GROUP BY ping_v1.country ORDER BY count DESC LIMIT 10; ```
1.0
Optimize some queries - Here is a dump of queries I've been asked to run on activation/ping data. Real-world queries provide good insight into what should be optimized and how (adding indexes, splitting columns, adding pre-processed tables, …) Let's make good use of PostgreSQL's `EXPLAIN` on those! --- * Countries with the most new OEM activations over the past 365 days: ``` SELECT count(id) AS count, country FROM activation_v1 WHERE image LIKE 'eosoem-%' AND created_at >= NOW() - INTERVAL '365 DAYS' GROUP BY country ORDER BY count DESC LIMIT 10; ``` * Countries with the most active OEM installations over the past 365 days, limited to machines which have sent 8 pings or more: ``` SELECT count(ping_v1.id) AS count, ping_v1.country FROM ping_v1 JOIN ping_configuration_v1 ON ping_v1.config_id = ping_configuration_v1.id WHERE ping_configuration_v1.image LIKE 'eosoem-%' AND ping_v1.count >= 8 AND ping_v1.created_at >= NOW() - INTERVAL '365 DAYS' GROUP BY ping_v1.country ORDER BY count DESC LIMIT 10; ```
process
optimize some queries here is a dump of queries i ve been asked to run on activation ping data real world queries provide good insight into what should be optimized and how adding indexes splitting columns adding pre processed tables … let s make good use of postgresql s explain on those countries with the most new oem activations over the past days select count id as count country from activation where image like eosoem and created at now interval days group by country order by count desc limit countries with the most active oem installations over the past days limited to machines which have sent pings or more select count ping id as count ping country from ping join ping configuration on ping config id ping configuration id where ping configuration image like eosoem and ping count and ping created at now interval days group by ping country order by count desc limit
1
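The workflow the Azafea issue describes — run `EXPLAIN` on real-world queries to decide where indexes pay off — can be rehearsed end to end with SQLite standing in for PostgreSQL. The schema here is a simplified stand-in for the ping table, and `EXPLAIN QUERY PLAN` plays the role of PostgreSQL's `EXPLAIN`:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE ping (created_at TEXT, country TEXT, count INTEGER)")
# Index the column the date-range predicates filter on.
conn.execute("CREATE INDEX idx_ping_created ON ping (created_at)")

def query_plan(sql):
    """Return the plan detail strings so we can check whether the index is used."""
    return [row[3] for row in conn.execute("EXPLAIN QUERY PLAN " + sql)]

# Filtering on the indexed column should mention idx_ping_created in the plan.
plan = query_plan("SELECT count(*) FROM ping WHERE created_at >= '2020-01-01'")
```

The same probe on an unindexed predicate (`WHERE country = ...`) shows a full table scan, which is exactly the signal that another index, a split column, or a pre-processed table is worth considering.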
186
2,589,765,112
IssuesEvent
2015-02-18 15:03:43
tinkerpop/tinkerpop3
https://api.github.com/repos/tinkerpop/tinkerpop3
closed
Provide __(object) as the "inject" shortcut for traversals emanating from non-graph objects.
enhancement process
Make the below a reality.... WDYT @dkuppitz ? ```groovy gremlin> m = [a:2,b:1,c:3] ==>a=2 ==>b=1 ==>c=3 gremlin> __(m) ==>[a:2, b:1, c:3] gremlin> __(m).order(local).by(valueDecr) ==>[c:3, a:2, b:1] gremlin> __(m).order(local).by(keyIncr) ==>[a:2, b:1, c:3] gremlin> __(m).order(local).by(shuffle) ==>[c:3, a:2, b:1] ```
1.0
Provide __(object) as the "inject" shortcut for traversals emanating from non-graph objects. - Make the below a reality.... WDYT @dkuppitz ? ```groovy gremlin> m = [a:2,b:1,c:3] ==>a=2 ==>b=1 ==>c=3 gremlin> __(m) ==>[a:2, b:1, c:3] gremlin> __(m).order(local).by(valueDecr) ==>[c:3, a:2, b:1] gremlin> __(m).order(local).by(keyIncr) ==>[a:2, b:1, c:3] gremlin> __(m).order(local).by(shuffle) ==>[c:3, a:2, b:1] ```
process
provide object as the inject shortcut for traversals emanating from non graph objects make the below a reality wdyt dkuppitz groovy gremlin m a b c gremlin m gremlin m order local by valuedecr gremlin m order local by keyincr gremlin m order local by shuffle
1
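A plain-Python analogue of the Gremlin session above — injecting a map and ordering it locally — might look like the following. It is purely illustrative and mirrors only the `order(local).by(...)` behavior, not TinkerPop's traversal machinery:

```python
def order_local(mapping, key=None, reverse=False):
    """Return a new dict with entries reordered, like order(local).by(...)."""
    # Python dicts preserve insertion order (3.7+), so the sorted order sticks.
    return dict(sorted(mapping.items(), key=key, reverse=reverse))

m = {"a": 2, "b": 1, "c": 3}
by_value_decr = order_local(m, key=lambda kv: kv[1], reverse=True)  # valueDecr
by_key_incr = order_local(m, key=lambda kv: kv[0])                  # keyIncr
```

The shuffle variant from the session would be the same call with a random sort key, left out here to keep the outputs deterministic.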
9,724
3,963,992,951
IssuesEvent
2016-05-02 22:32:10
teotidev/guide
https://api.github.com/repos/teotidev/guide
closed
Export - as Sounds
code work
- checkbox list holding machines - TextInput for prefix on file names ([TemplateName] for default) - TextInput for postfix on file names ("" for default) - screen footer action items - Select All, Deselect All, Export Figure out away to let the user edit individual machine names as well. Plus, there is going to have to be a metadata editor that can edit the individual machine metadata to allow for pro kits exported this way
1.0
Export - as Sounds - - checkbox list holding machines - TextInput for prefix on file names ([TemplateName] for default) - TextInput for postfix on file names ("" for default) - screen footer action items - Select All, Deselect All, Export Figure out away to let the user edit individual machine names as well. Plus, there is going to have to be a metadata editor that can edit the individual machine metadata to allow for pro kits exported this way
non_process
export as sounds checkbox list holding machines textinput for prefix on file names for default textinput for postfix on file names for default screen footer action items select all deselect all export figure out away to let the user edit individual machine names as well plus there is going to have to be a metadata editor that can edit the individual machine metadata to allow for pro kits exported this way
0
16,073
20,248,549,871
IssuesEvent
2022-02-14 15:48:37
bazelbuild/bazel
https://api.github.com/repos/bazelbuild/bazel
closed
Documentation issue: Configuring C toolchains Fails to build
P4 type: support / not a bug (process) team-Rules-CPP
Opened issue by mistake, trying to follow another tutorial: https://docs.bazel.build/versions/4.2.0/tutorial/cpp.html Will see if I have any luck with this one. Tried the cross compile arm toolchain tutorial and got this output: jb@jb-VirtualBox-dev:~/bazel-tutorial/examples/cpp/main$ bazel build --config=clang_config //main:hello-world INFO: Analyzed target //main:hello-world (0 packages loaded, 0 targets configured). INFO: Found 1 target... ERROR: /home/jb/bazel-tutorial/examples/cpp/main/BUILD:1:10: Linking main/hello-world failed: (Exit 1): clang failed: error executing command /usr/bin/clang -o bazel-out/k8-fastbuild/bin/main/hello-world bazel-out/k8-fastbuild/bin/main/_objs/hello-world/hello-world.o bazel-out/k8-fastbuild/bin/main/_objs/hello-world/hello-lib.o -Wl,-S Use --sandbox_debug to see verbose messages from the sandbox /usr/bin/ld: bazel-out/k8-fastbuild/bin/main/_objs/hello-world/hello-lib.o: in function `hello::HelloLib::HelloLib(std::__cxx11::basic_string<char, std::char_traits<char>, std::allocator<char> > const&)': hello-lib.cc:(.text+0x21): undefined reference to `operator new(unsigned long)' /usr/bin/ld: hello-lib.cc:(.text+0x3b): undefined reference to `std::__cxx11::basic_string<char, std::char_traits<char>, std::allocator<char> >::basic_string(std::__cxx11::basic_string<char, std::char_traits<char>, std::allocator<char> > const&)' /usr/bin/ld: hello-lib.cc:(.text+0x63): undefined reference to `operator delete(void*)' /usr/bin/ld: bazel-out/k8-fastbuild/bin/main/_objs/hello-world/hello-lib.o: in function `hello::HelloLib::greet(std::__cxx11::basic_string<char, std::char_traits<char>, std::allocator<char> > const&)': hello-lib.cc:(.text+0x8b): undefined reference to `std::cout' /usr/bin/ld: hello-lib.cc:(.text+0x97): undefined reference to `std::basic_ostream<char, std::char_traits<char> >& std::operator<< <char, std::char_traits<char>, std::allocator<char> >(std::basic_ostream<char, std::char_traits<char> >&, std::__cxx11::basic_string<char, 
std::char_traits<char>, std::allocator<char> > const&)' /usr/bin/ld: hello-lib.cc:(.text+0xa9): undefined reference to `std::basic_ostream<char, std::char_traits<char> >& std::operator<< <std::char_traits<char> >(std::basic_ostream<char, std::char_traits<char> >&, char const*)' /usr/bin/ld: hello-lib.cc:(.text+0xb5): undefined reference to `std::basic_ostream<char, std::char_traits<char> >& std::operator<< <char, std::char_traits<char>, std::allocator<char> >(std::basic_ostream<char, std::char_traits<char> >&, std::__cxx11::basic_string<char, std::char_traits<char>, std::allocator<char> > const&)' /usr/bin/ld: hello-lib.cc:(.text+0xbe): undefined reference to `std::basic_ostream<char, std::char_traits<char> >& std::endl<char, std::char_traits<char> >(std::basic_ostream<char, std::char_traits<char> >&)' /usr/bin/ld: hello-lib.cc:(.text+0xc7): undefined reference to `std::ostream::operator<<(std::ostream& (*)(std::ostream&))' /usr/bin/ld: bazel-out/k8-fastbuild/bin/main/_objs/hello-world/hello-lib.o: in function `__cxx_global_var_init': hello-lib.cc:(.text.startup+0xf): undefined reference to `std::ios_base::Init::Init()' /usr/bin/ld: hello-lib.cc:(.text.startup+0x15): undefined reference to `std::ios_base::Init::~Init()' /usr/bin/ld: bazel-out/k8-fastbuild/bin/main/_objs/hello-world/hello-lib.o:(.eh_frame+0x2ab): undefined reference to `__gxx_personality_v0' /usr/bin/ld: bazel-out/k8-fastbuild/bin/main/_objs/hello-world/hello-world.o: in function `main': hello-world.cc:(.text+0x25): undefined reference to `std::allocator<char>::allocator()' /usr/bin/ld: hello-world.cc:(.text+0x37): undefined reference to `std::__cxx11::basic_string<char, std::char_traits<char>, std::allocator<char> >::basic_string(char const*, std::allocator<char> const&)' /usr/bin/ld: hello-world.cc:(.text+0x57): undefined reference to `std::__cxx11::basic_string<char, std::char_traits<char>, std::allocator<char> >::~basic_string()' /usr/bin/ld: hello-world.cc:(.text+0x60): undefined reference to 
`std::allocator<char>::~allocator()' /usr/bin/ld: hello-world.cc:(.text+0x73): undefined reference to `std::allocator<char>::allocator()' /usr/bin/ld: hello-world.cc:(.text+0x88): undefined reference to `std::__cxx11::basic_string<char, std::char_traits<char>, std::allocator<char> >::basic_string(char const*, std::allocator<char> const&)' /usr/bin/ld: hello-world.cc:(.text+0x96): undefined reference to `std::allocator<char>::~allocator()' /usr/bin/ld: hello-world.cc:(.text+0xb1): undefined reference to `std::__cxx11::basic_string<char, std::char_traits<char>, std::allocator<char> >::operator=(char const*)' /usr/bin/ld: hello-world.cc:(.text+0xd7): undefined reference to `std::__cxx11::basic_string<char, std::char_traits<char>, std::allocator<char> >::~basic_string()' /usr/bin/ld: hello-world.cc:(.text+0xe0): undefined reference to `std::allocator<char>::~allocator()' /usr/bin/ld: hello-world.cc:(.text+0xf5): undefined reference to `std::allocator<char>::~allocator()' /usr/bin/ld: hello-world.cc:(.text+0x10a): undefined reference to `std::__cxx11::basic_string<char, std::char_traits<char>, std::allocator<char> >::~basic_string()' /usr/bin/ld: hello-world.cc:(.text+0x131): undefined reference to `std::__cxx11::basic_string<char, std::char_traits<char>, std::allocator<char> >::~basic_string()' /usr/bin/ld: bazel-out/k8-fastbuild/bin/main/_objs/hello-world/hello-world.o: in function `__clang_call_terminate': hello-world.cc:(.text.__clang_call_terminate[__clang_call_terminate]+0x2): undefined reference to `__cxa_begin_catch' /usr/bin/ld: hello-world.cc:(.text.__clang_call_terminate[__clang_call_terminate]+0xb): undefined reference to `std::terminate()' /usr/bin/ld: bazel-out/k8-fastbuild/bin/main/_objs/hello-world/hello-world.o: in function `std::default_delete<std::__cxx11::basic_string<char, std::char_traits<char>, std::allocator<char> > const>::operator()(std::__cxx11::basic_string<char, std::char_traits<char>, std::allocator<char> > const*) const': 
hello-world.cc:(.text._ZNKSt14default_deleteIKNSt7__cxx1112basic_stringIcSt11char_traitsIcESaIcEEEEclEPS6_[_ZNKSt14default_deleteIKNSt7__cxx1112basic_stringIcSt11char_traitsIcESaIcEEEEclEPS6_]+0x27): undefined reference to `std::__cxx11::basic_string<char, std::char_traits<char>, std::allocator<char> >::~basic_string()' /usr/bin/ld: hello-world.cc:(.text._ZNKSt14default_deleteIKNSt7__cxx1112basic_stringIcSt11char_traitsIcESaIcEEEEclEPS6_[_ZNKSt14default_deleteIKNSt7__cxx1112basic_stringIcSt11char_traitsIcESaIcEEEEclEPS6_]+0x33): undefined reference to `operator delete(void*)' /usr/bin/ld: bazel-out/k8-fastbuild/bin/main/_objs/hello-world/hello-world.o:(.eh_frame+0x1cb): undefined reference to `__gxx_personality_v0' clang: error: linker command failed with exit code 1 (use -v to see invocation) Target //main:hello-world failed to build Use --verbose_failures to see the command lines of failed build steps. INFO: Elapsed time: 0.709s, Critical Path: 0.34s INFO: 3 processes: 2 internal, 1 linux-sandbox. FAILED: Build did NOT complete successfully
1.0
Documentation issue: Configuring C toolchains Fails to build - Opened issue by mistake, trying to follow another tutorial: https://docs.bazel.build/versions/4.2.0/tutorial/cpp.html Will see if I have any luck with this one. Tried the cross compile arm toolchain tutorial and got this output: jb@jb-VirtualBox-dev:~/bazel-tutorial/examples/cpp/main$ bazel build --config=clang_config //main:hello-world INFO: Analyzed target //main:hello-world (0 packages loaded, 0 targets configured). INFO: Found 1 target... ERROR: /home/jb/bazel-tutorial/examples/cpp/main/BUILD:1:10: Linking main/hello-world failed: (Exit 1): clang failed: error executing command /usr/bin/clang -o bazel-out/k8-fastbuild/bin/main/hello-world bazel-out/k8-fastbuild/bin/main/_objs/hello-world/hello-world.o bazel-out/k8-fastbuild/bin/main/_objs/hello-world/hello-lib.o -Wl,-S Use --sandbox_debug to see verbose messages from the sandbox /usr/bin/ld: bazel-out/k8-fastbuild/bin/main/_objs/hello-world/hello-lib.o: in function `hello::HelloLib::HelloLib(std::__cxx11::basic_string<char, std::char_traits<char>, std::allocator<char> > const&)': hello-lib.cc:(.text+0x21): undefined reference to `operator new(unsigned long)' /usr/bin/ld: hello-lib.cc:(.text+0x3b): undefined reference to `std::__cxx11::basic_string<char, std::char_traits<char>, std::allocator<char> >::basic_string(std::__cxx11::basic_string<char, std::char_traits<char>, std::allocator<char> > const&)' /usr/bin/ld: hello-lib.cc:(.text+0x63): undefined reference to `operator delete(void*)' /usr/bin/ld: bazel-out/k8-fastbuild/bin/main/_objs/hello-world/hello-lib.o: in function `hello::HelloLib::greet(std::__cxx11::basic_string<char, std::char_traits<char>, std::allocator<char> > const&)': hello-lib.cc:(.text+0x8b): undefined reference to `std::cout' /usr/bin/ld: hello-lib.cc:(.text+0x97): undefined reference to `std::basic_ostream<char, std::char_traits<char> >& std::operator<< <char, std::char_traits<char>, std::allocator<char> 
>(std::basic_ostream<char, std::char_traits<char> >&, std::__cxx11::basic_string<char, std::char_traits<char>, std::allocator<char> > const&)' /usr/bin/ld: hello-lib.cc:(.text+0xa9): undefined reference to `std::basic_ostream<char, std::char_traits<char> >& std::operator<< <std::char_traits<char> >(std::basic_ostream<char, std::char_traits<char> >&, char const*)' /usr/bin/ld: hello-lib.cc:(.text+0xb5): undefined reference to `std::basic_ostream<char, std::char_traits<char> >& std::operator<< <char, std::char_traits<char>, std::allocator<char> >(std::basic_ostream<char, std::char_traits<char> >&, std::__cxx11::basic_string<char, std::char_traits<char>, std::allocator<char> > const&)' /usr/bin/ld: hello-lib.cc:(.text+0xbe): undefined reference to `std::basic_ostream<char, std::char_traits<char> >& std::endl<char, std::char_traits<char> >(std::basic_ostream<char, std::char_traits<char> >&)' /usr/bin/ld: hello-lib.cc:(.text+0xc7): undefined reference to `std::ostream::operator<<(std::ostream& (*)(std::ostream&))' /usr/bin/ld: bazel-out/k8-fastbuild/bin/main/_objs/hello-world/hello-lib.o: in function `__cxx_global_var_init': hello-lib.cc:(.text.startup+0xf): undefined reference to `std::ios_base::Init::Init()' /usr/bin/ld: hello-lib.cc:(.text.startup+0x15): undefined reference to `std::ios_base::Init::~Init()' /usr/bin/ld: bazel-out/k8-fastbuild/bin/main/_objs/hello-world/hello-lib.o:(.eh_frame+0x2ab): undefined reference to `__gxx_personality_v0' /usr/bin/ld: bazel-out/k8-fastbuild/bin/main/_objs/hello-world/hello-world.o: in function `main': hello-world.cc:(.text+0x25): undefined reference to `std::allocator<char>::allocator()' /usr/bin/ld: hello-world.cc:(.text+0x37): undefined reference to `std::__cxx11::basic_string<char, std::char_traits<char>, std::allocator<char> >::basic_string(char const*, std::allocator<char> const&)' /usr/bin/ld: hello-world.cc:(.text+0x57): undefined reference to `std::__cxx11::basic_string<char, std::char_traits<char>, std::allocator<char> 
>::~basic_string()' /usr/bin/ld: hello-world.cc:(.text+0x60): undefined reference to `std::allocator<char>::~allocator()' /usr/bin/ld: hello-world.cc:(.text+0x73): undefined reference to `std::allocator<char>::allocator()' /usr/bin/ld: hello-world.cc:(.text+0x88): undefined reference to `std::__cxx11::basic_string<char, std::char_traits<char>, std::allocator<char> >::basic_string(char const*, std::allocator<char> const&)' /usr/bin/ld: hello-world.cc:(.text+0x96): undefined reference to `std::allocator<char>::~allocator()' /usr/bin/ld: hello-world.cc:(.text+0xb1): undefined reference to `std::__cxx11::basic_string<char, std::char_traits<char>, std::allocator<char> >::operator=(char const*)' /usr/bin/ld: hello-world.cc:(.text+0xd7): undefined reference to `std::__cxx11::basic_string<char, std::char_traits<char>, std::allocator<char> >::~basic_string()' /usr/bin/ld: hello-world.cc:(.text+0xe0): undefined reference to `std::allocator<char>::~allocator()' /usr/bin/ld: hello-world.cc:(.text+0xf5): undefined reference to `std::allocator<char>::~allocator()' /usr/bin/ld: hello-world.cc:(.text+0x10a): undefined reference to `std::__cxx11::basic_string<char, std::char_traits<char>, std::allocator<char> >::~basic_string()' /usr/bin/ld: hello-world.cc:(.text+0x131): undefined reference to `std::__cxx11::basic_string<char, std::char_traits<char>, std::allocator<char> >::~basic_string()' /usr/bin/ld: bazel-out/k8-fastbuild/bin/main/_objs/hello-world/hello-world.o: in function `__clang_call_terminate': hello-world.cc:(.text.__clang_call_terminate[__clang_call_terminate]+0x2): undefined reference to `__cxa_begin_catch' /usr/bin/ld: hello-world.cc:(.text.__clang_call_terminate[__clang_call_terminate]+0xb): undefined reference to `std::terminate()' /usr/bin/ld: bazel-out/k8-fastbuild/bin/main/_objs/hello-world/hello-world.o: in function `std::default_delete<std::__cxx11::basic_string<char, std::char_traits<char>, std::allocator<char> > 
const>::operator()(std::__cxx11::basic_string<char, std::char_traits<char>, std::allocator<char> > const*) const': hello-world.cc:(.text._ZNKSt14default_deleteIKNSt7__cxx1112basic_stringIcSt11char_traitsIcESaIcEEEEclEPS6_[_ZNKSt14default_deleteIKNSt7__cxx1112basic_stringIcSt11char_traitsIcESaIcEEEEclEPS6_]+0x27): undefined reference to `std::__cxx11::basic_string<char, std::char_traits<char>, std::allocator<char> >::~basic_string()' /usr/bin/ld: hello-world.cc:(.text._ZNKSt14default_deleteIKNSt7__cxx1112basic_stringIcSt11char_traitsIcESaIcEEEEclEPS6_[_ZNKSt14default_deleteIKNSt7__cxx1112basic_stringIcSt11char_traitsIcESaIcEEEEclEPS6_]+0x33): undefined reference to `operator delete(void*)' /usr/bin/ld: bazel-out/k8-fastbuild/bin/main/_objs/hello-world/hello-world.o:(.eh_frame+0x1cb): undefined reference to `__gxx_personality_v0' clang: error: linker command failed with exit code 1 (use -v to see invocation) Target //main:hello-world failed to build Use --verbose_failures to see the command lines of failed build steps. INFO: Elapsed time: 0.709s, Critical Path: 0.34s INFO: 3 processes: 2 internal, 1 linux-sandbox. FAILED: Build did NOT complete successfully
process
documentation issue configuring c toolchains fails to build opened issue by mistake trying to follow another tutorial will see if i have any luck with this one tried the cross compile arm toolchain tutorial and got this output jb jb virtualbox dev bazel tutorial examples cpp main bazel build config clang config main hello world info analyzed target main hello world packages loaded targets configured info found target error home jb bazel tutorial examples cpp main build linking main hello world failed exit clang failed error executing command usr bin clang o bazel out fastbuild bin main hello world bazel out fastbuild bin main objs hello world hello world o bazel out fastbuild bin main objs hello world hello lib o wl s use sandbox debug to see verbose messages from the sandbox usr bin ld bazel out fastbuild bin main objs hello world hello lib o in function hello hellolib hellolib std basic string std allocator const hello lib cc text undefined reference to operator new unsigned long usr bin ld hello lib cc text undefined reference to std basic string std allocator basic string std basic string std allocator const usr bin ld hello lib cc text undefined reference to operator delete void usr bin ld bazel out fastbuild bin main objs hello world hello lib o in function hello hellolib greet std basic string std allocator const hello lib cc text undefined reference to std cout usr bin ld hello lib cc text undefined reference to std basic ostream std operator std allocator std basic ostream std basic string std allocator const usr bin ld hello lib cc text undefined reference to std basic ostream std operator std basic ostream char const usr bin ld hello lib cc text undefined reference to std basic ostream std operator std allocator std basic ostream std basic string std allocator const usr bin ld hello lib cc text undefined reference to std basic ostream std endl std basic ostream usr bin ld hello lib cc text undefined reference to std ostream operator std ostream std 
ostream usr bin ld bazel out fastbuild bin main objs hello world hello lib o in function cxx global var init hello lib cc text startup undefined reference to std ios base init init usr bin ld hello lib cc text startup undefined reference to std ios base init init usr bin ld bazel out fastbuild bin main objs hello world hello lib o eh frame undefined reference to gxx personality usr bin ld bazel out fastbuild bin main objs hello world hello world o in function main hello world cc text undefined reference to std allocator allocator usr bin ld hello world cc text undefined reference to std basic string std allocator basic string char const std allocator const usr bin ld hello world cc text undefined reference to std basic string std allocator basic string usr bin ld hello world cc text undefined reference to std allocator allocator usr bin ld hello world cc text undefined reference to std allocator allocator usr bin ld hello world cc text undefined reference to std basic string std allocator basic string char const std allocator const usr bin ld hello world cc text undefined reference to std allocator allocator usr bin ld hello world cc text undefined reference to std basic string std allocator operator char const usr bin ld hello world cc text undefined reference to std basic string std allocator basic string usr bin ld hello world cc text undefined reference to std allocator allocator usr bin ld hello world cc text undefined reference to std allocator allocator usr bin ld hello world cc text undefined reference to std basic string std allocator basic string usr bin ld hello world cc text undefined reference to std basic string std allocator basic string usr bin ld bazel out fastbuild bin main objs hello world hello world o in function clang call terminate hello world cc text clang call terminate undefined reference to cxa begin catch usr bin ld hello world cc text clang call terminate undefined reference to std terminate usr bin ld bazel out fastbuild bin main objs 
hello world hello world o in function std default delete std allocator const operator std basic string std allocator const const hello world cc text undefined reference to std basic string std allocator basic string usr bin ld hello world cc text undefined reference to operator delete void usr bin ld bazel out fastbuild bin main objs hello world hello world o eh frame undefined reference to gxx personality clang error linker command failed with exit code use v to see invocation target main hello world failed to build use verbose failures to see the command lines of failed build steps info elapsed time critical path info processes internal linux sandbox failed build did not complete successfully
1
18,236
24,302,436,742
IssuesEvent
2022-09-29 14:47:05
celo-org/celo-monorepo
https://api.github.com/repos/celo-org/celo-monorepo
closed
Catch src import issues on pre-commit hook
Priority: P3 release-process Applications stale
### Expected Behavior Sometimes the IDE can trick you and auto-import a `@celo` dependency from the `src` folder instead of the `lib` folder. For example, in ContractKit we had this in a file: ```typescript import { isValidAddress } from '@celo/utils/src/address' ``` Nothing in CI complains about this, but it continues to reference `src` after it's published to NPM which causes a runtime error as the `src` folder isn't published to NPM. See #7073 This can cause sizeable disruption when it comes to the NPM publish cycle. But it's easy to catch earlier, potentially in a pre-commit hook by looking at `.ts` files for `/@celo.*src/` this regexp, or something of that nature. ### Current Behavior Everything works until you try to use the package from NPM :(
1.0
Catch src import issues on pre-commit hook - ### Expected Behavior Sometimes the IDE can trick you and auto-import a `@celo` dependency from the `src` folder instead of the `lib` folder. For example, in ContractKit we had this in a file: ```typescript import { isValidAddress } from '@celo/utils/src/address' ``` Nothing in CI complains about this, but it continues to reference `src` after it's published to NPM which causes a runtime error as the `src` folder isn't published to NPM. See #7073 This can cause sizeable disruption when it comes to the NPM publish cycle. But it's easy to catch earlier, potentially in a pre-commit hook by looking at `.ts` files for `/@celo.*src/` this regexp, or something of that nature. ### Current Behavior Everything works until you try to use the package from NPM :(
process
catch src import issues on pre commit hook expected behavior sometimes the ide can trick you and auto import a celo dependency from the src folder instead of the lib folder for example in contractkit we had this in a file typescript import isvalidaddress from celo utils src address nothing in ci complains about this but it continues to reference src after it s published to npm which causes a runtime error as the src folder isn t published to npm see this can cause sizeable disruption when it comes to the npm publish cycle but it s easy to catch earlier potentially in a pre commit hook by looking at ts files for celo src this regexp or something of that nature current behavior everything works until you try to use the package from npm
1
17,530
23,341,423,477
IssuesEvent
2022-08-09 14:19:04
hashgraph/hedera-json-rpc-relay
https://api.github.com/repos/hashgraph/hedera-json-rpc-relay
closed
Bump local node versions to alpha and release 0.29.0 candidates
enhancement P2 process
### Problem Local node images are using old versions of services and mirror node ### Solution Update - services -> 0.29.0-alpha.1 - mirror node -> 0.62.0-rc1 ### Alternatives _No response_
1.0
Bump local node versions to alpha and release 0.29.0 candidates - ### Problem Local node images are using old versions of services and mirror node ### Solution Update - services -> 0.29.0-alpha.1 - mirror node -> 0.62.0-rc1 ### Alternatives _No response_
process
bump local node versions to alpha and release candidates problem local node images are using old versions of services and mirror node solution update services alpha mirror node alternatives no response
1
71,935
13,764,352,036
IssuesEvent
2020-10-07 11:57:08
GoogleCloudPlatform/spring-cloud-gcp
https://api.github.com/repos/GoogleCloudPlatform/spring-cloud-gcp
opened
Update GHActions to remove vuln warning
P3 code-quality dependencies
Our GHActions are spitting out warnings about `add-path` and `set-env`: > The `add-path` command is deprecated and will be disabled soon. Please upgrade to using Environment Files. For more information see: https://github.blog/changelog/2020-10-01-github-actions-deprecating-set-env-and-add-path-commands/ The fix may be as easy as just bumping the versions of the steps we're using, but needs some investigation on the Environment Files feature.
1.0
Update GHActions to remove vuln warning - Our GHActions are spitting out warnings about `add-path` and `set-env`: > The `add-path` command is deprecated and will be disabled soon. Please upgrade to using Environment Files. For more information see: https://github.blog/changelog/2020-10-01-github-actions-deprecating-set-env-and-add-path-commands/ The fix may be as easy as just bumping the versions of the steps we're using, but needs some investigation on the Environment Files feature.
non_process
update ghactions to remove vuln warning our ghactions are spitting out warnings about add path and set env the add path command is deprecated and will be disabled soon please upgrade to using environment files for more information see the fix may be as easy as just bumping the versions of the steps we re using but needs some investigation on the environment files feature
0
6,043
8,854,307,488
IssuesEvent
2019-01-09 00:44:32
NottingHack/hms2
https://api.github.com/repos/NottingHack/hms2
opened
member.payment audit
Process
If it's been 2 weeks and we have not seen a payment remind the user After 2 months there data should be deleted After 6 months there data should be purged
1.0
member.payment audit - If it's been 2 weeks and we have not seen a payment remind the user After 2 months there data should be deleted After 6 months there data should be purged
process
member payment audit if it s been weeks and we have not seen a payment remind the user after months there data should be deleted after months there data should be purged
1
380,655
11,269,269,124
IssuesEvent
2020-01-14 08:32:00
grpc/grpc
https://api.github.com/repos/grpc/grpc
closed
Static build with OpenSSL 1.1.1+ on Windows
kind/bug priority/P2
This is duplicate of https://github.com/grpc/grpc/issues/12231 which has been closed as stale and in which I cannot comment. <!-- This form is for bug reports and feature requests ONLY! For general questions and troubleshooting, please ask/look for answers here: - grpc.io mailing list: https://groups.google.com/forum/#!forum/grpc-io - StackOverflow, with "grpc" tag: https://stackoverflow.com/questions/tagged/grpc Issues specific to *grpc-java*, *grpc-go*, *grpc-node*, *grpc-dart*, *grpc-web* should be created in the repository they belong to (e.g. https://github.com/grpc/grpc-LANGUAGE/issues/new) --> ### What version of gRPC and what language are you using? C++ 1.24/1.25 ### What operating system (Linux, Windows,...) and version? Windows 10 ### What runtime / compiler are you using (e.g. python version or version of gcc) Visual Studio 2015 ### What did you do? Compiled static OpenSSL and attempted to compile static gRPC. ### What did you expect to see? Successful build ### What did you see instead? Undefined references related to symbols in Microsoft's crypt32.lib. Unfortunately I don't have the build log with me currently. ### Anything else we should know about your project / environment? The build I've configured attempts to mimic the vcpkg build - vcpkg works around this issue by patching the CMakeLists file with the following patch: https://github.com/microsoft/vcpkg/blob/master/ports/grpc/00006-crypt32.patch (I'm a bit disappointed to see no attempt on vcpkg's behalf to upstream any of the patches in vcpkg.) I've now built gRPC from my own fork of 1.25.0 successfully by having the patch applied on top of the v1.25.0 branch: https://github.com/Rantanen/grpc/commits/v1.25.0-mfiles Would gRPC be interested in upstreaming this change? I can prepare a merge request if that's the case. I know there's some shift in focus onto Bazel these days, but I haven't found a way to create a static gRPC archive with Bazel (not that I looked for one _that_ much). 
Our product toolchain is based on Visual Studio, so currently we are compiling gRPC as static archive and including that in the VS builds. Also I'm not sure how testable such MR would be, given - as far as I know - the CI doesn't compile static libraries or link against OpenSSL (1.1.x).
1.0
Static build with OpenSSL 1.1.1+ on Windows - This is duplicate of https://github.com/grpc/grpc/issues/12231 which has been closed as stale and in which I cannot comment. <!-- This form is for bug reports and feature requests ONLY! For general questions and troubleshooting, please ask/look for answers here: - grpc.io mailing list: https://groups.google.com/forum/#!forum/grpc-io - StackOverflow, with "grpc" tag: https://stackoverflow.com/questions/tagged/grpc Issues specific to *grpc-java*, *grpc-go*, *grpc-node*, *grpc-dart*, *grpc-web* should be created in the repository they belong to (e.g. https://github.com/grpc/grpc-LANGUAGE/issues/new) --> ### What version of gRPC and what language are you using? C++ 1.24/1.25 ### What operating system (Linux, Windows,...) and version? Windows 10 ### What runtime / compiler are you using (e.g. python version or version of gcc) Visual Studio 2015 ### What did you do? Compiled static OpenSSL and attempted to compile static gRPC. ### What did you expect to see? Successful build ### What did you see instead? Undefined references related to symbols in Microsoft's crypt32.lib. Unfortunately I don't have the build log with me currently. ### Anything else we should know about your project / environment? The build I've configured attempts to mimic the vcpkg build - vcpkg works around this issue by patching the CMakeLists file with the following patch: https://github.com/microsoft/vcpkg/blob/master/ports/grpc/00006-crypt32.patch (I'm a bit disappointed to see no attempt on vcpkg's behalf to upstream any of the patches in vcpkg.) I've now built gRPC from my own fork of 1.25.0 successfully by having the patch applied on top of the v1.25.0 branch: https://github.com/Rantanen/grpc/commits/v1.25.0-mfiles Would gRPC be interested in upstreaming this change? I can prepare a merge request if that's the case. 
I know there's some shift in focus onto Bazel these days, but I haven't found a way to create a static gRPC archive with Bazel (not that I looked for one _that_ much). Our product toolchain is based on Visual Studio, so currently we are compiling gRPC as static archive and including that in the VS builds. Also I'm not sure how testable such MR would be, given - as far as I know - the CI doesn't compile static libraries or link against OpenSSL (1.1.x).
non_process
static build with openssl on windows this is duplicate of which has been closed as stale and in which i cannot comment this form is for bug reports and feature requests only for general questions and troubleshooting please ask look for answers here grpc io mailing list stackoverflow with grpc tag issues specific to grpc java grpc go grpc node grpc dart grpc web should be created in the repository they belong to e g what version of grpc and what language are you using c what operating system linux windows and version windows what runtime compiler are you using e g python version or version of gcc visual studio what did you do compiled static openssl and attempted to compile static grpc what did you expect to see successful build what did you see instead undefined references related to symbols in microsoft s lib unfortunately i don t have the build log with me currently anything else we should know about your project environment the build i ve configured attempts to mimic the vcpkg build vcpkg works around this issue by patching the cmakelists file with the following patch i m a bit disappointed to see no attempt on vcpkg s behalf to upstream any of the patches in vcpkg i ve now built grpc from my own fork of successfully by having the patch applied on top of the branch would grpc be interested in upstreaming this change i can prepare a merge request if that s the case i know there s some shift in focus onto bazel these days but i haven t found a way to create a static grpc archive with bazel not that i looked for one that much our product toolchain is based on visual studio so currently we are compiling grpc as static archive and including that in the vs builds also i m not sure how testable such mr would be given as far as i know the ci doesn t compile static libraries or link against openssl x
0
21,149
28,127,252,884
IssuesEvent
2023-03-31 18:54:08
open-telemetry/opentelemetry-collector-contrib
https://api.github.com/repos/open-telemetry/opentelemetry-collector-contrib
closed
[processor/metrictransform] Aggregating labels drops metric description
bug help wanted priority:p2 processor/metricstransform
### What happened? ## Description When aggregating by labels, the metric description is made empty. See this function: https://github.com/open-telemetry/opentelemetry-collector-contrib/blob/d5bb86ce9ec34a62a6959ed22cc5bd61a6611215/processor/metricstransformprocessor/metrics_transform_processor_otlp.go#L454-L470 The metric description is not copied over to the new metric, which causes the description of the metric to be dropped. ## Steps to Reproduce Run with attached config. Observe output metric descriptions are empty. ## Expected Result Metric description is retained on the metric with aggregated labels ## Actual Result Metric description is dropped on the metric with aggregated labels ### Collector version v0.60.0-47-g0bf2d8429d ### Environment information _No response_ ### OpenTelemetry Collector configuration ```yaml receivers: hostmetrics: collection_interval: 5s scrapers: filesystem: processors: metricstransform: transforms: # Keep only direction/request metrics - include: ^system.filesystem match_type: regexp action: update operations: - action: aggregate_labels label_set: [ device, mode, mountpoint, type] aggregation_type: sum exporters: logging: loglevel: debug service: pipelines: metrics: receivers: [hostmetrics] processors: [metricstransform] exporters: [logging] ``` ### Log output _No response_ ### Additional context _No response_
1.0
[processor/metrictransform] Aggregating labels drops metric description - ### What happened? ## Description When aggregating by labels, the metric description is made empty. See this function: https://github.com/open-telemetry/opentelemetry-collector-contrib/blob/d5bb86ce9ec34a62a6959ed22cc5bd61a6611215/processor/metricstransformprocessor/metrics_transform_processor_otlp.go#L454-L470 The metric description is not copied over to the new metric, which causes the description of the metric to be dropped. ## Steps to Reproduce Run with attached config. Observe output metric descriptions are empty. ## Expected Result Metric description is retained on the metric with aggregated labels ## Actual Result Metric description is dropped on the metric with aggregated labels ### Collector version v0.60.0-47-g0bf2d8429d ### Environment information _No response_ ### OpenTelemetry Collector configuration ```yaml receivers: hostmetrics: collection_interval: 5s scrapers: filesystem: processors: metricstransform: transforms: # Keep only direction/request metrics - include: ^system.filesystem match_type: regexp action: update operations: - action: aggregate_labels label_set: [ device, mode, mountpoint, type] aggregation_type: sum exporters: logging: loglevel: debug service: pipelines: metrics: receivers: [hostmetrics] processors: [metricstransform] exporters: [logging] ``` ### Log output _No response_ ### Additional context _No response_
process
aggregating labels drops metric description what happened description when aggregating by labels the metric description is made empty see this function the metric description is not copied over to the new metric which causes the description of the metric to be dropped steps to reproduce run with attached config observe output metric descriptions are empty expected result metric description is retained on the metric with aggregated labels actual result metric description is dropped on the metric with aggregated labels collector version environment information no response opentelemetry collector configuration yaml receivers hostmetrics collection interval scrapers filesystem processors metricstransform transforms keep only direction request metrics include system filesystem match type regexp action update operations action aggregate labels label set aggregation type sum exporters logging loglevel debug service pipelines metrics receivers processors exporters log output no response additional context no response
1
67,003
7,031,572,660
IssuesEvent
2017-12-26 18:50:19
LeopoldArkham/Molten
https://api.github.com/repos/LeopoldArkham/Molten
closed
Add tests for, and fix indexing
API bug Testing
There should be a test suite for indexing all indexable values. Indexing was broken by #15 so it will need fixing to make the tests green!
1.0
Add tests for, and fix indexing - There should be a test suite for indexing all indexable values. Indexing was broken by #15 so it will need fixing to make the tests green!
non_process
add tests for and fix indexing there should be a test suite for indexing all indexable values indexing was broken by so it will need fixing to make the tests green
0
10,008
13,043,862,967
IssuesEvent
2020-07-29 02:52:56
tikv/tikv
https://api.github.com/repos/tikv/tikv
closed
UCP: Migrate scalar function `Right` from TiDB
challenge-program-2 component/coprocessor difficulty/easy sig/coprocessor
## Description Port the scalar function `Right` from TiDB to coprocessor. ## Score * 50 ## Mentor(s) * @lonng ## Recommended Skills * Rust programming ## Learning Materials Already implemented expressions ported from TiDB - https://github.com/tikv/tikv/tree/master/components/tidb_query/src/rpn_expr) - https://github.com/tikv/tikv/tree/master/components/tidb_query/src/expr)
2.0
UCP: Migrate scalar function `Right` from TiDB - ## Description Port the scalar function `Right` from TiDB to coprocessor. ## Score * 50 ## Mentor(s) * @lonng ## Recommended Skills * Rust programming ## Learning Materials Already implemented expressions ported from TiDB - https://github.com/tikv/tikv/tree/master/components/tidb_query/src/rpn_expr) - https://github.com/tikv/tikv/tree/master/components/tidb_query/src/expr)
process
ucp migrate scalar function right from tidb description port the scalar function right from tidb to coprocessor score mentor s lonng recommended skills rust programming learning materials already implemented expressions ported from tidb
1
187
2,589,869,146
IssuesEvent
2015-02-18 15:39:06
dotnet/corefx
https://api.github.com/repos/dotnet/corefx
opened
Implement System.Diagnostics.Process for Mac
cross platform feature approved System.Diagnostics.Process up for grabs
Process' implementation is currently Linux-specific, relying heavily on procfs to gather information on processes. To my knowledge, procfs doesn't exist on OS X, so we'll need a specialized implementation for Mac.
1.0
Implement System.Diagnostics.Process for Mac - Process' implementation is currently Linux-specific, relying heavily on procfs to gather information on processes. To my knowledge, procfs doesn't exist on OS X, so we'll need a specialized implementation for Mac.
process
implement system diagnostics process for mac process implementation is currently linux specific relying heavily on procfs to gather information on processes to my knowledge procfs doesn t exist on os x so we ll need a specialized implementation for mac
1
645,188
20,997,411,038
IssuesEvent
2022-03-29 14:32:48
kubernetes-sigs/cluster-api-provider-aws
https://api.github.com/repos/kubernetes-sigs/cluster-api-provider-aws
closed
Remove eks controlplane and bootstrap providers from clusterctl
kind/feature priority/important-soon area/provider/eks triage/accepted
/kind feature /area provider/eks /priority important-soon /milestone v0.7.x **Describe the solution you'd like** As a result of the graduation of EKS (#2570) the controlplane and bootstrap providers have been removed and the controllers incorporated in the main infrastructure provider. As a result we need to remove the `aws-eks` controlplane and bootstrap providers from clusterctl (originally added in #1970). **Anything else you would like to add:** At the same time i will update the quick start docs to point to the CAPA docs that detail disabling EKS support.
1.0
Remove eks controlplane and bootstrap providers from clusterctl - /kind feature /area provider/eks /priority important-soon /milestone v0.7.x **Describe the solution you'd like** As a result of the graduation of EKS (#2570) the controlplane and bootstrap providers have been removed and the controllers incorporated in the main infrastructure provider. As a result we need to remove the `aws-eks` controlplane and bootstrap providers from clusterctl (originally added in #1970). **Anything else you would like to add:** At the same time i will update the quick start docs to point to the CAPA docs that detail disabling EKS support.
non_process
remove eks controlplane and bootstrap providers from clusterctl kind feature area provider eks priority important soon milestone x describe the solution you d like as a result of the graduation of eks the controlplane and bootstrap providers have been removed and the controllers incorporated in the main infrastructure provider as a result we need to remove the aws eks controlplane and bootstrap providers from clusterctl originally added in anything else you would like to add at the same time i will update the quick start docs to point to the capa docs that detail disabling eks support
0
14,263
17,202,749,233
IssuesEvent
2021-07-17 15:48:08
ltechkorea/mlperf-inference
https://api.github.com/repos/ltechkorea/mlperf-inference
closed
Implement Generating Submission Automation using `submission_runner`
Recommendation enhancement image classification medical imaging natural language processing object detection pre-submit speech to text
Implement automation using `submission_runner`.
1.0
Implement Generating Submission Automation using `submission_runner` - Implement automation using `submission_runner`.
process
implement generating submission automation using submission runner implement automation using submission runner
1
302,792
9,292,550,684
IssuesEvent
2019-03-22 03:42:01
BentoBoxWorld/BentoBox
https://api.github.com/repos/BentoBoxWorld/BentoBox
closed
Island expel implementation
Priority: Low Status: In progress Type: Enhancement
**Is your feature request related to a problem?** - Not that we know of. **Describe the solution you'd like** - Our players would love to see /is expel implemented soon. Purpose of this is already familiar to ASkyBlock users; it temporarily ejects a player from your island. - As setspawn now exists, maybe there should be an option to choose whether the expelled players should be teleported to spawn or their island spawnpoint (If they have one). Though what would it do if no spawn is set and the player does not belong to an island? **Additional context** - Bumping here: https://github.com/BentoBoxWorld/BentoBox/wiki/Completed-Items requires update.
1.0
Island expel implementation - **Is your feature request related to a problem?** - Not that we know of. **Describe the solution you'd like** - Our players would love to see /is expel implemented soon. Purpose of this is already familiar to ASkyBlock users; it temporarily ejects a player from your island. - As setspawn now exists, maybe there should be an option to choose whether the expelled players should be teleported to spawn or their island spawnpoint (If they have one). Though what would it do if no spawn is set and the player does not belong to an island? **Additional context** - Bumping here: https://github.com/BentoBoxWorld/BentoBox/wiki/Completed-Items requires update.
non_process
island expel implementation is your feature request related to a problem not that we know of describe the solution you d like our players would love to see is expel implemented soon purpose of this is already familiar to askyblock users it temporarily ejects a player from your island as setspawn now exists maybe there should be an option to choose whether the expelled players should be teleported to spawn or their island spawnpoint if they have one though what would it do if no spawn is set and the player does not belong to an island additional context bumping here requires update
0
27,510
13,261,673,809
IssuesEvent
2020-08-20 20:19:47
dotnet/msbuild
https://api.github.com/repos/dotnet/msbuild
closed
CopyOnWriteDictionary should just wrap ImmutableDictionary
Performance-Scenario-Solution-Open performance
In our tests on large solutions that have .NET Core projects in them, ProjectInstance objects used as snapshots can end up taking up to 10% of the managed heap (or more). Almost half of that is CopyOnWritePropertyDictionary<ProjectMetadataInstance> instances. (For example, a sample 1000 project solution has 300k instances.) The current CopyOnWriteDictionary is not as efficient as ImmutableDictionary, moving it to just wrap ImmutableDictionary will create significant memory savings.
True
CopyOnWriteDictionary should just wrap ImmutableDictionary - In our tests on large solutions that have .NET Core projects in them, ProjectInstance objects used as snapshots can end up taking up to 10% of the managed heap (or more). Almost half of that is CopyOnWritePropertyDictionary<ProjectMetadataInstance> instances. (For example, a sample 1000 project solution has 300k instances.) The current CopyOnWriteDictionary is not as efficient as ImmutableDictionary, moving it to just wrap ImmutableDictionary will create significant memory savings.
non_process
copyonwritedictionary should just wrap immutabledictionary in our tests on large solutions that have net core projects in them projectinstance objects used as snapshots can end up taking up to of the managed heap or more almost half of that is copyonwritepropertydictionary instances for example a sample project solution has instances the current copyonwritedictionary is not as efficient as immutabledictionary moving it to just wrap immutabledictionary will create significant memory savings
0
6,076
8,922,184,134
IssuesEvent
2019-01-21 12:14:50
linnovate/root
https://api.github.com/repos/linnovate/root
opened
Copy paste title problem
2.0.7 Process bug bug
copy some title. paste in every entity. the font is bigger than it should be. ![image](https://user-images.githubusercontent.com/31100069/51474079-de112700-1d86-11e9-8e15-b84d16fa6c73.png) and in search the names show up with css code. ![image](https://user-images.githubusercontent.com/31100069/51474060-c89bfd00-1d86-11e9-978f-14e7f365a08d.png)
1.0
Copy paste title problem - copy some title. paste in every entity. the font is bigger than it should be. ![image](https://user-images.githubusercontent.com/31100069/51474079-de112700-1d86-11e9-8e15-b84d16fa6c73.png) and in search the names show up with css code. ![image](https://user-images.githubusercontent.com/31100069/51474060-c89bfd00-1d86-11e9-978f-14e7f365a08d.png)
process
copy paste title problem copy some title paste in every entity the font is bigger than it should be and in search the names show up with css code
1
10,113
13,044,162,214
IssuesEvent
2020-07-29 03:47:30
tikv/tikv
https://api.github.com/repos/tikv/tikv
closed
UCP: Migrate scalar function `YearWeekWithMode` from TiDB
challenge-program-2 component/coprocessor difficulty/easy sig/coprocessor
## Description Port the scalar function `YearWeekWithMode` from TiDB to coprocessor. ## Score * 50 ## Mentor(s) * @iosmanthus ## Recommended Skills * Rust programming ## Learning Materials Already implemented expressions ported from TiDB - https://github.com/tikv/tikv/tree/master/components/tidb_query/src/rpn_expr) - https://github.com/tikv/tikv/tree/master/components/tidb_query/src/expr)
2.0
UCP: Migrate scalar function `YearWeekWithMode` from TiDB - ## Description Port the scalar function `YearWeekWithMode` from TiDB to coprocessor. ## Score * 50 ## Mentor(s) * @iosmanthus ## Recommended Skills * Rust programming ## Learning Materials Already implemented expressions ported from TiDB - https://github.com/tikv/tikv/tree/master/components/tidb_query/src/rpn_expr) - https://github.com/tikv/tikv/tree/master/components/tidb_query/src/expr)
process
ucp migrate scalar function yearweekwithmode from tidb description port the scalar function yearweekwithmode from tidb to coprocessor score mentor s iosmanthus recommended skills rust programming learning materials already implemented expressions ported from tidb
1
282,889
24,502,869,681
IssuesEvent
2022-10-10 14:06:07
WPChill/download-monitor
https://api.github.com/repos/WPChill/download-monitor
closed
Can't add products to cart
Bug needs testing
When adding products to cart an error is triggered: `Uncaught TypeError: can't access property "split", responseHeaders['content-disposition'] is undefined` ![image](https://user-images.githubusercontent.com/19348508/194877253-b271e4cc-f335-4713-bad7-0ba8528cf140.png)
1.0
Can't add products to cart - When adding products to cart an error is triggered: `Uncaught TypeError: can't access property "split", responseHeaders['content-disposition'] is undefined` ![image](https://user-images.githubusercontent.com/19348508/194877253-b271e4cc-f335-4713-bad7-0ba8528cf140.png)
non_process
can t add products to cart when adding products to cart an error is triggered uncaught typeerror can t access property split responseheaders is undefined
0
294,089
25,345,098,311
IssuesEvent
2022-11-19 05:09:40
karlavdelgadof/Recipe-app
https://api.github.com/repos/karlavdelgadof/Recipe-app
closed
[2pt] Recipe class Testing
testing
## Recipe class Testing :gear: :memo: :hammer_and_wrench: :man_cook: For this issue, the following need to be addressed :arrow_down: : - [x] Create tests for controller - [x] Create test for requests - [x] Create test for models/validations - [x] Create tests for views
1.0
[2pt] Recipe class Testing - ## Recipe class Testing :gear: :memo: :hammer_and_wrench: :man_cook: For this issue, the following need to be addressed :arrow_down: : - [x] Create tests for controller - [x] Create test for requests - [x] Create test for models/validations - [x] Create tests for views
non_process
recipe class testing recipe class testing gear memo hammer and wrench man cook for this issue the following need to be addressed arrow down create tests for controller create test for requests create test for models validations create tests for views
0
9,101
12,178,648,228
IssuesEvent
2020-04-28 09:21:02
qgis/QGIS
https://api.github.com/repos/qgis/QGIS
closed
Clipping raster using mask layer ignores the layer filter
Bug Processing
# Problem * Load a vector layer in QGIS * Filter the vector layer * Load a raster layer. * Clip the raster layer using the vector layer. ![gdal-clip](https://user-images.githubusercontent.com/2510900/80214782-b53ba980-863b-11ea-9226-e1ee68dbfcfa.png) The gdal command shown on the processing dialog is like below ``` gdalwarp -of GTiff -cutline /tmp/provinces.shp -cl provinces -crop_to_cutline -dstalpha -multi -co COMPRESS=DEFLATE -co PREDICTOR=2 -co ZLEVEL=9 /tmp/zaf_ppp_2020_3857.tif /tmp/OUTPUT.tif ``` This would clip the layer based on the extent of the province layer. I would expect it to generate a request like the following ``` gdalwarp -of GTiff -cutline /tmp/provinces.shp -cl provinces -csql "select * from provinces where "PROVNAME" = 'Western Cape' " -crop_to_cutline -dstalpha -multi -co COMPRESS=DEFLATE -co PREDICTOR=2 -co ZLEVEL=9 /tmp/zaf_ppp_2020_3857.tif /tmp/OUTPUT.tif ``` ![qgis-12](https://user-images.githubusercontent.com/2510900/80215329-8bcf4d80-863c-11ea-95cd-2e244557f681.png)
1.0
Clipping raster using mask layer ignores the layer filter - # Problem * Load a vector layer in QGIS * Filter the vector layer * Load a raster layer. * Clip the raster layer using the vector layer. ![gdal-clip](https://user-images.githubusercontent.com/2510900/80214782-b53ba980-863b-11ea-9226-e1ee68dbfcfa.png) The gdal command shown on the processing dialog is like below ``` gdalwarp -of GTiff -cutline /tmp/provinces.shp -cl provinces -crop_to_cutline -dstalpha -multi -co COMPRESS=DEFLATE -co PREDICTOR=2 -co ZLEVEL=9 /tmp/zaf_ppp_2020_3857.tif /tmp/OUTPUT.tif ``` This would clip the layer based on the extent of the province layer. I would expect it to generate a request like the following ``` gdalwarp -of GTiff -cutline /tmp/provinces.shp -cl provinces -csql "select * from provinces where "PROVNAME" = 'Western Cape' " -crop_to_cutline -dstalpha -multi -co COMPRESS=DEFLATE -co PREDICTOR=2 -co ZLEVEL=9 /tmp/zaf_ppp_2020_3857.tif /tmp/OUTPUT.tif ``` ![qgis-12](https://user-images.githubusercontent.com/2510900/80215329-8bcf4d80-863c-11ea-95cd-2e244557f681.png)
process
clipping raster using mask layer ignores the layer filter problem load a vector layer in qgis filter the vector layer load a raster layer clip the raster layer using the vector layer the gdal command shown on the processing dialog is like below gdalwarp of gtiff cutline tmp provinces shp cl provinces crop to cutline dstalpha multi co compress deflate co predictor co zlevel tmp zaf ppp tif tmp output tif this would clip the layer based on the extent of the province layer i would expect it to generate a request like the following gdalwarp of gtiff cutline tmp provinces shp cl provinces csql select from provinces where provname western cape crop to cutline dstalpha multi co compress deflate co predictor co zlevel tmp zaf ppp tif tmp output tif
1
68,165
8,224,769,595
IssuesEvent
2018-09-06 14:32:26
trustlines-network/feedback
https://api.github.com/repos/trustlines-network/feedback
closed
Detailed contact view
design idea
### Basic information: OS (Android / iOS / both): Android 7.0 Version: 1.1.1-dev8 ### Screenshots or mock-ups ### Possible solution The trustline event should have the same design as the events on overview. The checkmarks should be moved away for "review". The (checkmark) does not add any value
1.0
Detailed contact view - ### Basic information: OS (Android / iOS / both): Android 7.0 Version: 1.1.1-dev8 ### Screenshots or mock-ups ### Possible solution The trustline event should have the same design as the events on overview. The checkmarks should be moved away for "review". The (checkmark) does not add any value
non_process
detailed contact view basic information os android ios both android version screenshots or mock ups possible solution the trustline event should have the same design as the events on overview the checkmarks should be moved away for review the checkmark does not add any value
0
136,891
5,289,890,713
IssuesEvent
2017-02-08 18:30:13
timberline-secondary/flex-site
https://api.github.com/repos/timberline-secondary/flex-site
closed
Reset student password if not student homeroom teacher
4. Low Priority
Under the admin section, have a space where a student's password can be reset by someone other than just their homeroom teacher.
1.0
Reset student password if not student homeroom teacher - Under the admin section, have a space where a student's password can be reset by someone other than just their homeroom teacher.
non_process
reset student password if not student homeroom teacher under the admin section have a space where a student s password can be reset by someone other than just their homeroom teacher
0
11,115
13,957,681,876
IssuesEvent
2020-10-24 08:07:33
alexanderkotsev/geoportal
https://api.github.com/repos/alexanderkotsev/geoportal
opened
DE: request for a new harvesting
DE - Germany Geoportal Harvesting process
Dear Geoportal Helpdesk, As mentioned in Roberts Mail from 2020/03/02 we would like to initiate a new push of our metadata records to the EU Geoportal. For this reason we kindly ask you to start a new harvesting of our catalogue instance and publish them for us in the Geoportal harvesting &quot;sandbox&quot;, please. Thanks in advance and best regards, Anja (on behalf of Coordination Office SDI Germany)
1.0
DE: request for a new harvesting - Dear Geoportal Helpdesk, As mentioned in Roberts Mail from 2020/03/02 we would like to initiate a new push of our metadata records to the EU Geoportal. For this reason we kindly ask you to start a new harvesting of our catalogue instance and publish them for us in the Geoportal harvesting &quot;sandbox&quot;, please. Thanks in advance and best regards, Anja (on behalf of Coordination Office SDI Germany)
process
de request for a new harvesting dear geoportal helpdesk as mentioned in roberts mail from we would like to initiate a new push of our metadata records to the eu geoportal for this reason we kindly ask you to start a new harvesting of our catalogue instance and publish them for us in the geoportal harvesting quot sandbox quot please thanks in advance and best regards anja on behalf of coordination office sdi germany
1
189
2,594,252,631
IssuesEvent
2015-02-20 01:08:54
dalehenrich/metacello-work
https://api.github.com/repos/dalehenrich/metacello-work
closed
Confusion when loading ConfigurationOfSeaside3 (which loads baseline) from ConfigurationOfMagritte
in process
I have BaselineOfSeaside3 locked in an image. I load Magritte3 which references ConfigurationOfSeaside3, but ConfigurationOfSeaside3 loads BaselineOfSeaside3 and the project registration is pretty confused. The ConfigurationOfSeaside3 uses a baseline to load Seaside, where is the baselineProjectSpec? ``` . -> ConfigurationOfSeaside3 3.1.3.1 from http://smalltalkhub.com/mc/Seaside/MetacelloConfigurations/main .. -> anOrderedCollection( BaselineOfGrease [baseline] from github://GsDevKit/Grease:master/repository, BaselineOfRB [baseline] from github://dale... (class)@ -> MetacelloProjectRegistration (oop)@ -> 246371329 (committed)@ -> true baselineProjectSpec@ -> nil configurationProjectSpec@ -> spec name: 'Seaside3'; versionString: '3.1.3.1'; preLoadDoIt: nil; postLoadDoIt: nil; loads: #('Core' 'Javascript' 'RSS' 'Filesystem' ... loadedInImage@ -> true locked@ -> false mutable@ -> false projectName@ -> 'Seaside3' versionInfo@ -> aMetacelloProjectRegistrationVersionInfo ``` Where's the SHA? ``` . -> aMetacelloProjectRegistrationVersionInfo .. -> ConfigurationOfSeaside3 3.1.3.1 from http://smalltalkhub.com/mc/Seaside/MetacelloConfigurations/main (class)@ -> MetacelloProjectRegistrationVersionInfo (oop)@ -> 247417089 (committed)@ -> true projectVersion@ -> 3.1.3.1 [ConfigurationOfSeaside3] versionString@ -> '3.1.3.1' ``` What the heck is ConfigurationOfMagritte doing in there? ``` . -> spec name: 'Seaside3'; versionString: '3.1.3.1'; preLoadDoIt: nil; postLoadDoIt: nil; loads: #('Core' 'Javascript' 'RSS' 'Filesystem' ... .. 
-> ConfigurationOfSeaside3 3.1.3.1 from http://smalltalkhub.com/mc/Seaside/MetacelloConfigurations/main (class)@ -> MetacelloMCConfigurationOfProjectSpec (oop)@ -> 247417345 (committed)@ -> true className@ -> 'ConfigurationOfSeaside3' file@ -> nil loader@ -> nil loads@ -> anArray( 'Core', 'Javascript', 'RSS', 'Filesystem', 'Welcome') mutable@ -> false name@ -> 'Seaside3' operator@ -> nil postLoadDoIt@ -> spec value: nil preLoadDoIt@ -> spec value: nil project@ -> ConfigurationOfMagritte3(3.0-baseline [ConfigurationOfMagritte3], 3.0 [ConfigurationOfMagritte3], 3.0.1 [ConfigurationOfMagritte3], 3.0.2 [C... projectPackage@ -> spec name: 'ConfigurationOfSeaside3'; repository: 'http://smalltalkhub.com/mc/Seaside/MetacelloConfigurations/main'. repositories@ -> spec repository: 'http://smalltalkhub.com/mc/Seaside/MetacelloConfigurations/main' versionString@ -> '3.1.3.1' ```
1.0
Confusion when loading ConfigurationOfSeaside3 (which loads baseline) from ConfiguraitonOfMagrite - I have BaselineOfSeaside3 locked in an image. I load Magritte3 which references ConfigurationOfSeaside3, but ConfigurationOfSeaside3 loads BaselineOfSeaside3 and the project registration is pretty confused. The ConfigurationOfSeaside3 uses a baseline to load Seaside, where is the baselineProjectSpec? ``` . -> ConfigurationOfSeaside3 3.1.3.1 from http://smalltalkhub.com/mc/Seaside/MetacelloConfigurations/main .. -> anOrderedCollection( BaselineOfGrease [baseline] from github://GsDevKit/Grease:master/repository, BaselineOfRB [baseline] from github://dale... (class)@ -> MetacelloProjectRegistration (oop)@ -> 246371329 (committed)@ -> true baselineProjectSpec@ -> nil configurationProjectSpec@ -> spec name: 'Seaside3'; versionString: '3.1.3.1'; preLoadDoIt: nil; postLoadDoIt: nil; loads: #('Core' 'Javascript' 'RSS' 'Filesystem' ... loadedInImage@ -> true locked@ -> false mutable@ -> false projectName@ -> 'Seaside3' versionInfo@ -> aMetacelloProjectRegistrationVersionInfo ``` Where's the SHA? ``` . -> aMetacelloProjectRegistrationVersionInfo .. -> ConfigurationOfSeaside3 3.1.3.1 from http://smalltalkhub.com/mc/Seaside/MetacelloConfigurations/main (class)@ -> MetacelloProjectRegistrationVersionInfo (oop)@ -> 247417089 (committed)@ -> true projectVersion@ -> 3.1.3.1 [ConfigurationOfSeaside3] versionString@ -> '3.1.3.1' ``` What the heck is ConfigurationOfMagritte doing in there? ``` . -> spec name: 'Seaside3'; versionString: '3.1.3.1'; preLoadDoIt: nil; postLoadDoIt: nil; loads: #('Core' 'Javascript' 'RSS' 'Filesystem' ... .. 
-> ConfigurationOfSeaside3 3.1.3.1 from http://smalltalkhub.com/mc/Seaside/MetacelloConfigurations/main (class)@ -> MetacelloMCConfigurationOfProjectSpec (oop)@ -> 247417345 (committed)@ -> true className@ -> 'ConfigurationOfSeaside3' file@ -> nil loader@ -> nil loads@ -> anArray( 'Core', 'Javascript', 'RSS', 'Filesystem', 'Welcome') mutable@ -> false name@ -> 'Seaside3' operator@ -> nil postLoadDoIt@ -> spec value: nil preLoadDoIt@ -> spec value: nil project@ -> ConfigurationOfMagritte3(3.0-baseline [ConfigurationOfMagritte3], 3.0 [ConfigurationOfMagritte3], 3.0.1 [ConfigurationOfMagritte3], 3.0.2 [C... projectPackage@ -> spec name: 'ConfigurationOfSeaside3'; repository: 'http://smalltalkhub.com/mc/Seaside/MetacelloConfigurations/main'. repositories@ -> spec repository: 'http://smalltalkhub.com/mc/Seaside/MetacelloConfigurations/main' versionString@ -> '3.1.3.1' ```
process
confusion when loading which loads baseline from configuraitonofmagrite i have locked in an image i load which references but loads and the project registration is pretty confused the uses a baseline to load seaside where is the baselineprojectspec from anorderedcollection baselineofgrease from github gsdevkit grease master repository baselineofrb from github dale class metacelloprojectregistration oop committed true baselineprojectspec nil configurationprojectspec spec name versionstring preloaddoit nil postloaddoit nil loads core javascript rss filesystem loadedinimage true locked false mutable false projectname versioninfo ametacelloprojectregistrationversioninfo where s the sha ametacelloprojectregistrationversioninfo from class metacelloprojectregistrationversioninfo oop committed true projectversion versionstring what the heck is configurationofmagritte doing in there spec name versionstring preloaddoit nil postloaddoit nil loads core javascript rss filesystem from class metacellomcconfigurationofprojectspec oop committed true classname file nil loader nil loads anarray core javascript rss filesystem welcome mutable false name operator nil postloaddoit spec value nil preloaddoit spec value nil project baseline c projectpackage spec name repository repositories spec repository versionstring
1
216,553
16,770,344,442
IssuesEvent
2021-06-14 14:08:00
onias-rocha/desafio_quality
https://api.github.com/repos/onias-rocha/desafio_quality
closed
TU-0002: Verify that the input neighborhood exists in the neighborhood repository
test
If it complies: allow it to continue normally. If it does not comply: report the incompatibility with an exception.
1.0
TU-0002: Verify that the input neighborhood exists in the neighborhood repository - If it complies: allow it to continue normally. If it does not comply: report the incompatibility with an exception.
non_process
tu verify that the input neighborhood exists in the neighborhood repository if it complies allow it to continue normally if it does not comply report the incompatibility with an exception
0