Unnamed: 0
int64 0
832k
| id
float64 2.49B
32.1B
| type
stringclasses 1
value | created_at
stringlengths 19
19
| repo
stringlengths 7
112
| repo_url
stringlengths 36
141
| action
stringclasses 3
values | title
stringlengths 1
744
| labels
stringlengths 4
574
| body
stringlengths 9
211k
| index
stringclasses 10
values | text_combine
stringlengths 96
211k
| label
stringclasses 2
values | text
stringlengths 96
188k
| binary_label
int64 0
1
|
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
13,447
| 15,884,985,856
|
IssuesEvent
|
2021-04-09 19:47:42
|
zotero/zotero
|
https://api.github.com/repos/zotero/zotero
|
opened
|
Programmatically disable App Nap during word processor operations
|
Word Processor Integration
|
https://forums.zotero.org/discussion/comment/380201/#Comment_380201
Hopefully we can do this via js-ctypes: https://lapcatsoftware.com/articles/prevent-app-nap.html
|
1.0
|
Programmatically disable App Nap during word processor operations - https://forums.zotero.org/discussion/comment/380201/#Comment_380201
Hopefully we can do this via js-ctypes: https://lapcatsoftware.com/articles/prevent-app-nap.html
|
process
|
programmatically disable app nap during word processor operations hopefully we can do this via js ctypes
| 1
|
3,034
| 6,038,009,591
|
IssuesEvent
|
2017-06-09 20:13:58
|
nodejs/node
|
https://api.github.com/repos/nodejs/node
|
opened
|
test: investigate flakiness - parallel/test-process-external-stdio-close
|
freebsd process test
|
<!--
Thank you for reporting an issue.
This issue tracker is for bugs and issues found within Node.js core.
If you require more general support please file an issue on our help
repo. https://github.com/nodejs/help
Please fill in as much of the template below as you're able.
Version: output of `node -v`
Platform: output of `uname -a` (UNIX), or version and 32 or 64-bit (Windows)
Subsystem: if known, please specify affected core module name
If possible, please provide code that demonstrates the problem, keeping it as
simple and free of external dependencies as you are able.
-->
* **Version**: `master`
* **Platform**: freeBSD
* **Subsystem**: test,process
<!-- Enter your issue details below this comment. -->
```
919 parallel/test-process-external-stdio-close
duration_ms 0.429
severity crashed
stack oh no!
exit code: CRASHED (Signal: 11)
```
https://ci.nodejs.org/job/node-test-commit-freebsd/9643/nodes=freebsd11-x64/
|
1.0
|
test: investigate flakiness - parallel/test-process-external-stdio-close - <!--
Thank you for reporting an issue.
This issue tracker is for bugs and issues found within Node.js core.
If you require more general support please file an issue on our help
repo. https://github.com/nodejs/help
Please fill in as much of the template below as you're able.
Version: output of `node -v`
Platform: output of `uname -a` (UNIX), or version and 32 or 64-bit (Windows)
Subsystem: if known, please specify affected core module name
If possible, please provide code that demonstrates the problem, keeping it as
simple and free of external dependencies as you are able.
-->
* **Version**: `master`
* **Platform**: freeBSD
* **Subsystem**: test,process
<!-- Enter your issue details below this comment. -->
```
919 parallel/test-process-external-stdio-close
duration_ms 0.429
severity crashed
stack oh no!
exit code: CRASHED (Signal: 11)
```
https://ci.nodejs.org/job/node-test-commit-freebsd/9643/nodes=freebsd11-x64/
|
process
|
test investigate flakiness parallel test process external stdio close thank you for reporting an issue this issue tracker is for bugs and issues found within node js core if you require more general support please file an issue on our help repo please fill in as much of the template below as you re able version output of node v platform output of uname a unix or version and or bit windows subsystem if known please specify affected core module name if possible please provide code that demonstrates the problem keeping it as simple and free of external dependencies as you are able version master platform freebsd subsystem test process parallel test process external stdio close duration ms severity crashed stack oh no exit code crashed signal
| 1
|
186,580
| 14,399,965,819
|
IssuesEvent
|
2020-12-03 11:41:08
|
MetaCell/geppetto-scidash
|
https://api.github.com/repos/MetaCell/geppetto-scidash
|
closed
|
Implement test to ensure neuroml-db parsing won't be broken
|
enhancement estimate: 2 testing
|
Add a new test scenario in order to ensure that the parameters from the neuroml model are read correctly.
This scenario can be placed in the current flow of the model created, so we don't need to re-do the login and all the previous steps required.
Assuming we are already logged in, these are the steps:
- Go to the models view
- Click the + icon to add a new model
- Fill the name model with "GranuleModel"
- Fill the link with https://neuroml-db.org/model_info?model_id=NMLCL000002
- Select the first choise from the model class top-down selection (I think it is LEMS model).
- Open the Params dialog.
- Check the top-down menu with the number of pages and ensure there are 16 pages available of params. If the 16 pages are there the test can be considered passed.
|
1.0
|
Implement test to ensure neuroml-db parsing won't be broken - Add a new test scenario in order to ensure that the parameters from the neuroml model are read correctly.
This scenario can be placed in the current flow of the model created, so we don't need to re-do the login and all the previous steps required.
Assuming we are already logged in, these are the steps:
- Go to the models view
- Click the + icon to add a new model
- Fill the name model with "GranuleModel"
- Fill the link with https://neuroml-db.org/model_info?model_id=NMLCL000002
- Select the first choise from the model class top-down selection (I think it is LEMS model).
- Open the Params dialog.
- Check the top-down menu with the number of pages and ensure there are 16 pages available of params. If the 16 pages are there the test can be considered passed.
|
non_process
|
implement test to ensure neuroml db parsing won t be broken add a new test scenario in order to ensure that the parameters from the neuroml model are read correctly this scenario can be placed in the current flow of the model created so we don t need to re do the login and all the previous steps required assuming we are already logged in these are the steps go to the models view click the icon to add a new model fill the name model with granulemodel fill the link with select the first choise from the model class top down selection i think it is lems model open the params dialog check the top down menu with the number of pages and ensure there are pages available of params if the pages are there the test can be considered passed
| 0
|
5,911
| 8,728,797,820
|
IssuesEvent
|
2018-12-10 18:24:16
|
bounswe/bounswe2018group1
|
https://api.github.com/repos/bounswe/bounswe2018group1
|
closed
|
Milestone #5 deliverables
|
Position: In-Process Who: Group-Work
|
- [x] Register with validation
- [x] Login with validation
- [x] Show User profile page
- [x] Edit user profile page
- [x] Show memory
- [x] Add memory
- [x] Milestone report
- [x] Milestone presentation
|
1.0
|
Milestone #5 deliverables - - [x] Register with validation
- [x] Login with validation
- [x] Show User profile page
- [x] Edit user profile page
- [x] Show memory
- [x] Add memory
- [x] Milestone report
- [x] Milestone presentation
|
process
|
milestone deliverables register with validation login with validation show user profile page edit user profile page show memory add memory milestone report milestone presentation
| 1
|
176,404
| 21,411,029,278
|
IssuesEvent
|
2022-04-22 05:58:41
|
pazhanivel07/frameworks_base_Aosp10_r33
|
https://api.github.com/repos/pazhanivel07/frameworks_base_Aosp10_r33
|
opened
|
CVE-2019-2232 (High) detected in baseandroid-10.0.0_r46
|
security vulnerability
|
## CVE-2019-2232 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>baseandroid-10.0.0_r46</b></p></summary>
<p>
<p>Android framework classes and services</p>
<p>Library home page: <a href=https://android.googlesource.com/platform/frameworks/base>https://android.googlesource.com/platform/frameworks/base</a></p>
<p>Found in HEAD commit: <a href="https://github.com/pazhanivel07/frameworks_base_Aosp10_r33/commit/d0a412c03562493a433dc7e698ff88ab06a3468a">d0a412c03562493a433dc7e698ff88ab06a3468a</a></p>
<p>Found in base branch: <b>main</b></p></p>
</details>
</p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Source Files (1)</summary>
<p></p>
<p>
<img src='https://s3.amazonaws.com/wss-public/bitbucketImages/xRedImage.png' width=19 height=20> <b>/core/java/android/text/TextLine.java</b>
</p>
</details>
<p></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
In handleRun of TextLine.java, there is a possible application crash due to improper input validation. This could lead to remote denial of service when processing Unicode with no additional execution privileges needed. User interaction is not needed for exploitation.Product: AndroidVersions: Android-8.0 Android-8.1 Android-9 Android-10Android ID: A-140632678
<p>Publish Date: 2019-12-06
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2019-2232>CVE-2019-2232</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>7.5</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: None
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2019-2232">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2019-2232</a></p>
<p>Release Date: 2019-12-06</p>
<p>Fix Resolution: android-8.0.0_r41;android-8.1.0_r71;android-9.0.0_r51;android-10.0.0_r17</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
|
True
|
CVE-2019-2232 (High) detected in baseandroid-10.0.0_r46 - ## CVE-2019-2232 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>baseandroid-10.0.0_r46</b></p></summary>
<p>
<p>Android framework classes and services</p>
<p>Library home page: <a href=https://android.googlesource.com/platform/frameworks/base>https://android.googlesource.com/platform/frameworks/base</a></p>
<p>Found in HEAD commit: <a href="https://github.com/pazhanivel07/frameworks_base_Aosp10_r33/commit/d0a412c03562493a433dc7e698ff88ab06a3468a">d0a412c03562493a433dc7e698ff88ab06a3468a</a></p>
<p>Found in base branch: <b>main</b></p></p>
</details>
</p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Source Files (1)</summary>
<p></p>
<p>
<img src='https://s3.amazonaws.com/wss-public/bitbucketImages/xRedImage.png' width=19 height=20> <b>/core/java/android/text/TextLine.java</b>
</p>
</details>
<p></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
In handleRun of TextLine.java, there is a possible application crash due to improper input validation. This could lead to remote denial of service when processing Unicode with no additional execution privileges needed. User interaction is not needed for exploitation.Product: AndroidVersions: Android-8.0 Android-8.1 Android-9 Android-10Android ID: A-140632678
<p>Publish Date: 2019-12-06
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2019-2232>CVE-2019-2232</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>7.5</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: None
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2019-2232">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2019-2232</a></p>
<p>Release Date: 2019-12-06</p>
<p>Fix Resolution: android-8.0.0_r41;android-8.1.0_r71;android-9.0.0_r51;android-10.0.0_r17</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
|
non_process
|
cve high detected in baseandroid cve high severity vulnerability vulnerable library baseandroid android framework classes and services library home page a href found in head commit a href found in base branch main vulnerable source files core java android text textline java vulnerability details in handlerun of textline java there is a possible application crash due to improper input validation this could lead to remote denial of service when processing unicode with no additional execution privileges needed user interaction is not needed for exploitation product androidversions android android android android id a publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction none scope unchanged impact metrics confidentiality impact none integrity impact none availability impact high for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution android android android android step up your open source security game with whitesource
| 0
|
17,420
| 23,241,397,682
|
IssuesEvent
|
2022-08-03 15:52:59
|
cypress-io/cypress
|
https://api.github.com/repos/cypress-io/cypress
|
closed
|
Internal: Fix flaky tests in circle CI
|
type: chore stage: backlog process: flaky test
|
### Current behavior
We currently have several flaky tests that need to be retried several times to pass.
### Desired behavior
Cypress tests should not be flakey and offer
### Test code to reproduce
Submit a PR to cypress and watch the circle ci build.
We can identify the flaky tests to focus on by using the cypress flak report: https://dashboard.cypress.io/projects/ypt4pf/analytics/flaky-tests
and circle CI's report: https://app.circleci.com/insights/github/cypress-io/cypress/workflows/linux/tests?branch=develop&reporting-window=last-30-days
### Cypress Version
n/a
### Other
_No response_
|
1.0
|
Internal: Fix flaky tests in circle CI - ### Current behavior
We currently have several flaky tests that need to be retried several times to pass.
### Desired behavior
Cypress tests should not be flakey and offer
### Test code to reproduce
Submit a PR to cypress and watch the circle ci build.
We can identify the flaky tests to focus on by using the cypress flak report: https://dashboard.cypress.io/projects/ypt4pf/analytics/flaky-tests
and circle CI's report: https://app.circleci.com/insights/github/cypress-io/cypress/workflows/linux/tests?branch=develop&reporting-window=last-30-days
### Cypress Version
n/a
### Other
_No response_
|
process
|
internal fix flaky tests in circle ci current behavior we currently have several flaky tests that need to be retried several times to pass desired behavior cypress tests should not be flakey and offer test code to reproduce submit a pr to cypress and watch the circle ci build we can identify the flaky tests to focus on by using the cypress flak report and circle ci s report cypress version n a other no response
| 1
|
2,014
| 4,837,107,766
|
IssuesEvent
|
2016-11-08 21:36:05
|
cliffparnitzky/FormDependentMandatoryField
|
https://api.github.com/repos/cliffparnitzky/FormDependentMandatoryField
|
closed
|
Only allow usage at fields which are submittable
|
Improvement ⚙ - Processed
|
Currently it is also possible to activate dependent mandatory field for a send button, explanation or headline ... this should not be possible
|
1.0
|
Only allow usage at fields which are submittable - Currently it is also possible to activate dependent mandatory field for a send button, explanation or headline ... this should not be possible
|
process
|
only allow usage at fields which are submittable currently it is also possible to activate dependent mandatory field for a send button explanation or headline this should not be possible
| 1
|
1,472
| 4,050,254,770
|
IssuesEvent
|
2016-05-23 17:35:14
|
brucemiller/LaTeXML
|
https://api.github.com/repos/brucemiller/LaTeXML
|
closed
|
Option To Suppress Section Links With '--splitat' Switch
|
enhancement postprocessing
|
Thank you for this great project.
When using the --splitat switch while making an ePub, the epub3 XSL will indirectly call (I believe) Split.pm. The result is automatically created section links (if --splitat=section) to the adjacent sections. 'latexmlc' does through LaTeXML-epub3.xsl, i.e.:
<!-- Include all LaTeXML to xhtml modules -->
<xsl:import href="LaTeXML-all-xhtml.xsl"/>
Somewhere in there is the call to make the section links--links that are not always wanted. Perhaps a '--no-split-links' switch would be ideal. Alternatively, where would I comment the code base to safely remove the automatically created section links?
Best,
Cooper Stevenson
|
1.0
|
Option To Suppress Section Links With '--splitat' Switch - Thank you for this great project.
When using the --splitat switch while making an ePub, the epub3 XSL will indirectly call (I believe) Split.pm. The result is automatically created section links (if --splitat=section) to the adjacent sections. 'latexmlc' does through LaTeXML-epub3.xsl, i.e.:
<!-- Include all LaTeXML to xhtml modules -->
<xsl:import href="LaTeXML-all-xhtml.xsl"/>
Somewhere in there is the call to make the section links--links that are not always wanted. Perhaps a '--no-split-links' switch would be ideal. Alternatively, where would I comment the code base to safely remove the automatically created section links?
Best,
Cooper Stevenson
|
process
|
option to suppress section links with splitat switch thank you for this great project when using the splitat switch while making an epub the xsl will indirectly call i believe split pm the result is automatically created section links if splitat section to the adjacent sections latexmlc does through latexml xsl i e somewhere in there is the call to make the section links links that are not always wanted perhaps a no split links switch would be ideal alternatively where would i comment the code base to safely remove the automatically created section links best cooper stevenson
| 1
|
66,285
| 3,252,008,265
|
IssuesEvent
|
2015-10-19 13:04:36
|
Quaggles/Icarus
|
https://api.github.com/repos/Quaggles/Icarus
|
closed
|
Can't retreat from character screen to main menu after pressing enter to grab a character
|
bug Medium Priority programming
|
need to press b then lots of a's
|
1.0
|
Can't retreat from character screen to main menu after pressing enter to grab a character - need to press b then lots of a's
|
non_process
|
can t retreat from character screen to main menu after pressing enter to grab a character need to press b then lots of a s
| 0
|
19,411
| 25,556,407,336
|
IssuesEvent
|
2022-11-30 07:08:40
|
lizhihao6/get-daily-arxiv-noti
|
https://api.github.com/repos/lizhihao6/get-daily-arxiv-noti
|
opened
|
New submissions for Wed, 30 Nov 22
|
event camera white balance image signal processing image signal process raw image events camera color contrast AWBISP compressionRAW
|
## Keyword: event camera
There is no result
## Keyword: events camera
There is no result
## Keyword: white balance
There is no result
## Keyword: color contrast
There is no result
## Keyword: AWBISP
There is no result
## Keyword: image signal processing
There is no result
## Keyword: image signal process
There is no result
## Keyword: compressionRAW
There is no result
## Keyword: raw image
### Learning Visual Planning Models from Partially Observed Images
- **Authors:** Kebing Jin, Zhanhao Xiao, Hankui Hankz Zhuo, Hai Wan, Jiaran Cai
- **Subjects:** Machine Learning (cs.LG); Artificial Intelligence (cs.AI); Computer Vision and Pattern Recognition (cs.CV)
- **Arxiv link:** https://xxx.itp.ac.cn/abs/2211.15666
- **Pdf link:** https://xxx.itp.ac.cn/pdf/2211.15666
- **Abstract**
There has been increasing attention on planning model learning in classical planning. Most existing approaches, however, focus on learning planning models from structured data in symbolic representations. It is often difficult to obtain such structured data in real-world scenarios. Although a number of approaches have been developed for learning planning models from fully observed unstructured data (e.g., images), in many scenarios raw observations are often incomplete. In this paper, we provide a novel framework, \aType{Recplan}, for learning a transition model from partially observed raw image traces. More specifically, by considering the preceding and subsequent images in a trace, we learn the latent state representations of raw observations and then build a transition model based on such representations. Additionally, we propose a neural-network-based approach to learn a heuristic model that estimates the distance toward a given goal observation. Based on the learned transition model and heuristic model, we implement a classical planner for images. We exhibit empirically that our approach is more effective than a state-of-the-art approach of learning visual planning models in the environment with incomplete observations.
|
2.0
|
New submissions for Wed, 30 Nov 22 - ## Keyword: event camera
There is no result
## Keyword: events camera
There is no result
## Keyword: white balance
There is no result
## Keyword: color contrast
There is no result
## Keyword: AWBISP
There is no result
## Keyword: image signal processing
There is no result
## Keyword: image signal process
There is no result
## Keyword: compressionRAW
There is no result
## Keyword: raw image
### Learning Visual Planning Models from Partially Observed Images
- **Authors:** Kebing Jin, Zhanhao Xiao, Hankui Hankz Zhuo, Hai Wan, Jiaran Cai
- **Subjects:** Machine Learning (cs.LG); Artificial Intelligence (cs.AI); Computer Vision and Pattern Recognition (cs.CV)
- **Arxiv link:** https://xxx.itp.ac.cn/abs/2211.15666
- **Pdf link:** https://xxx.itp.ac.cn/pdf/2211.15666
- **Abstract**
There has been increasing attention on planning model learning in classical planning. Most existing approaches, however, focus on learning planning models from structured data in symbolic representations. It is often difficult to obtain such structured data in real-world scenarios. Although a number of approaches have been developed for learning planning models from fully observed unstructured data (e.g., images), in many scenarios raw observations are often incomplete. In this paper, we provide a novel framework, \aType{Recplan}, for learning a transition model from partially observed raw image traces. More specifically, by considering the preceding and subsequent images in a trace, we learn the latent state representations of raw observations and then build a transition model based on such representations. Additionally, we propose a neural-network-based approach to learn a heuristic model that estimates the distance toward a given goal observation. Based on the learned transition model and heuristic model, we implement a classical planner for images. We exhibit empirically that our approach is more effective than a state-of-the-art approach of learning visual planning models in the environment with incomplete observations.
|
process
|
new submissions for wed nov keyword event camera there is no result keyword events camera there is no result keyword white balance there is no result keyword color contrast there is no result keyword awbisp there is no result keyword image signal processing there is no result keyword image signal process there is no result keyword compressionraw there is no result keyword raw image learning visual planning models from partially observed images authors kebing jin zhanhao xiao hankui hankz zhuo hai wan jiaran cai subjects machine learning cs lg artificial intelligence cs ai computer vision and pattern recognition cs cv arxiv link pdf link abstract there has been increasing attention on planning model learning in classical planning most existing approaches however focus on learning planning models from structured data in symbolic representations it is often difficult to obtain such structured data in real world scenarios although a number of approaches have been developed for learning planning models from fully observed unstructured data e g images in many scenarios raw observations are often incomplete in this paper we provide a novel framework atype recplan for learning a transition model from partially observed raw image traces more specifically by considering the preceding and subsequent images in a trace we learn the latent state representations of raw observations and then build a transition model based on such representations additionally we propose a neural network based approach to learn a heuristic model that estimates the distance toward a given goal observation based on the learned transition model and heuristic model we implement a classical planner for images we exhibit empirically that our approach is more effective than a state of the art approach of learning visual planning models in the environment with incomplete observations
| 1
|
337
| 2,792,600,671
|
IssuesEvent
|
2015-05-11 02:57:07
|
openconnectome/m2g
|
https://api.github.com/repos/openconnectome/m2g
|
closed
|
Provide desikan atlas with regions 0-70
|
data processing
|
Lets abstract away that funny conversion we have to do to make it 0-70 so gengraph is less obscure for small graphs.
|
1.0
|
Provide desikan atlas with regions 0-70 - Lets abstract away that funny conversion we have to do to make it 0-70 so gengraph is less obscure for small graphs.
|
process
|
provide desikan atlas with regions lets abstract away that funny conversion we have to do to make it so gengraph is less obscure for small graphs
| 1
|
173,232
| 14,405,734,903
|
IssuesEvent
|
2020-12-03 19:09:12
|
Gigas002/GTiff2Tiles
|
https://api.github.com/repos/Gigas002/GTiff2Tiles
|
closed
|
Update README and CHANGELOG
|
documentation grabbed
|
Update these docs before releasing 2.0.0:
- [x] Update README
- [x] Update CHANGELOG
|
1.0
|
Update README and CHANGELOG - Update these docs before releasing 2.0.0:
- [x] Update README
- [x] Update CHANGELOG
|
non_process
|
update readme and changelog update these docs before releasing update readme update changelog
| 0
|
13,413
| 15,879,202,344
|
IssuesEvent
|
2021-04-09 12:08:06
|
geneontology/go-ontology
|
https://api.github.com/repos/geneontology/go-ontology
|
closed
|
missing parent effector-mediated suppression of host innate immune response by symbiont
|
multi-species process parent relationship query
|
GO:0140403 | effector-mediated suppression of host innate immune response by symbiont
is not a descendant of
GO:0140590 effector-mediated suppression of host defenses by symbiont
Isn't that always true?
|
1.0
|
missing parent effector-mediated suppression of host innate immune response by symbiont -
GO:0140403 | effector-mediated suppression of host innate immune response by symbiont
is not a descendant of
GO:0140590 effector-mediated suppression of host defenses by symbiont
Isn't that always true?
|
process
|
missing parent effector mediated suppression of host innate immune response by symbiont go effector mediated suppression of host innate immune response by symbiont is not a descendant of go effector mediated suppression of host defenses by symbiont isn t that always true
| 1
|
447
| 2,874,869,017
|
IssuesEvent
|
2015-06-09 02:22:48
|
besasm/EMGAATS
|
https://api.github.com/repos/besasm/EMGAATS
|
opened
|
Deploy SWMM5
|
bug process question
|
Figure out which weir/orifice setting is creating overly high nodes.
Determine effect of inflow controls on simulation class.
i.e. Do we create weirs or orifices that are not in the network? or are we only creating simulation links and simulation nodes
Add default outfall creation. (Will this be overridden by Boundary Condition setting)
Test the storm class on deploy.
|
1.0
|
Deploy SWMM5 - Figure out which weir/orifice setting is creating overly high nodes.
Determine effect of inflow controls on simulation class.
i.e. Do we create weirs or orifices that are not in the network? or are we only creating simulation links and simulation nodes
Add default outfall creation. (Will this be overridden by Boundary Condition setting)
Test the storm class on deploy.
|
process
|
deploy figure out which weir orifice setting is creating overly high nodes determine effect of inflow controls on simulation class i e do we create weirs or orifices that are not in the network or are we only creating simulation links and simulation nodes add default outfall creation will this be overridden by boundary condition setting test the storm class on deploy
| 1
|
14,302
| 17,289,672,764
|
IssuesEvent
|
2021-07-24 13:15:04
|
qgis/QGIS
|
https://api.github.com/repos/qgis/QGIS
|
closed
|
Profiles from Lines SAGA missing tail
|
Bug Feedback Processing
|
**Describe the bug**
When using the SAGA tool "Profiles from Lines" it is not guaranteed that the last point of the profil lies at the end of the examined line (but it is possible that these coincide). If that is the case, the reported lengths are obviously wrong. Because it is a SAGA issue, this could be the starting point to create a QGIS inherent tool for obtaining the surface length by a DEM.
**How to Reproduce**
1. Create a new line shape layer (I used CRS: 25832 ETRS89/UTM32N) where you have a DEM available.
2. Use the SAGA tool "Profils from Lines" to obtain the surface length along that line.
3. Check whether the last point of the profil is exactly at the end of the line.



**QGIS and OS versions**
<!--StartFragment--><!DOCTYPE HTML PUBLIC "-//W3C//DTD HTML 4.0//EN" "http://www.w3.org/TR/REC-html40/strict.dtd">
<html><head><meta http-equiv="Content-Type" content="text/html; charset=utf-8" /><style type="text/css">
p, li { white-space: pre-wrap; }
</style></head><body>
QGIS version | 3.20.1-Odense | QGIS code revision | 1c3c5cd6
-- | -- | -- | --
Qt version | 5.15.2
Python version | 3.9.5
GDAL/OGR version | 3.3.1
PROJ version | 8.1.0
EPSG Registry database version | v10.018 (2021-04-02)
GEOS version | 3.9.1-CAPI-1.14.2
SQLite version | 3.35.2
PDAL version | 2.3.0
PostgreSQL client version | 13.0
SpatiaLite version | 5.0.1
QWT version | 6.1.3
QScintilla2 version | 2.11.5
OS version | Windows 10 Version 2009
| | |
Active Python plugins | db_managerMetaSearchprocessing
</body></html><!--EndFragment-->
**Additional context**
I am using a windows machine with a new profile.
This behaviour occurred in the version 3.16.8, 3.18.3 and 3.20.1.
The archive contains a dem and the origin line which was used to go through the steps described above.
[demand2dline.zip](https://github.com/qgis/QGIS/files/6872082/demand2dline.zip)
|
1.0
|
Profiles from Lines SAGA missing tail - **Describe the bug**
When using the SAGA tool "Profiles from Lines" it is not guaranteed that the last point of the profil lies at the end of the examined line (but it is possible that these coincide). If that is the case, the reported lengths are obviously wrong. Because it is a SAGA issue, this could be the starting point to create a QGIS inherent tool for obtaining the surface length by a DEM.
**How to Reproduce**
1. Create a new line shape layer (I used CRS: 25832 ETRS89/UTM32N) where you have a DEM available.
2. Use the SAGA tool "Profils from Lines" to obtain the surface length along that line.
3. Check whether the last point of the profil is exactly at the end of the line.



**QGIS and OS versions**
<!--StartFragment--><!DOCTYPE HTML PUBLIC "-//W3C//DTD HTML 4.0//EN" "http://www.w3.org/TR/REC-html40/strict.dtd">
<html><head><meta http-equiv="Content-Type" content="text/html; charset=utf-8" /><style type="text/css">
p, li { white-space: pre-wrap; }
</style></head><body>
QGIS version | 3.20.1-Odense | QGIS code revision | 1c3c5cd6
-- | -- | -- | --
Qt version | 5.15.2
Python version | 3.9.5
GDAL/OGR version | 3.3.1
PROJ version | 8.1.0
EPSG Registry database version | v10.018 (2021-04-02)
GEOS version | 3.9.1-CAPI-1.14.2
SQLite version | 3.35.2
PDAL version | 2.3.0
PostgreSQL client version | 13.0
SpatiaLite version | 5.0.1
QWT version | 6.1.3
QScintilla2 version | 2.11.5
OS version | Windows 10 Version 2009
| | |
Active Python plugins | db_managerMetaSearchprocessing
</body></html><!--EndFragment-->
**Additional context**
I am using a windows machine with a new profile.
This behaviour occurred in the version 3.16.8, 3.18.3 and 3.20.1.
The archive contains a dem and the origin line which was used to go through the steps described above.
[demand2dline.zip](https://github.com/qgis/QGIS/files/6872082/demand2dline.zip)
|
process
|
profiles from lines saga missing tail describe the bug when using the saga tool profiles from lines it is not guaranteed that the last point of the profil lies at the end of the examined line but it is possible that these coincide if that is the case the reported lengths are obviously wrong because it is a saga issue this could be the starting point to create a qgis inherent tool for obtaining the surface length by a dem how to reproduce create a new line shape layer i used crs where you have a dem available use the saga tool profils from lines to obtain the surface length along that line check whether the last point of the profil is exactly at the end of the line qgis and os versions doctype html public dtd html en p li white space pre wrap qgis version odense qgis code revision qt version python version gdal ogr version proj version epsg registry database version geos version capi sqlite version pdal version postgresql client version spatialite version qwt version version os version windows version active python plugins db managermetasearchprocessing additional context i am using a windows machine with a new profile this behaviour occurred in the version and the archive contains a dem and the origin line which was used to go through the steps described above
| 1
|
3,435
| 6,534,034,194
|
IssuesEvent
|
2017-08-31 09:07:32
|
gaocegege/Processing.R
|
https://api.github.com/repos/gaocegege/Processing.R
|
closed
|
Welcome to have a try on Processing.R
|
community/processing
|
Processing.R has released 5 versions. You can get the latest mode from [Release Page](https://github.com/gaocegege/Processing.R/releases).
After you download the mode, place it into Processing "modes" directory:
- macOS: `${HOME}/Documents/Processing/modes`
- Linux: `${HOME}/sketchbook/modes`
- Windows: `C:\Users\<user>\Documents\Processing\modes`
Now the features in Processing.R include:
### Built-in functions in Processing
The documentation website is [https://processing-r.github.io/Processing.R-docs/](https://processing-r.github.io/Processing.R-docs/). This documentation is currently incomplete. Most Processing functions are theoretically supported in Processing.R, but many functions have not been tested and some pages have not been edited yet to reflect differences from other Processing modes.
### Libraries in Processing: `importLibrary()`
Processing.R supports importing standard Processing(Java) libraries that enrich the functionality of Processing. The function `importLibrary()` imports new libraries manually. This has been tested with one library: [peasycam](http://mrfeinberg.com/peasycam/), the "dead-simple mouse-driven camera for Processing."
Before trying the example code below, first install the corresponding library `peasycam` -- for example using the PDE Contribution Manager > Library.
```r
settings <- function() {
importLibrary("peasycam")
size(200, 200, P3D)
}
setup <- function() {
cam = PeasyCam$new(processing, 100)
cam$setMinimumDistance(50)
cam$setMaximumDistance(500)
}
draw <- function() {
rotateX(-.5)
rotateY(-.5)
background(0)
fill(255, 0, 0)
box(30)
pushMatrix()
translate(0, 0, 20)
fill(0, 0, 255)
box(5)
popMatrix()
}
```
### R Packages: `library()`
Processing.R has limited support for R packages. It will automatically download R packages that are requested using the `library()` function, so you can use packages directly.
Here is an example using the `foreach` package:
```r
library(foreach)
foreach(i=1:3) %do%
print(sqrt(i))
```
In practice we have only found a few R packages so far that work with Processing.R "out of the box." This is because the package must be pure R **and** all of its dependencies must also be pure R. There is [renjin list of R packages](http://packages.renjin.org/) which lists their compatibility with the renjin JVM. Any package fully supported in renjin is theoretically supported in Processing.R.
## Limitations in Processing.R
Processing.R is in active development as an experimental pre-release version.
**Static sketches:** Processing.R does not have a good support for detecting static/active/mix mode. We recommend that all sketches be written in full active mode, defining a separate `settings`, `setup` and `draw`. Even simple sketches should be wrapped in `draw()`. For example, do not write:
```R
line(0, 10, 90, 100)
```
That may cause bugs. Instead, write:
```R
draw <- function() {
line(0, 10, 90, 100)
}
```
Please try our experimental mode and give us your feedback :) If you want to contribute to this mode, there are [issues for new contributors](https://github.com/gaocegege/Processing.R/issues?q=is%3Aissue+is%3Aopen+label%3Afor-new-contributors) and the [architecture documentation](https://github.com/gaocegege/Processing.R/blob/master/raw-docs/architecture.md).
If you have any problem about the mode, come chat at the [Processing.R gitter channel](https://gitter.im/gaocegege/Processing.R) :tada:
|
1.0
|
Welcome to have a try on Processing.R - Processing.R has released 5 versions. You can get the latest mode from [Release Page](https://github.com/gaocegege/Processing.R/releases).
After you download the mode, place it into Processing "modes" directory:
- macOS: `${HOME}/Documents/Processing/modes`
- Linux: `${HOME}/sketchbook/modes`
- Windows: `C:\Users\<user>\Documents\Processing\modes`
Now the features in Processing.R include:
### Built-in functions in Processing
The documentation website is [https://processing-r.github.io/Processing.R-docs/](https://processing-r.github.io/Processing.R-docs/). This documentation is currently incomplete. Most Processing functions are theoretically supported in Processing.R, but many functions have not been tested and some pages have not been edited yet to reflect differences from other Processing modes.
### Libraries in Processing: `importLibrary()`
Processing.R supports importing standard Processing(Java) libraries that enrich the functionality of Processing. The function `importLibrary()` imports new libraries manually. This has been tested with one library: [peasycam](http://mrfeinberg.com/peasycam/), the "dead-simple mouse-driven camera for Processing."
Before trying the example code below, first install the corresponding library `peasycam` -- for example using the PDE Contribution Manager > Library.
```r
settings <- function() {
importLibrary("peasycam")
size(200, 200, P3D)
}
setup <- function() {
cam = PeasyCam$new(processing, 100)
cam$setMinimumDistance(50)
cam$setMaximumDistance(500)
}
draw <- function() {
rotateX(-.5)
rotateY(-.5)
background(0)
fill(255, 0, 0)
box(30)
pushMatrix()
translate(0, 0, 20)
fill(0, 0, 255)
box(5)
popMatrix()
}
```
### R Packages: `library()`
Processing.R has limited support for R packages. It will automatically download R packages that are requested using the `library()` function, so you can use packages directly.
Here is an example using the `foreach` package:
```r
library(foreach)
foreach(i=1:3) %do%
print(sqrt(i))
```
In practice we have only found a few R packages so far that work with Processing.R "out of the box." This is because the package must be pure R **and** all of its dependencies must also be pure R. There is [renjin list of R packages](http://packages.renjin.org/) which lists their compatibility with the renjin JVM. Any package fully supported in renjin is theoretically supported in Processing.R.
## Limitations in Processing.R
Processing.R is in active development as an experimental pre-release version.
**Static sketches:** Processing.R does not have a good support for detecting static/active/mix mode. We recommend that all sketches be written in full active mode, defining a separate `settings`, `setup` and `draw`. Even simple sketches should be wrapped in `draw()`. For example, do not write:
```R
line(0, 10, 90, 100)
```
That may cause bugs. Instead, write:
```R
draw <- function() {
line(0, 10, 90, 100)
}
```
Please try our experimental mode and give us your feedback :) If you want to contribute to this mode, there are [issues for new contributors](https://github.com/gaocegege/Processing.R/issues?q=is%3Aissue+is%3Aopen+label%3Afor-new-contributors) and the [architecture documentation](https://github.com/gaocegege/Processing.R/blob/master/raw-docs/architecture.md).
If you have any problem about the mode, come chat at the [Processing.R gitter channel](https://gitter.im/gaocegege/Processing.R) :tada:
|
process
|
welcome to have a try on processing r processing r has released versions you can get the latest mode from after you download the mode place it into processing modes directory macos home documents processing modes linux home sketchbook modes windows c users documents processing modes now the features in processing r include built in functions in processing the documentation website is this documentation is currently incomplete most processing functions are theoretically supported in processing r but many functions have not been tested and some pages have not been edited yet to reflect differences from other processing modes libraries in processing importlibrary processing r supports importing standard processing java libraries that enrich the functionality of processing the function importlibrary imports new libraries manually this has been tested with one library the dead simple mouse driven camera for processing before trying the example code below first install the corresponding library peasycam for example using the pde contribution manager library r settings function importlibrary peasycam size setup function cam peasycam new processing cam setminimumdistance cam setmaximumdistance draw function rotatex rotatey background fill box pushmatrix translate fill box popmatrix r packages library processing r has limited support for r packages it will automatically download r packages that are requested using the library function so you can use packages directly here is an example using the foreach package r library foreach foreach i do print sqrt i in practice we have only found a few r packages so far that work with processing r out of the box this is because the package must be pure r and all of its dependencies must also be pure r there is which lists their compatibility with the renjin jvm any package fully supported in renjin is theoretically supported in processing r limitations in processing r processing r is in active development as an experimental pre release version static sketches processing r does not have a good support for detecting static active mix mode we recommend that all sketches be written in full active mode defining a separate settings setup and draw even simple sketches should be wrapped in draw for example do not write r line that may cause bugs instead write r draw function line please try our experimental mode and give us your feedback if you want to contribute to this mode there are and the if you have any problem about the mode come chat at the tada
| 1
|
83,983
| 10,455,588,791
|
IssuesEvent
|
2019-09-19 21:44:52
|
envoyproxy/envoy
|
https://api.github.com/repos/envoyproxy/envoy
|
opened
|
Better control on xDS deduplication based on protobuf
|
design proposal help wanted
|
From discussion in https://github.com/envoyproxy/envoy/pull/8231#discussion_r324191998
Key take aways:
- Prefer using version_info instead comparing new xDS delivered config with existing one, this also solves https://github.com/envoyproxy/envoy/issues/7676
- Add a UUID in ConfigSource for deduplication, instead of hashing `ConfigSource` proto to automatically dedup. This means RDS/SDS ConfigSource sent from LDS should be deduplicated at control plane.
|
1.0
|
Better control on xDS deduplication based on protobuf - From discussion in https://github.com/envoyproxy/envoy/pull/8231#discussion_r324191998
Key take aways:
- Prefer using version_info instead comparing new xDS delivered config with existing one, this also solves https://github.com/envoyproxy/envoy/issues/7676
- Add a UUID in ConfigSource for deduplication, instead of hashing `ConfigSource` proto to automatically dedup. This means RDS/SDS ConfigSource sent from LDS should be deduplicated at control plane.
|
non_process
|
better control on xds deduplication based on protobuf from discussion in key take aways prefer using version info instead comparing new xds delivered config with existing one this also solves add a uuid in configsource for deduplication instead of hashing configsource proto to automatically dedup this means rds sds configsource sent from lds should be deduplicated at control plane
| 0
|
125,265
| 4,954,973,369
|
IssuesEvent
|
2016-12-01 19:08:04
|
poldracklab/niworkflows
|
https://api.github.com/repos/poldracklab/niworkflows
|
opened
|
print more docker output in circle
|
priority:low
|
Currently it seems like a log level is set at WARNING.
If we print more we can reduce the timeout length back to the default, and debugging might be easier.
|
1.0
|
print more docker output in circle - Currently it seems like a log level is set at WARNING.
If we print more we can reduce the timeout length back to the default, and debugging might be easier.
|
non_process
|
print more docker output in circle currently it seems like a log level is set at warning if we print more we can reduce the timeout length back to the default and debugging might be easier
| 0
|
9,253
| 11,243,995,429
|
IssuesEvent
|
2020-01-10 05:31:34
|
ValveSoftware/Proton
|
https://api.github.com/repos/ValveSoftware/Proton
|
closed
|
FlatOut 4: Total Insanity (402130)
|
Game compatibility - Unofficial
|
# Compatibility Report
- Name of the game with compatibility issues: FlatOut 4: Total Insanity
- Steam AppID of the game: 402130
## System Information
- GPU: GeForce 930MX
- Driver/LLVM version: nvidia 440.44
- Kernel version: 5.0.0-37-generic 40~18.04.1-Ubuntu SMP Thu Nov 14 12:06:39 UTC 2019 x86_64 x86_64 x86_64 GNU/Linux
- Link to full system information report as [Gist](https://gist.github.com/r3d9u11/a08548e24f4cf17b8878868e4b90f0a1#file-flatout4_20200109)
- Proton version: 4.11-11
## I confirm:
- [x] that I haven't found an existing compatibility report for this game.
- [x] that I have checked whether there are updates for my system available.
<!-- Please add `PROTON_LOG=1 %command%` to the game's launch options and drag
and drop the generated `$HOME/steam-$APPID.log` into this issue report -->
[steam-402130.log](https://github.com/ValveSoftware/Proton/files/4042446/steam-402130.log)
## Symptoms <!-- What's the problem? -->
Can't launch the game
## Reproduction
Just install an run
<!--
1. You can find the Steam AppID in the URL of the shop page of the game.
e.g. for `The Witcher 3: Wild Hunt` the AppID is `292030`.
2. You can find your driver and Linux version, as well as your graphics
processor's name in the system information report of Steam.
3. You can retrieve a full system information report by clicking
`Help` > `System Information` in the Steam client on your machine.
4. Please copy it to your clipboard by pressing `Ctrl+A` and then `Ctrl+C`.
Then paste it in a [Gist](https://gist.github.com/) and post the link in
this issue.
5. Please search for open issues and pull requests by the name of the game and
find out whether they are relevant and should be referenced above.
-->
|
True
|
FlatOut 4: Total Insanity (402130) - # Compatibility Report
- Name of the game with compatibility issues: FlatOut 4: Total Insanity
- Steam AppID of the game: 402130
## System Information
- GPU: GeForce 930MX
- Driver/LLVM version: nvidia 440.44
- Kernel version: 5.0.0-37-generic 40~18.04.1-Ubuntu SMP Thu Nov 14 12:06:39 UTC 2019 x86_64 x86_64 x86_64 GNU/Linux
- Link to full system information report as [Gist](https://gist.github.com/r3d9u11/a08548e24f4cf17b8878868e4b90f0a1#file-flatout4_20200109)
- Proton version: 4.11-11
## I confirm:
- [x] that I haven't found an existing compatibility report for this game.
- [x] that I have checked whether there are updates for my system available.
<!-- Please add `PROTON_LOG=1 %command%` to the game's launch options and drag
and drop the generated `$HOME/steam-$APPID.log` into this issue report -->
[steam-402130.log](https://github.com/ValveSoftware/Proton/files/4042446/steam-402130.log)
## Symptoms <!-- What's the problem? -->
Can't launch the game
## Reproduction
Just install an run
<!--
1. You can find the Steam AppID in the URL of the shop page of the game.
e.g. for `The Witcher 3: Wild Hunt` the AppID is `292030`.
2. You can find your driver and Linux version, as well as your graphics
processor's name in the system information report of Steam.
3. You can retrieve a full system information report by clicking
`Help` > `System Information` in the Steam client on your machine.
4. Please copy it to your clipboard by pressing `Ctrl+A` and then `Ctrl+C`.
Then paste it in a [Gist](https://gist.github.com/) and post the link in
this issue.
5. Please search for open issues and pull requests by the name of the game and
find out whether they are relevant and should be referenced above.
-->
|
non_process
|
flatout total insanity compatibility report name of the game with compatibility issues flatout total insanity steam appid of the game system information gpu geforce driver llvm version nvidia kernel version generic ubuntu smp thu nov utc gnu linux link to full system information report as proton version i confirm that i haven t found an existing compatibility report for this game that i have checked whether there are updates for my system available please add proton log command to the game s launch options and drag and drop the generated home steam appid log into this issue report symptoms can t launch the game reproduction just install an run you can find the steam appid in the url of the shop page of the game e g for the witcher wild hunt the appid is you can find your driver and linux version as well as your graphics processor s name in the system information report of steam you can retrieve a full system information report by clicking help system information in the steam client on your machine please copy it to your clipboard by pressing ctrl a and then ctrl c then paste it in a and post the link in this issue please search for open issues and pull requests by the name of the game and find out whether they are relevant and should be referenced above
| 0
|
51,521
| 21,704,801,856
|
IssuesEvent
|
2022-05-10 08:39:42
|
Azure/azure-powershell
|
https://api.github.com/repos/Azure/azure-powershell
|
closed
|
Update-AzADDomainService : The certificate or password provided for LDAPs is invalid
|
bug customer-reported AAD Domain Service
|
### Description
Hi,
I'm trying to use Preview of module Az.ADDomainServices in actual version 0.1.0
I just try to update LDAPS certificate with a simple command like ->
```ps
$adds_rg_name = "RG-TEST"
$adds_domain = "contoso.com"
$ldaps_pfx_path = "C:\TEMP\certificate.pfx"
$ldaps_pfx_pass = "MyStrongPassword" | ConvertTo-SecureString -Force -AsPlainText
$adds_rg = Get-AzResourceGroup -Name $adds_rg_name
$adds = Get-AzADDomainService -Name $adds_domain -ResourceGroupName $adds_rg.ResourceGroupName
$ldaps_update = Update-AzADDomainService -InputObject $adds -LdapSettingLdaps $true -LdapSettingPfxCertificate $ldaps_pfx_path -LdapSettingPfxCertificatePassword $ldaps_pfx_pass
```
But I've this error :
`
Update-AzADDomainService_UpdateViaIdentityExpanded: script.ps1:283:13
Line |
283 | Az.ADDomainServices.internal\Update-AzADDomainService @PS …
| ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
| The certificate or password provided for LDAPs is invalid.
`
If I use the same file and password directly from GUI, all is working as expected !
Thanks
Regards
Alexandre
### Issue script & Debug output
```PowerShell
HTTP Method:
PATCH
Absolute Uri:
https://management.azure.com/subscriptions/xxxx/resourceGroups/RES-INFRA/providers/Microsoft.AAD/domainServices/contoso.com?api-version=2020-01-01
Headers:
x-ms-unique-id : 14
x-ms-client-request-id : 3713ef7f-e743-49ec-9713-9cd67e295827
CommandName : Az.ADDomainServices.internal\Update-AzADDomainService
FullCommandName : Update-AzADDomainService_UpdateViaIdentityExpanded
ParameterSetName : __AllParameterSets
User-Agent : AzurePowershell/v7.1.0,PSVersion/v7.2.3,Az.ADDomainServices/0.1.0
Body:
{
"properties": {
"ldapsSettings": {
"ldaps": "True",
"pfxCertificate": "C:\\TEMP\\certificate.pfx",
"pfxCertificatePassword": "MyStrongPassword"
}
}
}
DEBUG: BeforeCall:
DEBUG: ============================ HTTP RESPONSE ============================
Status Code:
Accepted
Headers:
Cache-Control : no-cache
Pragma : no-cache
ETag : W/"datetime'2022-05-09T09%3A36%3A26.1385566Z'"
Location : https://management.azure.com/subscriptions/xxxx/providers/Microsoft.AAD/locations/westeurope/operationResults/xxxx?api-version=2020-01-01&operationResultResponseType=Location
x-ms-request-id : 2fe95953-3d8d-4133-b3b9-c4692ba8bafe
Azure-AsyncOperation : https://management.azure.com/subscriptions/xxxx/providers/Microsoft.AAD/locations/westeurope/operationResults/xxxx?api-version=2020-01-01
x-ms-ratelimit-remaining-subscription-writes: 1199
x-ms-correlation-request-id : 8b94e009-aec9-4c67-92df-67aa4e4c6bb3
x-ms-routing-request-id : FRANCECENTRAL:20220509T093633Z:8b94e009-aec9-4c67-92df-67aa4e4c6bb3
Strict-Transport-Security : max-age=31536000; includeSubDomains
X-Content-Type-Options : nosniff
Date : Mon, 09 May 2022 09:36:33 GMT
Body:
{
"id": "/subscriptions/xxxx/resourceGroups/xxxx/providers/Microsoft.AAD/domainServices/contoso.com",
"name": "contoso.com",
"type": "Microsoft.AAD/domainServices",
"etag": "W/\"datetime'2022-05-09T09%3A36%3A26.1385566Z'\"",
"location": "westeurope",
"properties": {
"version": 2,
"tenantId": "xxxx",
"domainName": "contoso.com",
"deploymentId": "015ad100-5930-47ca-aeec-3648ec70e68f",
"syncOwner": "015ad100-5930-47ca-aeec-3648ec70e68f",
"replicaSets": [
{
"replicaSetId": "015ad100-5930-47ca-aeec-3648ec70e68f",
"location": "West Europe",
"subnetId": "/subscriptions/xxxx/resourceGroups/xxxx/providers/Microsoft.Network/virtualNetworks/xxxx/subnets/xxxx",
"domainControllerIpAddress": [
"x.x.x.x"
],
"externalAccessIpAddress": "",
"serviceStatus": "Running"
}
],
"ldapsSettings": {
"ldaps": "Enabled",
"publicCertificate": "xxxx",
"certificateThumbprint": "xxxx",
"certificateNotAfter": "2022-07-30T11:12:48Z",
"externalAccess": "Disabled"
},
"domainSecuritySettings": {
"ntlmV1": "Enabled",
"tlsV1": "Enabled",
"syncNtlmPasswords": "Enabled",
"syncKerberosPasswords": "Enabled",
"syncOnPremPasswords": "Enabled"
},
"filteredSync": "Enabled",
"resourceForestSettings": {
"settings": []
},
"notificationSettings": {
"notifyGlobalAdmins": "Disabled",
"notifyDcAdmins": "Disabled",
"additionalRecipients": [
"xxxx@contoso.com"
]
},
"sku": "Standard",
"provisioningState": "Accepted"
}
}
DEBUG: ResponseCreated:
DEBUG: DelayBeforePolling: Delaying 30 seconds before polling.
DEBUG: ============================ HTTP REQUEST ============================
HTTP Method:
GET
Absolute Uri:
https://management.azure.com/subscriptions/xxxx/providers/Microsoft.AAD/locations/westeurope/operationResults/xxxx?api-version=2020-01-01
Headers:
x-ms-unique-id : 15
x-ms-client-request-id : 3713ef7f-e743-49ec-9713-9cd67e295827
CommandName : Az.ADDomainServices.internal\Update-AzADDomainService
FullCommandName : Update-AzADDomainService_UpdateViaIdentityExpanded
ParameterSetName : __AllParameterSets
User-Agent : AzurePowershell/v7.1.0,PSVersion/v7.2.3,Az.ADDomainServices/0.1.0
Body:
DEBUG: ============================ HTTP RESPONSE ============================
Status Code:
OK
Headers:
Cache-Control : no-cache
Pragma : no-cache
x-ms-request-id : 2a4fd315-f986-4f22-8d07-3954451cd4eb
x-ms-ratelimit-remaining-subscription-reads: 11999
x-ms-correlation-request-id : e1b62e5a-91f1-489a-af21-1a92a78e71e8
x-ms-routing-request-id : FRANCECENTRAL:20220509T093704Z:e1b62e5a-91f1-489a-af21-1a92a78e71e8
Strict-Transport-Security : max-age=31536000; includeSubDomains
X-Content-Type-Options : nosniff
Date : Mon, 09 May 2022 09:37:03 GMT
Body:
{
"id": "/subscriptions/xxxx/providers/Microsoft.AAD/locations/westeurope/operationResults/xxxx",
"name": "xxxx",
"status": "Failed",
"startTime": "0001-01-01T08:00:00Z",
"endTime": "0001-01-01T08:00:00Z",
"percentComplete": 0.0,
"error": {
"code": "BadRequest",
"message": "The certificate or password provided for LDAPs is invalid."
}
}
DEBUG: Polling:
DEBUG: Finally:
Update-AzADDomainService_UpdateViaIdentityExpanded: C:\Users\xxxx\Documents\PowerShell\Modules\Az.ADDomainServices\0.1.0\custom\Update-AzADDomainService.ps1:283:13
Line |
283 | Az.ADDomainServices.internal\Update-AzADDomainService @PS …
| ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
| The certificate or password provided for LDAPs is invalid.
DEBUG: [CmdletProcessRecordAsyncEnd]: Finish HTTP process
DEBUG: CmdletProcessRecordAsyncEnd:
DEBUG: CmdletProcessRecordEnd:
DEBUG: AzureQoSEvent: Module: Az.ADDomainServices:0.1.0; CommandName: Update-AzADDomainService_UpdateViaIdentityExpanded; PSVersion: 7.2.3; IsSuccess: True; Duration: 00:00:00
DEBUG: Finish sending metric.
DEBUG: CmdletEndProcessing:
```
### Environment data
```PowerShell
Name Value
---- -----
PSVersion 7.2.3
PSEdition Core
GitCommitId 7.2.3
OS Microsoft Windows 10.0.22000
Platform Win32NT
PSCompatibleVersions {1.0, 2.0, 3.0, 4.0…}
PSRemotingProtocolVersion 2.3
SerializationVersion 1.1.0.1
WSManStackVersion 3.0
```
### Module versions
```PowerShell
ModuleType Version PreRelease Name ExportedCommands
---------- ------- ---------- ---- ----------------
Script 2.7.5 Az.Accounts {Add-AzEnvironment, Clear-AzContext, Clear-AzDefault, Connect-AzAccount…}
Script 0.1.0 Az.ADDomainServices {Get-AzADDomainService, New-AzADDomainService, New-AzADDomainServiceForestTrust, New-AzADDomainServiceReplicaSet…}
Script 4.2.0 Az.KeyVault {Add-AzKeyVaultCertificate, Add-AzKeyVaultCertificateContact, Add-AzKeyVaultKey, Add-AzKeyVaultManagedStorageAccount…}
Script 5.2.0 Az.Resources {Export-AzResourceGroup, Export-AzTemplateSpec, Get-AzDenyAssignment, Get-AzDeployment…}
```
### Error output
```PowerShell
DEBUG: 11:41:46 - ResolveError begin processing with ParameterSet 'AnyErrorParameterSet'.
DEBUG: 11:41:46 - using account id 'xxxx@contoso.com'...
WARNING: Upcoming breaking changes in the cmdlet 'Resolve-AzError' :
The `Resolve-Error` alias will be removed in a future release. Please change any scripts that use this alias to use `Resolve-AzError` instead.
Note : Go to https://aka.ms/azps-changewarnings for steps to suppress this breaking change warning, and other information on breaking changes in Azure PowerShell.
HistoryId: 57
Message : The certificate or password provided for LDAPs is invalid.
StackTrace : at Microsoft.Azure.PowerShell.Cmdlets.ADDomainServices.AdDomainServices.DomainServicesUpdate_Call(HttpRequestMessage request, Func`3 onOk, Func`3 onDefault, IEventListener eventListener, ISendAsync sender)
at Microsoft.Azure.PowerShell.Cmdlets.ADDomainServices.AdDomainServices.DomainServicesUpdate_Call(HttpRequestMessage request, Func`3 onOk, Func`3 onDefault, IEventListener eventListener, ISendAsync sender)
at Microsoft.Azure.PowerShell.Cmdlets.ADDomainServices.AdDomainServices.DomainServicesUpdateViaIdentity(String viaIdentity, IDomainService body, Func`3 onOk, Func`3 onDefault, IEventListener eventListener, ISendAsync
sender)
at Microsoft.Azure.PowerShell.Cmdlets.ADDomainServices.Cmdlets.UpdateAzADDomainService_UpdateViaIdentityExpanded.ProcessRecordAsync()
Exception : Microsoft.Azure.PowerShell.Cmdlets.ADDomainServices.Runtime.UndeclaredResponseException
InvocationInfo : {Update-AzADDomainService_UpdateViaIdentityExpanded}
Line : Az.ADDomainServices.internal\Update-AzADDomainService @PSBoundParameters
Position : At C:\Users\xxxx\Documents\PowerShell\Modules\Az.ADDomainServices\0.1.0\custom\Update-AzADDomainService.ps1:283 char:13
+ Az.ADDomainServices.internal\Update-AzADDomainService @PS …
+ ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
HistoryId : 57
HistoryId: 54
Message : The certificate or password provided for LDAPs is invalid.
StackTrace : at Microsoft.Azure.PowerShell.Cmdlets.ADDomainServices.AdDomainServices.DomainServicesUpdate_Call(HttpRequestMessage request, Func`3 onOk, Func`3 onDefault, IEventListener eventListener, ISendAsync sender)
at Microsoft.Azure.PowerShell.Cmdlets.ADDomainServices.AdDomainServices.DomainServicesUpdate_Call(HttpRequestMessage request, Func`3 onOk, Func`3 onDefault, IEventListener eventListener, ISendAsync sender)
at Microsoft.Azure.PowerShell.Cmdlets.ADDomainServices.AdDomainServices.DomainServicesUpdateViaIdentity(String viaIdentity, IDomainService body, Func`3 onOk, Func`3 onDefault, IEventListener eventListener, ISendAsync
sender)
at Microsoft.Azure.PowerShell.Cmdlets.ADDomainServices.Cmdlets.UpdateAzADDomainService_UpdateViaIdentityExpanded.ProcessRecordAsync()
Exception : Microsoft.Azure.PowerShell.Cmdlets.ADDomainServices.Runtime.UndeclaredResponseException
InvocationInfo : {Update-AzADDomainService_UpdateViaIdentityExpanded}
Line : Az.ADDomainServices.internal\Update-AzADDomainService @PSBoundParameters
Position : At C:\Users\xxxx\Documents\PowerShell\Modules\Az.ADDomainServices\0.1.0\custom\Update-AzADDomainService.ps1:283 char:13
+ Az.ADDomainServices.internal\Update-AzADDomainService @PS …
+ ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
HistoryId : 54
The Azure PowerShell team is listening, please let us know how we are doing: https://aka.ms/azpssurvey?Q_CHL=ERROR.
DEBUG: AzureQoSEvent: Module: Az.Accounts:2.7.5; CommandName: Resolve-AzError; PSVersion: 7.2.3; IsSuccess: True; Duration: 00:00:00.2753625
DEBUG: Finish sending metric.
DEBUG: 11:41:47 - ResolveError end processing.
```
|
1.0
|
Update-AzADDomainService : The certificate or password provided for LDAPs is invalid - ### Description
Hi,
I'm trying to use Preview of module Az.ADDomainServices in actual version 0.1.0
I just try to update LDAPS certificate with a simple command like ->
```ps
$adds_rg_name = "RG-TEST"
$adds_domain = "contoso.com"
$ldaps_pfx_path = "C:\TEMP\certificate.pfx"
$ldaps_pfx_pass = "MyStrongPassword" | ConvertTo-SecureString -Force -AsPlainText
$adds_rg = Get-AzResourceGroup -Name $adds_rg_name
$adds = Get-AzADDomainService -Name $adds_domain -ResourceGroupName $adds_rg.ResourceGroupName
$ldaps_update = Update-AzADDomainService -InputObject $adds -LdapSettingLdaps $true -LdapSettingPfxCertificate $ldaps_pfx_path -LdapSettingPfxCertificatePassword $ldaps_pfx_pass
```
But I've this error :
`
Update-AzADDomainService_UpdateViaIdentityExpanded: script.ps1:283:13
Line |
283 | Az.ADDomainServices.internal\Update-AzADDomainService @PS …
| ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
| The certificate or password provided for LDAPs is invalid.
`
If I use the same file and password directly from GUI, all is working as expected !
Thanks
Regards
Alexandre
### Issue script & Debug output
```PowerShell
HTTP Method:
PATCH
Absolute Uri:
https://management.azure.com/subscriptions/xxxx/resourceGroups/RES-INFRA/providers/Microsoft.AAD/domainServices/contoso.com?api-version=2020-01-01
Headers:
x-ms-unique-id : 14
x-ms-client-request-id : 3713ef7f-e743-49ec-9713-9cd67e295827
CommandName : Az.ADDomainServices.internal\Update-AzADDomainService
FullCommandName : Update-AzADDomainService_UpdateViaIdentityExpanded
ParameterSetName : __AllParameterSets
User-Agent : AzurePowershell/v7.1.0,PSVersion/v7.2.3,Az.ADDomainServices/0.1.0
Body:
{
"properties": {
"ldapsSettings": {
"ldaps": "True",
"pfxCertificate": "C:\\TEMP\\certificate.pfx",
"pfxCertificatePassword": "MyStrongPassword"
}
}
}
DEBUG: BeforeCall:
DEBUG: ============================ HTTP RESPONSE ============================
Status Code:
Accepted
Headers:
Cache-Control : no-cache
Pragma : no-cache
ETag : W/"datetime'2022-05-09T09%3A36%3A26.1385566Z'"
Location : https://management.azure.com/subscriptions/xxxx/providers/Microsoft.AAD/locations/westeurope/operationResults/xxxx?api-version=2020-01-01&operationResultResponseType=Location
x-ms-request-id : 2fe95953-3d8d-4133-b3b9-c4692ba8bafe
Azure-AsyncOperation : https://management.azure.com/subscriptions/xxxx/providers/Microsoft.AAD/locations/westeurope/operationResults/xxxx?api-version=2020-01-01
x-ms-ratelimit-remaining-subscription-writes: 1199
x-ms-correlation-request-id : 8b94e009-aec9-4c67-92df-67aa4e4c6bb3
x-ms-routing-request-id : FRANCECENTRAL:20220509T093633Z:8b94e009-aec9-4c67-92df-67aa4e4c6bb3
Strict-Transport-Security : max-age=31536000; includeSubDomains
X-Content-Type-Options : nosniff
Date : Mon, 09 May 2022 09:36:33 GMT
Body:
{
"id": "/subscriptions/xxxx/resourceGroups/xxxx/providers/Microsoft.AAD/domainServices/contoso.com",
"name": "contoso.com",
"type": "Microsoft.AAD/domainServices",
"etag": "W/\"datetime'2022-05-09T09%3A36%3A26.1385566Z'\"",
"location": "westeurope",
"properties": {
"version": 2,
"tenantId": "xxxx",
"domainName": "contoso.com",
"deploymentId": "015ad100-5930-47ca-aeec-3648ec70e68f",
"syncOwner": "015ad100-5930-47ca-aeec-3648ec70e68f",
"replicaSets": [
{
"replicaSetId": "015ad100-5930-47ca-aeec-3648ec70e68f",
"location": "West Europe",
"subnetId": "/subscriptions/xxxx/resourceGroups/xxxx/providers/Microsoft.Network/virtualNetworks/xxxx/subnets/xxxx",
"domainControllerIpAddress": [
"x.x.x.x"
],
"externalAccessIpAddress": "",
"serviceStatus": "Running"
}
],
"ldapsSettings": {
"ldaps": "Enabled",
"publicCertificate": "xxxx",
"certificateThumbprint": "xxxx",
"certificateNotAfter": "2022-07-30T11:12:48Z",
"externalAccess": "Disabled"
},
"domainSecuritySettings": {
"ntlmV1": "Enabled",
"tlsV1": "Enabled",
"syncNtlmPasswords": "Enabled",
"syncKerberosPasswords": "Enabled",
"syncOnPremPasswords": "Enabled"
},
"filteredSync": "Enabled",
"resourceForestSettings": {
"settings": []
},
"notificationSettings": {
"notifyGlobalAdmins": "Disabled",
"notifyDcAdmins": "Disabled",
"additionalRecipients": [
"xxxx@contoso.com"
]
},
"sku": "Standard",
"provisioningState": "Accepted"
}
}
DEBUG: ResponseCreated:
DEBUG: DelayBeforePolling: Delaying 30 seconds before polling.
DEBUG: ============================ HTTP REQUEST ============================
HTTP Method:
GET
Absolute Uri:
https://management.azure.com/subscriptions/xxxx/providers/Microsoft.AAD/locations/westeurope/operationResults/xxxx?api-version=2020-01-01
Headers:
x-ms-unique-id : 15
x-ms-client-request-id : 3713ef7f-e743-49ec-9713-9cd67e295827
CommandName : Az.ADDomainServices.internal\Update-AzADDomainService
FullCommandName : Update-AzADDomainService_UpdateViaIdentityExpanded
ParameterSetName : __AllParameterSets
User-Agent : AzurePowershell/v7.1.0,PSVersion/v7.2.3,Az.ADDomainServices/0.1.0
Body:
DEBUG: ============================ HTTP RESPONSE ============================
Status Code:
OK
Headers:
Cache-Control : no-cache
Pragma : no-cache
x-ms-request-id : 2a4fd315-f986-4f22-8d07-3954451cd4eb
x-ms-ratelimit-remaining-subscription-reads: 11999
x-ms-correlation-request-id : e1b62e5a-91f1-489a-af21-1a92a78e71e8
x-ms-routing-request-id : FRANCECENTRAL:20220509T093704Z:e1b62e5a-91f1-489a-af21-1a92a78e71e8
Strict-Transport-Security : max-age=31536000; includeSubDomains
X-Content-Type-Options : nosniff
Date : Mon, 09 May 2022 09:37:03 GMT
Body:
{
"id": "/subscriptions/xxxx/providers/Microsoft.AAD/locations/westeurope/operationResults/xxxx",
"name": "xxxx",
"status": "Failed",
"startTime": "0001-01-01T08:00:00Z",
"endTime": "0001-01-01T08:00:00Z",
"percentComplete": 0.0,
"error": {
"code": "BadRequest",
"message": "The certificate or password provided for LDAPs is invalid."
}
}
DEBUG: Polling:
DEBUG: Finally:
Update-AzADDomainService_UpdateViaIdentityExpanded: C:\Users\xxxx\Documents\PowerShell\Modules\Az.ADDomainServices\0.1.0\custom\Update-AzADDomainService.ps1:283:13
Line |
283 | Az.ADDomainServices.internal\Update-AzADDomainService @PS …
| ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
| The certificate or password provided for LDAPs is invalid.
DEBUG: [CmdletProcessRecordAsyncEnd]: Finish HTTP process
DEBUG: CmdletProcessRecordAsyncEnd:
DEBUG: CmdletProcessRecordEnd:
DEBUG: AzureQoSEvent: Module: Az.ADDomainServices:0.1.0; CommandName: Update-AzADDomainService_UpdateViaIdentityExpanded; PSVersion: 7.2.3; IsSuccess: True; Duration: 00:00:00
DEBUG: Finish sending metric.
DEBUG: CmdletEndProcessing:
```
### Environment data
```PowerShell
Name Value
---- -----
PSVersion 7.2.3
PSEdition Core
GitCommitId 7.2.3
OS Microsoft Windows 10.0.22000
Platform Win32NT
PSCompatibleVersions {1.0, 2.0, 3.0, 4.0…}
PSRemotingProtocolVersion 2.3
SerializationVersion 1.1.0.1
WSManStackVersion 3.0
```
### Module versions
```PowerShell
ModuleType Version PreRelease Name ExportedCommands
---------- ------- ---------- ---- ----------------
Script 2.7.5 Az.Accounts {Add-AzEnvironment, Clear-AzContext, Clear-AzDefault, Connect-AzAccount…}
Script 0.1.0 Az.ADDomainServices {Get-AzADDomainService, New-AzADDomainService, New-AzADDomainServiceForestTrust, New-AzADDomainServiceReplicaSet…}
Script 4.2.0 Az.KeyVault {Add-AzKeyVaultCertificate, Add-AzKeyVaultCertificateContact, Add-AzKeyVaultKey, Add-AzKeyVaultManagedStorageAccount…}
Script 5.2.0 Az.Resources {Export-AzResourceGroup, Export-AzTemplateSpec, Get-AzDenyAssignment, Get-AzDeployment…}
```
### Error output
```PowerShell
DEBUG: 11:41:46 - ResolveError begin processing with ParameterSet 'AnyErrorParameterSet'.
DEBUG: 11:41:46 - using account id 'xxxx@contoso.com'...
WARNING: Upcoming breaking changes in the cmdlet 'Resolve-AzError' :
The `Resolve-Error` alias will be removed in a future release. Please change any scripts that use this alias to use `Resolve-AzError` instead.
Note : Go to https://aka.ms/azps-changewarnings for steps to suppress this breaking change warning, and other information on breaking changes in Azure PowerShell.
HistoryId: 57
Message : The certificate or password provided for LDAPs is invalid.
StackTrace : at Microsoft.Azure.PowerShell.Cmdlets.ADDomainServices.AdDomainServices.DomainServicesUpdate_Call(HttpRequestMessage request, Func`3 onOk, Func`3 onDefault, IEventListener eventListener, ISendAsync sender)
at Microsoft.Azure.PowerShell.Cmdlets.ADDomainServices.AdDomainServices.DomainServicesUpdate_Call(HttpRequestMessage request, Func`3 onOk, Func`3 onDefault, IEventListener eventListener, ISendAsync sender)
at Microsoft.Azure.PowerShell.Cmdlets.ADDomainServices.AdDomainServices.DomainServicesUpdateViaIdentity(String viaIdentity, IDomainService body, Func`3 onOk, Func`3 onDefault, IEventListener eventListener, ISendAsync
sender)
at Microsoft.Azure.PowerShell.Cmdlets.ADDomainServices.Cmdlets.UpdateAzADDomainService_UpdateViaIdentityExpanded.ProcessRecordAsync()
Exception : Microsoft.Azure.PowerShell.Cmdlets.ADDomainServices.Runtime.UndeclaredResponseException
InvocationInfo : {Update-AzADDomainService_UpdateViaIdentityExpanded}
Line : Az.ADDomainServices.internal\Update-AzADDomainService @PSBoundParameters
Position : At C:\Users\xxxx\Documents\PowerShell\Modules\Az.ADDomainServices\0.1.0\custom\Update-AzADDomainService.ps1:283 char:13
+ Az.ADDomainServices.internal\Update-AzADDomainService @PS …
+ ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
HistoryId : 57
HistoryId: 54
Message : The certificate or password provided for LDAPs is invalid.
StackTrace : at Microsoft.Azure.PowerShell.Cmdlets.ADDomainServices.AdDomainServices.DomainServicesUpdate_Call(HttpRequestMessage request, Func`3 onOk, Func`3 onDefault, IEventListener eventListener, ISendAsync sender)
at Microsoft.Azure.PowerShell.Cmdlets.ADDomainServices.AdDomainServices.DomainServicesUpdate_Call(HttpRequestMessage request, Func`3 onOk, Func`3 onDefault, IEventListener eventListener, ISendAsync sender)
at Microsoft.Azure.PowerShell.Cmdlets.ADDomainServices.AdDomainServices.DomainServicesUpdateViaIdentity(String viaIdentity, IDomainService body, Func`3 onOk, Func`3 onDefault, IEventListener eventListener, ISendAsync
sender)
at Microsoft.Azure.PowerShell.Cmdlets.ADDomainServices.Cmdlets.UpdateAzADDomainService_UpdateViaIdentityExpanded.ProcessRecordAsync()
Exception : Microsoft.Azure.PowerShell.Cmdlets.ADDomainServices.Runtime.UndeclaredResponseException
InvocationInfo : {Update-AzADDomainService_UpdateViaIdentityExpanded}
Line : Az.ADDomainServices.internal\Update-AzADDomainService @PSBoundParameters
Position : At C:\Users\xxxx\Documents\PowerShell\Modules\Az.ADDomainServices\0.1.0\custom\Update-AzADDomainService.ps1:283 char:13
+ Az.ADDomainServices.internal\Update-AzADDomainService @PS …
+ ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
HistoryId : 54
The Azure PowerShell team is listening, please let us know how we are doing: https://aka.ms/azpssurvey?Q_CHL=ERROR.
DEBUG: AzureQoSEvent: Module: Az.Accounts:2.7.5; CommandName: Resolve-AzError; PSVersion: 7.2.3; IsSuccess: True; Duration: 00:00:00.2753625
DEBUG: Finish sending metric.
DEBUG: 11:41:47 - ResolveError end processing.
```
|
non_process
|
update azaddomainservice the certificate or password provided for ldaps is invalid description hi i m trying to use preview of module az addomainservices in actual version i just try to update ldaps certificate with a simple command like ps adds rg name rg test adds domain contoso com ldaps pfx path c temp certificate pfx ldaps pfx pass mystrongpassword convertto securestring force asplaintext adds rg get azresourcegroup name adds rg name adds get azaddomainservice name adds domain resourcegroupname adds rg resourcegroupname ldaps update update azaddomainservice inputobject adds ldapsettingldaps true ldapsettingpfxcertificate ldaps pfx path ldapsettingpfxcertificatepassword ldaps pfx pass but i ve this error update azaddomainservice updateviaidentityexpanded script line az addomainservices internal update azaddomainservice ps … the certificate or password provided for ldaps is invalid if i use the same file and password directly from gui all is working as expected thanks regards alexandre issue script debug output powershell http method patch absolute uri headers x ms unique id x ms client request id commandname az addomainservices internal update azaddomainservice fullcommandname update azaddomainservice updateviaidentityexpanded parametersetname allparametersets user agent azurepowershell psversion az addomainservices body properties ldapssettings ldaps true pfxcertificate c temp certificate pfx pfxcertificatepassword mystrongpassword debug beforecall debug http response status code accepted headers cache control no cache pragma no cache etag w datetime location x ms request id azure asyncoperation x ms ratelimit remaining subscription writes x ms correlation request id x ms routing request id francecentral strict transport security max age includesubdomains x content type options nosniff date mon may gmt body id subscriptions xxxx resourcegroups xxxx providers microsoft aad domainservices contoso com name contoso com type microsoft aad domainservices etag w datetime location westeurope properties version tenantid xxxx domainname contoso com deploymentid aeec syncowner aeec replicasets replicasetid aeec location west europe subnetid subscriptions xxxx resourcegroups xxxx providers microsoft network virtualnetworks xxxx subnets xxxx domaincontrolleripaddress x x x x externalaccessipaddress servicestatus running ldapssettings ldaps enabled publiccertificate xxxx certificatethumbprint xxxx certificatenotafter externalaccess disabled domainsecuritysettings enabled enabled syncntlmpasswords enabled synckerberospasswords enabled synconprempasswords enabled filteredsync enabled resourceforestsettings settings notificationsettings notifyglobaladmins disabled notifydcadmins disabled additionalrecipients xxxx contoso com sku standard provisioningstate accepted debug responsecreated debug delaybeforepolling delaying seconds before polling debug http request http method get absolute uri headers x ms unique id x ms client request id commandname az addomainservices internal update azaddomainservice fullcommandname update azaddomainservice updateviaidentityexpanded parametersetname allparametersets user agent azurepowershell psversion az addomainservices body debug http response status code ok headers cache control no cache pragma no cache x ms request id x ms ratelimit remaining subscription reads x ms correlation request id x ms routing request id francecentral strict transport security max age includesubdomains x content type options nosniff date mon may gmt body id subscriptions xxxx providers microsoft aad locations westeurope operationresults xxxx name xxxx status failed starttime endtime percentcomplete error code badrequest message the certificate or password provided for ldaps is invalid debug polling debug finally update azaddomainservice updateviaidentityexpanded c users xxxx documents powershell modules az addomainservices custom update azaddomainservice line az addomainservices internal update azaddomainservice ps … the certificate or password provided for ldaps is invalid debug finish http process debug cmdletprocessrecordasyncend debug cmdletprocessrecordend debug azureqosevent module az addomainservices commandname update azaddomainservice updateviaidentityexpanded psversion issuccess true duration debug finish sending metric debug cmdletendprocessing environment data powershell name value psversion psedition core gitcommitid os microsoft windows platform pscompatibleversions … psremotingprotocolversion serializationversion wsmanstackversion module versions powershell moduletype version prerelease name exportedcommands script az accounts add azenvironment clear azcontext clear azdefault connect azaccount… script az addomainservices get azaddomainservice new azaddomainservice new azaddomainserviceforesttrust new azaddomainservicereplicaset… script az keyvault add azkeyvaultcertificate add azkeyvaultcertificatecontact add azkeyvaultkey add azkeyvaultmanagedstorageaccount… script az resources export azresourcegroup export aztemplatespec get azdenyassignment get azdeployment… error output powershell debug resolveerror begin processing with parameterset anyerrorparameterset debug using account id xxxx contoso com warning upcoming breaking changes in the cmdlet resolve azerror the resolve error alias will be removed in a future release please change any scripts that use this alias to use resolve azerror instead note go to for steps to suppress this breaking change warning and other information on breaking changes in azure powershell historyid message the certificate or password provided for ldaps is invalid stacktrace at microsoft azure powershell cmdlets addomainservices addomainservices domainservicesupdate call httprequestmessage request func onok func ondefault ieventlistener eventlistener isendasync sender at microsoft azure powershell cmdlets addomainservices addomainservices domainservicesupdate call httprequestmessage request func onok func ondefault ieventlistener eventlistener isendasync sender at microsoft azure powershell cmdlets addomainservices addomainservices domainservicesupdateviaidentity string viaidentity idomainservice body func onok func ondefault ieventlistener eventlistener isendasync sender at microsoft azure powershell cmdlets addomainservices cmdlets updateazaddomainservice updateviaidentityexpanded processrecordasync exception microsoft azure powershell cmdlets addomainservices runtime undeclaredresponseexception invocationinfo update azaddomainservice updateviaidentityexpanded line az addomainservices internal update azaddomainservice psboundparameters position at c users xxxx documents powershell modules az addomainservices custom update azaddomainservice char az addomainservices internal update azaddomainservice ps … historyid historyid message the certificate or password provided for ldaps is invalid stacktrace at microsoft azure powershell cmdlets addomainservices addomainservices domainservicesupdate call httprequestmessage request func onok func ondefault ieventlistener eventlistener isendasync sender at microsoft azure powershell cmdlets addomainservices addomainservices domainservicesupdate call httprequestmessage request func onok func ondefault ieventlistener eventlistener isendasync sender at microsoft azure powershell cmdlets addomainservices addomainservices domainservicesupdateviaidentity string viaidentity idomainservice body func onok func ondefault ieventlistener eventlistener isendasync sender at microsoft azure powershell cmdlets addomainservices cmdlets updateazaddomainservice updateviaidentityexpanded processrecordasync exception microsoft azure powershell cmdlets addomainservices runtime undeclaredresponseexception invocationinfo update azaddomainservice updateviaidentityexpanded line az addomainservices internal update azaddomainservice psboundparameters position at c users xxxx documents powershell modules az addomainservices custom update azaddomainservice char az addomainservices internal update azaddomainservice ps … historyid the azure powershell team is listening please let us know how we are doing debug azureqosevent module az accounts commandname resolve azerror psversion issuccess true duration debug finish sending metric debug resolveerror end processing
| 0
|
48,140
| 2,993,747,561
|
IssuesEvent
|
2015-07-22 07:07:03
|
fpco/ide-backend
|
https://api.github.com/repos/fpco/ide-backend
|
closed
|
Infer whether TH is used
|
enhancement Low Priority
|
At the moment, we always conversatively specify `other-extensions: TemplateHaskell` in `.cabal` files; this is necessary so that `Cabal` knows to build dynlibs first. We could instead look at the source files and detect whether TH is used, and only specify the flag if needed. Not sure this is hugely important though.
|
1.0
|
Infer whether TH is used - At the moment, we always conversatively specify `other-extensions: TemplateHaskell` in `.cabal` files; this is necessary so that `Cabal` knows to build dynlibs first. We could instead look at the source files and detect whether TH is used, and only specify the flag if needed. Not sure this is hugely important though.
|
non_process
|
infer whether th is used at the moment we always conversatively specify other extensions templatehaskell in cabal files this is necessary so that cabal knows to build dynlibs first we could instead look at the source files and detect whether th is used and only specify the flag if needed not sure this is hugely important though
| 0
|
140,760
| 12,947,176,709
|
IssuesEvent
|
2020-07-18 22:08:32
|
hlfsousa/ncml-binding
|
https://api.github.com/repos/hlfsousa/ncml-binding
|
closed
|
Site documentation
|
documentation enhancement
|
Public GitHub projects are provided with a Wiki page. This is where documentation will be started until this project is big enough to be moved to its own website. The (new) goal here is to get the documentation going in the Wiki.
Describe how code is generated, and change the generation itself (if needed) to match.
Pass criteria:
1. Documentation can be accessed from GitHub or another site.
2. Documentation includes basic information on how to generate the model.
3. Documentation contains guide for writing issues (must contain pass criteria from which tests can be derived).
|
1.0
|
Site documentation - Public GitHub projects are provided with a Wiki page. This is where documentation will be started until this project is big enough to be moved to its own website. The (new) goal here is to get the documentation going in the Wiki.
Describe how code is generated, and change the generation itself (if needed) to match.
Pass criteria:
1. Documentation can be accessed from GitHub or another site.
2. Documentation includes basic information on how to generate the model.
3. Documentation contains guide for writing issues (must contain pass criteria from which tests can be derived).
|
non_process
|
site documentation public github projects are provided with a wiki page this is where documentation will be started until this project is big enough to be moved to its own website the new goal here is to get the documentation going in the wiki describe how code is generated and change the generation itself if needed to match pass criteria documentation can be accessed from github or another site documentation includes basic information on how to generate the model documentation contains guide for writing issues must contain pass criteria from which tests can be derived
| 0
|
159,801
| 20,085,912,949
|
IssuesEvent
|
2022-02-05 01:11:28
|
DavidSpek/kale
|
https://api.github.com/repos/DavidSpek/kale
|
opened
|
CVE-2022-21731 (Medium) detected in tensorflow-1.0.0-cp27-cp27mu-manylinux1_x86_64.whl, tensorflow-2.1.0-cp27-cp27mu-manylinux2010_x86_64.whl
|
security vulnerability
|
## CVE-2022-21731 - Medium Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Libraries - <b>tensorflow-1.0.0-cp27-cp27mu-manylinux1_x86_64.whl</b>, <b>tensorflow-2.1.0-cp27-cp27mu-manylinux2010_x86_64.whl</b></p></summary>
<p>
<details><summary><b>tensorflow-1.0.0-cp27-cp27mu-manylinux1_x86_64.whl</b></p></summary>
<p>TensorFlow is an open source machine learning framework for everyone.</p>
<p>Library home page: <a href="https://files.pythonhosted.org/packages/7b/c5/a97ed48fcc878e36bb05a3ea700c077360853c0994473a8f6b0ab4c2ddd2/tensorflow-1.0.0-cp27-cp27mu-manylinux1_x86_64.whl">https://files.pythonhosted.org/packages/7b/c5/a97ed48fcc878e36bb05a3ea700c077360853c0994473a8f6b0ab4c2ddd2/tensorflow-1.0.0-cp27-cp27mu-manylinux1_x86_64.whl</a></p>
<p>Path to dependency file: /examples/dog-breed-classification/requirements/requirements.txt</p>
<p>Path to vulnerable library: /kale/examples/dog-breed-classification/requirements/requirements.txt</p>
<p>
Dependency Hierarchy:
- :x: **tensorflow-1.0.0-cp27-cp27mu-manylinux1_x86_64.whl** (Vulnerable Library)
</details>
<details><summary><b>tensorflow-2.1.0-cp27-cp27mu-manylinux2010_x86_64.whl</b></p></summary>
<p>TensorFlow is an open source machine learning framework for everyone.</p>
<p>Library home page: <a href="https://files.pythonhosted.org/packages/ef/73/205b5e7f8fe086ffe4165d984acb2c49fa3086f330f03099378753982d2e/tensorflow-2.1.0-cp27-cp27mu-manylinux2010_x86_64.whl">https://files.pythonhosted.org/packages/ef/73/205b5e7f8fe086ffe4165d984acb2c49fa3086f330f03099378753982d2e/tensorflow-2.1.0-cp27-cp27mu-manylinux2010_x86_64.whl</a></p>
<p>Path to dependency file: /examples/taxi-cab-classification/requirements.txt</p>
<p>Path to vulnerable library: /examples/taxi-cab-classification/requirements.txt</p>
<p>
Dependency Hierarchy:
- tfx_bsl-0.21.4-cp27-cp27mu-manylinux2010_x86_64.whl (Root Library)
- :x: **tensorflow-2.1.0-cp27-cp27mu-manylinux2010_x86_64.whl** (Vulnerable Library)
</details>
<p>Found in base branch: <b>master</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
Tensorflow is an Open Source Machine Learning Framework. The implementation of shape inference for `ConcatV2` can be used to trigger a denial of service attack via a segfault caused by a type confusion. The `axis` argument is translated into `concat_dim` in the `ConcatShapeHelper` helper function. Then, a value for `min_rank` is computed based on `concat_dim`. This is then used to validate that the `values` tensor has at least the required rank. However, `WithRankAtLeast` receives the lower bound as a 64-bits value and then compares it against the maximum 32-bits integer value that could be represented. Due to the fact that `min_rank` is a 32-bits value and the value of `axis`, the `rank` argument is a negative value, so the error check is bypassed. The fix will be included in TensorFlow 2.8.0. We will also cherrypick this commit on TensorFlow 2.7.1, TensorFlow 2.6.3, and TensorFlow 2.5.3, as these are also affected and still in supported range.
<p>Publish Date: 2022-02-03
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2022-21731>CVE-2022-21731</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>6.5</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: Low
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: None
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://github.com/tensorflow/tensorflow/security/advisories/GHSA-m4hf-j54p-p353">https://github.com/tensorflow/tensorflow/security/advisories/GHSA-m4hf-j54p-p353</a></p>
<p>Release Date: 2022-02-03</p>
<p>Fix Resolution: tensorflow - 2.5.3,2.6.3,2.7.1;tensorflow-cpu - 2.5.3,2.6.3,2.7.1;tensorflow-gpu - 2.5.3,2.6.3,2.7.1</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
|
True
|
CVE-2022-21731 (Medium) detected in tensorflow-1.0.0-cp27-cp27mu-manylinux1_x86_64.whl, tensorflow-2.1.0-cp27-cp27mu-manylinux2010_x86_64.whl - ## CVE-2022-21731 - Medium Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Libraries - <b>tensorflow-1.0.0-cp27-cp27mu-manylinux1_x86_64.whl</b>, <b>tensorflow-2.1.0-cp27-cp27mu-manylinux2010_x86_64.whl</b></p></summary>
<p>
<details><summary><b>tensorflow-1.0.0-cp27-cp27mu-manylinux1_x86_64.whl</b></p></summary>
<p>TensorFlow is an open source machine learning framework for everyone.</p>
<p>Library home page: <a href="https://files.pythonhosted.org/packages/7b/c5/a97ed48fcc878e36bb05a3ea700c077360853c0994473a8f6b0ab4c2ddd2/tensorflow-1.0.0-cp27-cp27mu-manylinux1_x86_64.whl">https://files.pythonhosted.org/packages/7b/c5/a97ed48fcc878e36bb05a3ea700c077360853c0994473a8f6b0ab4c2ddd2/tensorflow-1.0.0-cp27-cp27mu-manylinux1_x86_64.whl</a></p>
<p>Path to dependency file: /examples/dog-breed-classification/requirements/requirements.txt</p>
<p>Path to vulnerable library: /kale/examples/dog-breed-classification/requirements/requirements.txt</p>
<p>
Dependency Hierarchy:
- :x: **tensorflow-1.0.0-cp27-cp27mu-manylinux1_x86_64.whl** (Vulnerable Library)
</details>
<details><summary><b>tensorflow-2.1.0-cp27-cp27mu-manylinux2010_x86_64.whl</b></p></summary>
<p>TensorFlow is an open source machine learning framework for everyone.</p>
<p>Library home page: <a href="https://files.pythonhosted.org/packages/ef/73/205b5e7f8fe086ffe4165d984acb2c49fa3086f330f03099378753982d2e/tensorflow-2.1.0-cp27-cp27mu-manylinux2010_x86_64.whl">https://files.pythonhosted.org/packages/ef/73/205b5e7f8fe086ffe4165d984acb2c49fa3086f330f03099378753982d2e/tensorflow-2.1.0-cp27-cp27mu-manylinux2010_x86_64.whl</a></p>
<p>Path to dependency file: /examples/taxi-cab-classification/requirements.txt</p>
<p>Path to vulnerable library: /examples/taxi-cab-classification/requirements.txt</p>
<p>
Dependency Hierarchy:
- tfx_bsl-0.21.4-cp27-cp27mu-manylinux2010_x86_64.whl (Root Library)
- :x: **tensorflow-2.1.0-cp27-cp27mu-manylinux2010_x86_64.whl** (Vulnerable Library)
</details>
<p>Found in base branch: <b>master</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
Tensorflow is an Open Source Machine Learning Framework. The implementation of shape inference for `ConcatV2` can be used to trigger a denial of service attack via a segfault caused by a type confusion. The `axis` argument is translated into `concat_dim` in the `ConcatShapeHelper` helper function. Then, a value for `min_rank` is computed based on `concat_dim`. This is then used to validate that the `values` tensor has at least the required rank. However, `WithRankAtLeast` receives the lower bound as a 64-bits value and then compares it against the maximum 32-bits integer value that could be represented. Due to the fact that `min_rank` is a 32-bits value and the value of `axis`, the `rank` argument is a negative value, so the error check is bypassed. The fix will be included in TensorFlow 2.8.0. We will also cherrypick this commit on TensorFlow 2.7.1, TensorFlow 2.6.3, and TensorFlow 2.5.3, as these are also affected and still in supported range.
<p>Publish Date: 2022-02-03
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2022-21731>CVE-2022-21731</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>6.5</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: Low
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: None
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://github.com/tensorflow/tensorflow/security/advisories/GHSA-m4hf-j54p-p353">https://github.com/tensorflow/tensorflow/security/advisories/GHSA-m4hf-j54p-p353</a></p>
<p>Release Date: 2022-02-03</p>
<p>Fix Resolution: tensorflow - 2.5.3,2.6.3,2.7.1;tensorflow-cpu - 2.5.3,2.6.3,2.7.1;tensorflow-gpu - 2.5.3,2.6.3,2.7.1</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
|
non_process
|
cve medium detected in tensorflow whl tensorflow whl cve medium severity vulnerability vulnerable libraries tensorflow whl tensorflow whl tensorflow whl tensorflow is an open source machine learning framework for everyone library home page a href path to dependency file examples dog breed classification requirements requirements txt path to vulnerable library kale examples dog breed classification requirements requirements txt dependency hierarchy x tensorflow whl vulnerable library tensorflow whl tensorflow is an open source machine learning framework for everyone library home page a href path to dependency file examples taxi cab classification requirements txt path to vulnerable library examples taxi cab classification requirements txt dependency hierarchy tfx bsl whl root library x tensorflow whl vulnerable library found in base branch master vulnerability details tensorflow is an open source machine learning framework the implementation of shape inference for can be used to trigger a denial of service attack via a segfault caused by a type confusion the axis argument is translated into concat dim in the concatshapehelper helper function then a value for min rank is computed based on concat dim this is then used to validate that the values tensor has at least the required rank however withrankatleast receives the lower bound as a bits value and then compares it against the maximum bits integer value that could be represented due to the fact that min rank is a bits value and the value of axis the rank argument is a negative value so the error check is bypassed the fix will be included in tensorflow we will also cherrypick this commit on tensorflow tensorflow and tensorflow as these are also affected and still in supported range publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required low user interaction none scope unchanged impact metrics confidentiality impact none integrity impact none availability impact high for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution tensorflow tensorflow cpu tensorflow gpu step up your open source security game with whitesource
| 0
|
350,208
| 31,862,555,994
|
IssuesEvent
|
2023-09-15 12:02:49
|
recommenders-team/recommenders
|
https://api.github.com/repos/recommenders-team/recommenders
|
closed
|
[FEATURE] better name for test_dataset.py
|
enhancement test low priority
|
### Description
test_dataset.py name is confusing. It actually tests `download_utils`.
We may consider renaming it to `test_download_utils.py`
|
1.0
|
[FEATURE] better name for test_dataset.py - ### Description
test_dataset.py name is confusing. It actually tests `download_utils`.
We may consider renaming it to `test_download_utils.py`
|
non_process
|
better name for test dataset py description test dataset py name is confusing it actually tests download utils we may consider renaming it to test download utils py
| 0
|
447,911
| 12,907,112,333
|
IssuesEvent
|
2020-07-15 03:51:29
|
EngineerSuite2/FantasyModuleParser
|
https://api.github.com/repos/EngineerSuite2/FantasyModuleParser
|
opened
|
Resistances Tab not loading correctly
|
Medium Priority
|
Resistances ComboBoxes don't load correctly until you hit 'New NPC'
|
1.0
|
Resistances Tab not loading correctly - Resistances ComboBoxes don't load correctly until you hit 'New NPC'
|
non_process
|
resistances tab not loading correctly resistances comboboxes don t load correctly until you hit new npc
| 0
|
3,037
| 6,039,625,091
|
IssuesEvent
|
2017-06-10 05:24:30
|
HouraiTeahouse/FantasyCrescendo
|
https://api.github.com/repos/HouraiTeahouse/FantasyCrescendo
|
closed
|
Update Forest of Magic lighting data
|
Category:Stages Type:Process
|
The Unity 5.6 Editor is currently issuing a warning saying that the lighting data is not valid for the current Unity version, and needs to be updated.
|
1.0
|
Update Forest of Magic lighting data - The Unity 5.6 Editor is currently issuing a warning saying that the lighting data is not valid for the current Unity version, and needs to be updated.
|
process
|
update forest of magic lighting data the unity editor is currently issuing a warning saying that the lighting data is not valid for the current unity version and needs to be updated
| 1
|
276,457
| 20,984,502,720
|
IssuesEvent
|
2022-03-29 00:36:05
|
fga-eps-mds/2021.2-INDICAA
|
https://api.github.com/repos/fga-eps-mds/2021.2-INDICAA
|
closed
|
Mapear dados de quantidade de alunos matriculados
|
documentation back-end
|
# Descrição
Nesta issue serão mapeados e documentados os dados a serem lidos, relacionados à quantidade de alunos matriculados, para facilitar o tratamento de dados e o web scraping.
# Tarefas
- [x] Criar uma documentação para receber os dados analisados (Em markdown ou em forma de tabela).
- [x] Analisar os dados de quantidade de alunos matriculados na lista de oferta do SIGAA.
- [x] Estabelecer padrões observados.
- [x] Preencher a documentação criada com os padrões observados.
# Critérios de aceitação
- [x] Tabela criada e acessível à todos (em forma de link).
- [x] Revisão dos padrões observados.
|
1.0
|
Mapear dados de quantidade de alunos matriculados - # Descrição
Nesta issue serão mapeados e documentados os dados a serem lidos, relacionados à quantidade de alunos matriculados, para facilitar o tratamento de dados e o web scraping.
# Tarefas
- [x] Criar uma documentação para receber os dados analisados (Em markdown ou em forma de tabela).
- [x] Analisar os dados de quantidade de alunos matriculados na lista de oferta do SIGAA.
- [x] Estabelecer padrões observados.
- [x] Preencher a documentação criada com os padrões observados.
# Critérios de aceitação
- [x] Tabela criada e acessível à todos (em forma de link).
- [x] Revisão dos padrões observados.
|
non_process
|
mapear dados de quantidade de alunos matriculados descrição nesta issue serão mapeados e documentados os dados a serem lidos relacionados à quantidade de alunos matriculados para facilitar o tratamento de dados e o web scraping tarefas criar uma documentação para receber os dados analisados em markdown ou em forma de tabela analisar os dados de quantidade de alunos matriculados na lista de oferta do sigaa estabelecer padrões observados preencher a documentação criada com os padrões observados critérios de aceitação tabela criada e acessível à todos em forma de link revisão dos padrões observados
| 0
|
691,271
| 23,690,768,043
|
IssuesEvent
|
2022-08-29 10:34:06
|
bryntum/support
|
https://api.github.com/repos/bryntum/support
|
closed
|
`AjaxStore` `beforeRequest` doesn't allow to make changes in request body
|
bug resolved high-priority forum
|
[Forum post](https://www.bryntum.com/forum/viewtopic.php?f=44&t=21918&p=108362#p108362)
https://bryntum.com/docs/scheduler/api/Core/data/AjaxStore#event-beforeRequest has `body` param that contains request body. But changes applied on that won't be catch.
More than that if you change `read` http method to POST using our public api, body won't be send to `beforeRequest` as well as somehow checked/generated in the code.
It should be working.
Code for test
```
eventStore : {
readUrl: '/url',
updateUrl: '/url',
autoLoad : true,
useRestfulMethods : true,
httpMethods : {
create : 'POST',
read : 'POST',
update : 'PATCH',
delete : 'DELETE'
},
headers : {
'Content-Type': 'application/json'
},
listeners : {
beforeRequest : (event) => {
debugger
event.body = {hello: 1};
}
}
},
```
|
1.0
|
`AjaxStore` `beforeRequest` doesn't allow to make changes in request body - [Forum post](https://www.bryntum.com/forum/viewtopic.php?f=44&t=21918&p=108362#p108362)
https://bryntum.com/docs/scheduler/api/Core/data/AjaxStore#event-beforeRequest has `body` param that contains request body. But changes applied on that won't be catch.
More than that if you change `read` http method to POST using our public api, body won't be send to `beforeRequest` as well as somehow checked/generated in the code.
It should be working.
Code for test
```
eventStore : {
readUrl: '/url',
updateUrl: '/url',
autoLoad : true,
useRestfulMethods : true,
httpMethods : {
create : 'POST',
read : 'POST',
update : 'PATCH',
delete : 'DELETE'
},
headers : {
'Content-Type': 'application/json'
},
listeners : {
beforeRequest : (event) => {
debugger
event.body = {hello: 1};
}
}
},
```
|
non_process
|
ajaxstore beforerequest doesn t allow to make changes in request body has body param that contains request body but changes applied on that won t be catch more than that if you change read http method to post using our public api body won t be send to beforerequest as well as somehow checked generated in the code it should be working code for test eventstore readurl url updateurl url autoload true userestfulmethods true httpmethods create post read post update patch delete delete headers content type application json listeners beforerequest event debugger event body hello
| 0
|
84,603
| 24,360,173,680
|
IssuesEvent
|
2022-10-03 10:59:23
|
speedb-io/speedb
|
https://api.github.com/repos/speedb-io/speedb
|
closed
|
makefile: speed up test runs startup time
|
enhancement Upstreamable build
|
The Makefile has many places where it needlessly invokes a sub-make instead of declaring the dependencies correctly. This is causing slowdowns on startup, and especially noticeable during test runs (`make check`) where each time a sub-make is invoked it regenerates `make_config.mk` and that takes a long time. Rework the dependency graph so that at least running tests doesn't involve invoking make three times.
|
1.0
|
makefile: speed up test runs startup time - The Makefile has many places where it needlessly invokes a sub-make instead of declaring the dependencies correctly. This is causing slowdowns on startup, and especially noticeable during test runs (`make check`) where each time a sub-make is invoked it regenerates `make_config.mk` and that takes a long time. Rework the dependency graph so that at least running tests doesn't involve invoking make three times.
|
non_process
|
makefile speed up test runs startup time the makefile has many places where it needlessly invokes a sub make instead of declaring the dependencies correctly this is causing slowdowns on startup and especially noticeable during test runs make check where each time a sub make is invoked it regenerates make config mk and that takes a long time rework the dependency graph so that at least running tests doesn t involve invoking make three times
| 0
|
6,194
| 9,104,703,575
|
IssuesEvent
|
2019-02-20 18:52:14
|
PennyDreadfulMTG/perf-reports
|
https://api.github.com/repos/PennyDreadfulMTG/perf-reports
|
closed
|
500 error at /api/gitpull
|
CalledProcessError decksite wontfix
|
Command '['git', 'fetch']' returned non-zero exit status 1.
Reported on decksite by logged_out
```
--------------------------------------------------------------------------------
Request Method: POST
Path: /api/gitpull?
Cookies: {}
Endpoint: process_github_webhook
View Args: {}
Person: logged_out
Referrer: None
Request Data: {}
Host: pennydreadfulmagic.com
Accept-Encoding: gzip
Cf-Ipcountry: US
X-Forwarded-For: 192.30.252.44, 162.158.79.247
Cf-Ray: 489af7c70d449580-IAD
X-Forwarded-Proto: https
Cf-Visitor: {"scheme":"https"}
Accept: */*
User-Agent: GitHub-Hookshot/233462e
X-Github-Event: push
X-Github-Delivery: 29097184-0097-11e9-959f-53f3397fffd9
Content-Type: application/json
Cf-Connecting-Ip: 192.30.252.44
X-Forwarded-Host: pennydreadfulmagic.com
X-Forwarded-Server: pennydreadfulmagic.com
Connection: Keep-Alive
Content-Length: 11139
```
--------------------------------------------------------------------------------
CalledProcessError
Command '['git', 'fetch']' returned non-zero exit status 1.
Stack Trace:
```
File "/home/discord/.local/lib/python3.6/site-packages/flask/app.py", line 2309, in __call__
return self.wsgi_app(environ, start_response)
File "/home/discord/.local/lib/python3.6/site-packages/flask/app.py", line 2295, in wsgi_app
response = self.handle_exception(e)
File "/home/discord/.local/lib/python3.6/site-packages/flask/app.py", line 2292, in wsgi_app
response = self.full_dispatch_request()
File "/home/discord/.local/lib/python3.6/site-packages/flask/app.py", line 1815, in full_dispatch_request
rv = self.handle_user_exception(e)
File "/home/discord/.local/lib/python3.6/site-packages/flask/app.py", line 1718, in handle_user_exception
reraise(exc_type, exc_value, tb)
File "/home/discord/.local/lib/python3.6/site-packages/flask/_compat.py", line 35, in reraise
raise value
File "/home/discord/.local/lib/python3.6/site-packages/flask/app.py", line 1813, in full_dispatch_request
rv = self.dispatch_request()
File "/home/discord/.local/lib/python3.6/site-packages/flask/app.py", line 1799, in dispatch_request
return self.view_functions[rule.endpoint](**req.view_args)
File "./shared_web/api.py", line 17, in process_github_webhook
subprocess.check_output(['git', 'fetch'])
File "/usr/lib64/python3.6/subprocess.py", line 336, in check_output
**kwargs).stdout
File "/usr/lib64/python3.6/subprocess.py", line 418, in run
output=stdout, stderr=stderr)
```
Exception_hash: 70967dd4e1f4507c4f9cf6bc2dd34d3088f174f1
|
1.0
|
500 error at /api/gitpull - Command '['git', 'fetch']' returned non-zero exit status 1.
Reported on decksite by logged_out
```
--------------------------------------------------------------------------------
Request Method: POST
Path: /api/gitpull?
Cookies: {}
Endpoint: process_github_webhook
View Args: {}
Person: logged_out
Referrer: None
Request Data: {}
Host: pennydreadfulmagic.com
Accept-Encoding: gzip
Cf-Ipcountry: US
X-Forwarded-For: 192.30.252.44, 162.158.79.247
Cf-Ray: 489af7c70d449580-IAD
X-Forwarded-Proto: https
Cf-Visitor: {"scheme":"https"}
Accept: */*
User-Agent: GitHub-Hookshot/233462e
X-Github-Event: push
X-Github-Delivery: 29097184-0097-11e9-959f-53f3397fffd9
Content-Type: application/json
Cf-Connecting-Ip: 192.30.252.44
X-Forwarded-Host: pennydreadfulmagic.com
X-Forwarded-Server: pennydreadfulmagic.com
Connection: Keep-Alive
Content-Length: 11139
```
--------------------------------------------------------------------------------
CalledProcessError
Command '['git', 'fetch']' returned non-zero exit status 1.
Stack Trace:
```
File "/home/discord/.local/lib/python3.6/site-packages/flask/app.py", line 2309, in __call__
return self.wsgi_app(environ, start_response)
File "/home/discord/.local/lib/python3.6/site-packages/flask/app.py", line 2295, in wsgi_app
response = self.handle_exception(e)
File "/home/discord/.local/lib/python3.6/site-packages/flask/app.py", line 2292, in wsgi_app
response = self.full_dispatch_request()
File "/home/discord/.local/lib/python3.6/site-packages/flask/app.py", line 1815, in full_dispatch_request
rv = self.handle_user_exception(e)
File "/home/discord/.local/lib/python3.6/site-packages/flask/app.py", line 1718, in handle_user_exception
reraise(exc_type, exc_value, tb)
File "/home/discord/.local/lib/python3.6/site-packages/flask/_compat.py", line 35, in reraise
raise value
File "/home/discord/.local/lib/python3.6/site-packages/flask/app.py", line 1813, in full_dispatch_request
rv = self.dispatch_request()
File "/home/discord/.local/lib/python3.6/site-packages/flask/app.py", line 1799, in dispatch_request
return self.view_functions[rule.endpoint](**req.view_args)
File "./shared_web/api.py", line 17, in process_github_webhook
subprocess.check_output(['git', 'fetch'])
File "/usr/lib64/python3.6/subprocess.py", line 336, in check_output
**kwargs).stdout
File "/usr/lib64/python3.6/subprocess.py", line 418, in run
output=stdout, stderr=stderr)
```
Exception_hash: 70967dd4e1f4507c4f9cf6bc2dd34d3088f174f1
|
process
|
error at api gitpull command returned non zero exit status reported on decksite by logged out request method post path api gitpull cookies endpoint process github webhook view args person logged out referrer none request data host pennydreadfulmagic com accept encoding gzip cf ipcountry us x forwarded for cf ray iad x forwarded proto https cf visitor scheme https accept user agent github hookshot x github event push x github delivery content type application json cf connecting ip x forwarded host pennydreadfulmagic com x forwarded server pennydreadfulmagic com connection keep alive content length calledprocesserror command returned non zero exit status stack trace file home discord local lib site packages flask app py line in call return self wsgi app environ start response file home discord local lib site packages flask app py line in wsgi app response self handle exception e file home discord local lib site packages flask app py line in wsgi app response self full dispatch request file home discord local lib site packages flask app py line in full dispatch request rv self handle user exception e file home discord local lib site packages flask app py line in handle user exception reraise exc type exc value tb file home discord local lib site packages flask compat py line in reraise raise value file home discord local lib site packages flask app py line in full dispatch request rv self dispatch request file home discord local lib site packages flask app py line in dispatch request return self view functions req view args file shared web api py line in process github webhook subprocess check output file usr subprocess py line in check output kwargs stdout file usr subprocess py line in run output stdout stderr stderr exception hash
| 1
|
18,402
| 24,540,778,155
|
IssuesEvent
|
2022-10-12 03:24:05
|
qgis/QGIS
|
https://api.github.com/repos/qgis/QGIS
|
closed
|
TIN mesh creation causes crash in Model Designer
|
Feedback stale Processing Bug
|
### What is the bug or the crash?
Using the TIN mesh creation algorithm outside of a model works. First run of the model works. 2nd and subsequent attempts give errors in this tool. Any attempt to edit this tool in the model causes QGIS to crash.
## Report Details
**Python Stack Trace**
```
Windows fatal exception: access violation
Current thread 0x00001db4 (most recent call first):
File "C:\PROGRA~1/QGIS32~1.1/apps/qgis/./python/plugins\processing\modeler\ModelerGraphicItem.py", line 157 in edit
if dlg.exec_():
File "C:\PROGRA~1/QGIS32~1.1/apps/qgis/./python/plugins\processing\modeler\ModelerGraphicItem.py", line 178 in editComponent
self.edit()
```
**Stack Trace**
No stack trace is available.
**QGIS Info**
QGIS Version: 3.24.1-Tisler
QGIS code revision: 5709b824
Compiled against Qt: 5.15.2
Running against Qt: 5.15.2
Compiled against GDAL: 3.4.1
Running against GDAL: 3.4.1
**System Info**
CPU Type: x86_64
Kernel Type: winnt
Kernel Version: 10.0.19044
### Steps to reproduce the issue
This is the model:
[volume from DEM2.pdf](https://github.com/qgis/QGIS/files/9309187/volume.from.DEM2.pdf)
### Versions
<!--StartFragment--><!DOCTYPE HTML PUBLIC "-//W3C//DTD HTML 4.0//EN" "http://www.w3.org/TR/REC-html40/strict.dtd">
<html><head><meta http-equiv="Content-Type" content="text/html; charset=utf-8" /><style type="text/css">
p, li { white-space: pre-wrap; }
</style></head><body>
QGIS version | 3.24.1-Tisler | QGIS code revision | 5709b824
-- | -- | -- | --
Qt version | 5.15.2
Python version | 3.9.5
GDAL/OGR version | 3.4.1
PROJ version | 8.2.1
EPSG Registry database version | v10.041 (2021-12-03)
GEOS version | 3.10.2-CAPI-1.16.0
SQLite version | 3.37.2
PDAL version | 2.3.0
PostgreSQL client version | unknown
SpatiaLite version | 5.0.1
QWT version | 6.1.3
QScintilla2 version | 2.11.5
OS version | Windows 10 Version 2009
| | |
Active Python plugins
clipper | 1.2
dissolve_adjacent_polygons | 0.1
FreehandRasterGeoreferencer | 0.8.3
MagicWand-master | 1.3.1
mmqgis | 2021.9.10
networks | 2.6.8
NNJoin | 3.1.3
parcel_plugin | 3.8
processing_saga_nextgen | 0.0.7
profiletool | 4.2.2
Qgis2threejs | 2.7.1
qgis2web | 3.16.0
qgsAzimuth | 0.9.15
quick_map_services | 0.19.29
SelectWithin | 0.4
shapetools | 3.4.6
tc_tlag | 0.1
valuetool | 3.0.15
volume_calculation_tool | 0.4
processing | 2.12.99
sagaprovider | 2.12.99
</body></html><!--EndFragment-->QGIS version
3.24.1-Tisler
QGIS code revision
[5709b824](https://github.com/qgis/QGIS/commit/5709b824)
Qt version
5.15.2
Python version
3.9.5
GDAL/OGR version
3.4.1
PROJ version
8.2.1
EPSG Registry database version
v10.041 (2021-12-03)
GEOS version
3.10.2-CAPI-1.16.0
SQLite version
3.37.2
PDAL version
2.3.0
PostgreSQL client version
unknown
SpatiaLite version
5.0.1
QWT version
6.1.3
QScintilla2 version
2.11.5
OS version
Windows 10 Version 2009
Active Python plugins
clipper
1.2
dissolve_adjacent_polygons
0.1
FreehandRasterGeoreferencer
0.8.3
MagicWand-master
1.3.1
mmqgis
2021.9.10
networks
2.6.8
NNJoin
3.1.3
parcel_plugin
3.8
processing_saga_nextgen
0.0.7
profiletool
4.2.2
Qgis2threejs
2.7.1
qgis2web
3.16.0
qgsAzimuth
0.9.15
quick_map_services
0.19.29
SelectWithin
0.4
shapetools
3.4.6
tc_tlag
0.1
valuetool
3.0.15
volume_calculation_tool
0.4
processing
2.12.99
sagaprovider
2.12.99
### Supported QGIS version
- [X] I'm running a supported QGIS version according to the roadmap.
### New profile
- [ ] I tried with a new QGIS profile
### Additional context
_No response_
|
1.0
|
TIN mesh creation causes crash in Model Designer - ### What is the bug or the crash?
Using the TIN mesh creation algorithm outside of a model works. First run of the model works. 2nd and subsequent attempts give errors in this tool. Any attempt to edit this tool in the model causes QGIS to crash.
## Report Details
**Python Stack Trace**
```
Windows fatal exception: access violation
Current thread 0x00001db4 (most recent call first):
File "C:\PROGRA~1/QGIS32~1.1/apps/qgis/./python/plugins\processing\modeler\ModelerGraphicItem.py", line 157 in edit
if dlg.exec_():
File "C:\PROGRA~1/QGIS32~1.1/apps/qgis/./python/plugins\processing\modeler\ModelerGraphicItem.py", line 178 in editComponent
self.edit()
```
**Stack Trace**
No stack trace is available.
**QGIS Info**
QGIS Version: 3.24.1-Tisler
QGIS code revision: 5709b824
Compiled against Qt: 5.15.2
Running against Qt: 5.15.2
Compiled against GDAL: 3.4.1
Running against GDAL: 3.4.1
**System Info**
CPU Type: x86_64
Kernel Type: winnt
Kernel Version: 10.0.19044
### Steps to reproduce the issue
This is the model:
[volume from DEM2.pdf](https://github.com/qgis/QGIS/files/9309187/volume.from.DEM2.pdf)
### Versions
<!--StartFragment--><!DOCTYPE HTML PUBLIC "-//W3C//DTD HTML 4.0//EN" "http://www.w3.org/TR/REC-html40/strict.dtd">
<html><head><meta http-equiv="Content-Type" content="text/html; charset=utf-8" /><style type="text/css">
p, li { white-space: pre-wrap; }
</style></head><body>
QGIS version | 3.24.1-Tisler | QGIS code revision | 5709b824
-- | -- | -- | --
Qt version | 5.15.2
Python version | 3.9.5
GDAL/OGR version | 3.4.1
PROJ version | 8.2.1
EPSG Registry database version | v10.041 (2021-12-03)
GEOS version | 3.10.2-CAPI-1.16.0
SQLite version | 3.37.2
PDAL version | 2.3.0
PostgreSQL client version | unknown
SpatiaLite version | 5.0.1
QWT version | 6.1.3
QScintilla2 version | 2.11.5
OS version | Windows 10 Version 2009
| | |
Active Python plugins
clipper | 1.2
dissolve_adjacent_polygons | 0.1
FreehandRasterGeoreferencer | 0.8.3
MagicWand-master | 1.3.1
mmqgis | 2021.9.10
networks | 2.6.8
NNJoin | 3.1.3
parcel_plugin | 3.8
processing_saga_nextgen | 0.0.7
profiletool | 4.2.2
Qgis2threejs | 2.7.1
qgis2web | 3.16.0
qgsAzimuth | 0.9.15
quick_map_services | 0.19.29
SelectWithin | 0.4
shapetools | 3.4.6
tc_tlag | 0.1
valuetool | 3.0.15
volume_calculation_tool | 0.4
processing | 2.12.99
sagaprovider | 2.12.99
</body></html><!--EndFragment-->QGIS version
3.24.1-Tisler
QGIS code revision
[5709b824](https://github.com/qgis/QGIS/commit/5709b824)
Qt version
5.15.2
Python version
3.9.5
GDAL/OGR version
3.4.1
PROJ version
8.2.1
EPSG Registry database version
v10.041 (2021-12-03)
GEOS version
3.10.2-CAPI-1.16.0
SQLite version
3.37.2
PDAL version
2.3.0
PostgreSQL client version
unknown
SpatiaLite version
5.0.1
QWT version
6.1.3
QScintilla2 version
2.11.5
OS version
Windows 10 Version 2009
Active Python plugins
clipper
1.2
dissolve_adjacent_polygons
0.1
FreehandRasterGeoreferencer
0.8.3
MagicWand-master
1.3.1
mmqgis
2021.9.10
networks
2.6.8
NNJoin
3.1.3
parcel_plugin
3.8
processing_saga_nextgen
0.0.7
profiletool
4.2.2
Qgis2threejs
2.7.1
qgis2web
3.16.0
qgsAzimuth
0.9.15
quick_map_services
0.19.29
SelectWithin
0.4
shapetools
3.4.6
tc_tlag
0.1
valuetool
3.0.15
volume_calculation_tool
0.4
processing
2.12.99
sagaprovider
2.12.99
### Supported QGIS version
- [X] I'm running a supported QGIS version according to the roadmap.
### New profile
- [ ] I tried with a new QGIS profile
### Additional context
_No response_
|
process
|
tin mesh creation causes crash in model designer what is the bug or the crash using the tin mesh creation algorithm outside of a model works first run of the model works and subsequent attempts give errors in this tool any attempt to edit this tool in the model causes qgis to crash report details python stack trace windows fatal exception access violation current thread most recent call first file c progra apps qgis python plugins processing modeler modelergraphicitem py line in edit if dlg exec file c progra apps qgis python plugins processing modeler modelergraphicitem py line in editcomponent self edit stack trace no stack trace is available qgis info qgis version tisler qgis code revision compiled against qt running against qt compiled against gdal running against gdal system info cpu type kernel type winnt kernel version steps to reproduce the issue this is the model versions doctype html public dtd html en p li white space pre wrap qgis version tisler qgis code revision qt version python version gdal ogr version proj version epsg registry database version geos version capi sqlite version pdal version postgresql client version unknown spatialite version qwt version version os version windows version active python plugins clipper dissolve adjacent polygons freehandrastergeoreferencer magicwand master mmqgis networks nnjoin parcel plugin processing saga nextgen profiletool qgsazimuth quick map services selectwithin shapetools tc tlag valuetool volume calculation tool processing sagaprovider qgis version tisler qgis code revision qt version python version gdal ogr version proj version epsg registry database version geos version capi sqlite version pdal version postgresql client version unknown spatialite version qwt version version os version windows version active python plugins clipper dissolve adjacent polygons freehandrastergeoreferencer magicwand master mmqgis networks nnjoin parcel plugin processing saga nextgen profiletool qgsazimuth quick map services selectwithin shapetools tc tlag valuetool volume calculation tool processing sagaprovider supported qgis version i m running a supported qgis version according to the roadmap new profile i tried with a new qgis profile additional context no response
| 1
|
346,517
| 30,924,200,999
|
IssuesEvent
|
2023-08-06 09:27:15
|
warendy/warendy_BE
|
https://api.github.com/repos/warendy/warendy_BE
|
closed
|
TEST: 회원 CRUD
|
test
|
**목적**
의뢰의 인한 회원 기능을 테스트 합니다.
**테스트 코드**
- [ ]
- [ ] 회원 로그인
- [ ] 회원 정보수정
- [ ] 회원 soft delete
- [ ] 회원 조회
|
1.0
|
TEST: 회원 CRUD - **목적**
의뢰의 인한 회원 기능을 테스트 합니다.
**테스트 코드**
- [ ]
- [ ] 회원 로그인
- [ ] 회원 정보수정
- [ ] 회원 soft delete
- [ ] 회원 조회
|
non_process
|
test 회원 crud 목적 의뢰의 인한 회원 기능을 테스트 합니다 테스트 코드 회원 로그인 회원 정보수정 회원 soft delete 회원 조회
| 0
|
21,758
| 30,276,384,410
|
IssuesEvent
|
2023-07-07 20:06:08
|
gsoft-inc/ov-igloo-ui
|
https://api.github.com/repos/gsoft-inc/ov-igloo-ui
|
closed
|
[Bug]: TagPicker item error state is not working
|
bug in process
|
### Contact Details
_No response_
### What happened?
When I set hasError on a TagPicker selectedResults item, there's no red border :

Also, when the TagPicker with error prop set to true isFocused, the border is blue instead of red :

### Component
TagPicker
### Component Version
0.3.0
### Which browsers are you seeing the problem on?
_No response_
### Mobile Device
_No response_
### Relevant log output
_No response_
|
1.0
|
[Bug]: TagPicker item error state is not working - ### Contact Details
_No response_
### What happened?
When I set hasError on a TagPicker selectedResults item, there's no red border :

Also, when the TagPicker with error prop set to true isFocused, the border is blue instead of red :

### Component
TagPicker
### Component Version
0.3.0
### Which browsers are you seeing the problem on?
_No response_
### Mobile Device
_No response_
### Relevant log output
_No response_
|
process
|
tagpicker item error state is not working contact details no response what happened when i set haserror on a tagpicker selectedresults item there s no red border also when the tagpicker with error prop set to true isfocused the border is blue instead of red component tagpicker component version which browsers are you seeing the problem on no response mobile device no response relevant log output no response
| 1
|
10,481
| 13,252,912,270
|
IssuesEvent
|
2020-08-20 06:33:02
|
tikv/tikv
|
https://api.github.com/repos/tikv/tikv
|
closed
|
in_string can be very slow
|
component/performance sig/coprocessor type/bug
|
## Bug Report
In my PC, perform `IN(90 strings)` for 1000 records takes 0.2s without index.
Tracing:
```
2018/12/25 23:01:35.163 TRCE : >>>> tikv::coprocessor::dag::dag::DAGContext::handle_request()
2018/12/25 23:01:35.163 TRCE : >>>>>>>> tikv::coprocessor::dag::executor::aggregation::HashAggExecutor::next()
2018/12/25 23:01:35.163 TRCE : >>>>>>>>>>>> tikv::coprocessor::dag::executor::aggregation::HashAggExecutor::aggregate()
2018/12/25 23:01:35.163 TRCE : >>>>>>>>>>>>>>>> tikv::coprocessor::dag::executor::aggregation::AggExecutor::next()
2018/12/25 23:01:35.163 TRCE : >>>>>>>>>>>>>>>>>>>> tikv::coprocessor::dag::executor::selection::SelectionExecutor::next()
2018/12/25 23:01:35.163 TRCE : >>>>>>>>>>>>>>>>>>>>>>>> tikv::coprocessor::dag::executor::table_scan::TableScanExecutor < S >::next()
2018/12/25 23:01:35.163 TRCE : >>>>>>>>>>>>>>>>>>>>>>>>>>>> tikv::coprocessor::dag::executor::table_scan::TableScanExecutor < S >::get_row_from_range_scanner()
2018/12/25 23:01:35.163 TRCE : <<<<<<<<<<<<<<<<<<<<<<<<<<<< tikv::coprocessor::dag::executor::table_scan::TableScanExecutor < S >::get_row_from_range_scanner = Ok(None)
2018/12/25 23:01:35.164 TRCE : >>>>>>>>>>>>>>>>>>>>>>>>>>>> tikv::coprocessor::dag::executor::table_scan::TableScanExecutor < S >::new_scanner(range: start: "t\200\000\000\000\000\000\000\037_r\000\000\000\000\000\000\000\000" end: "t\200\000\000\000\000\000\000\037_r\377\377\377\377\377\377\377\377\000")
2018/12/25 23:01:35.164 TRCE : >>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> tikv::coprocessor::dag::executor::scanner::Scanner < S >::new(store: SnapshotStore { snapshot: RegionSnapshot, start_ts: 405209112914165761, isolation_level: SI, fill_cache: true }, scan_on: Table, desc: false, key_only: false, range: start: "t\200\000\000\000\000\000\000\037_r\000\000\000\000\000\000\000\000" end: "t\200\000\000\000\000\000\000\037_r\377\377\377\377\377\377\377\377\000")
2018/12/25 23:01:35.164 TRCE : <<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<< tikv::coprocessor::dag::executor::scanner::Scanner < S >::new = Ok(Scanner { desc: false, scan_on: Table, key_only: false, range: start: "t\200\000\000\000\000\000\000\037_r\000\000\000\000\000\000\000\000" end: "t\200\000\000\000\000\000\000\037_r\377\377\377\377\377\377\377\377\000" })
2018/12/25 23:01:35.164 TRCE : <<<<<<<<<<<<<<<<<<<<<<<<<<<< tikv::coprocessor::dag::executor::table_scan::TableScanExecutor < S >::new_scanner = Ok(Scanner { desc: false, scan_on: Table, key_only: false, range: start: "t\200\000\000\000\000\000\000\037_r\000\000\000\000\000\000\000\000" end: "t\200\000\000\000\000\000\000\037_r\377\377\377\377\377\377\377\377\000" })
2018/12/25 23:01:35.164 TRCE : >>>>>>>>>>>>>>>>>>>>>>>>>>>> tikv::coprocessor::dag::executor::table_scan::TableScanExecutor < S >::get_row_from_range_scanner()
2018/12/25 23:01:35.164 TRCE : >>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> tikv::coprocessor::dag::executor::scanner::Scanner < S >::next_row()
2018/12/25 23:01:35.164 TRCE : <<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<< tikv::coprocessor::dag::executor::scanner::Scanner < S >::next_row = Ok(Some(([116, 128, 0, 0, 0, 0, 0, 0, 31, 95, 114, 128, 0, 0, 0, 0, 0, 0, 2], [8, 4, 5, 191, 241, 153, 153, 153, 153, 153, 154, 8, 6, 5, 191, 243, 51, 51, 51, 51, 51, 51, 8, 8, 5, 192, 36, 0, 0, 0, 0, 0, 0, 8, 10, 5, 192, 36, 0, 0, 0, 0, 0, 0, 8, 12, 5, 191, 240, 0, 0, 0, 0, 0, 0, 8, 14, 5, 191, 240, 0, 0, 0, 0, 0, 0, 8, 16, 9, 0, 8, 18, 8, 24, 8, 20, 2, 16, 119, 120, 52, 103, 51, 120, 82, 101])))
2018/12/25 23:01:35.164 TRCE : >>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> tikv::coprocessor::codec::table::cut_row(data: [8, 4, 5, 191, 241, 153, 153, 153, 153, 153, 154, 8, 6, 5, 191, 243, 51, 51, 51, 51, 51, 51, 8, 8, 5, 192, 36, 0, 0, 0, 0, 0, 0, 8, 10, 5, 192, 36, 0, 0, 0, 0, 0, 0, 8, 12, 5, 191, 240, 0, 0, 0, 0, 0, 0, 8, 14, 5, 191, 240, 0, 0, 0, 0, 0, 0, 8, 16, 9, 0, 8, 18, 8, 24, 8, 20, 2, 16, 119, 120, 52, 103, 51, 120, 82, 101], cols: {9, 10})
2018/12/25 23:01:35.164 TRCE : >>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> tikv::coprocessor::codec::table::RowColsDict::new(cols: {9: RowColMeta { offset: 72, length: 2 }, 10: RowColMeta { offset: 76, length: 10 }}, value: [8, 4, 5, 191, 241, 153, 153, 153, 153, 153, 154, 8, 6, 5, 191, 243, 51, 51, 51, 51, 51, 51, 8, 8, 5, 192, 36, 0, 0, 0, 0, 0, 0, 8, 10, 5, 192, 36, 0, 0, 0, 0, 0, 0, 8, 12, 5, 191, 240, 0, 0, 0, 0, 0, 0, 8, 14, 5, 191, 240, 0, 0, 0, 0, 0, 0, 8, 16, 9, 0, 8, 18, 8, 24, 8, 20, 2, 16, 119, 120, 52, 103, 51, 120, 82, 101])
2018/12/25 23:01:35.165 TRCE : <<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<< tikv::coprocessor::codec::table::RowColsDict::new = RowColsDict { value: [8, 4, 5, 191, 241, 153, 153, 153, 153, 153, 154, 8, 6, 5, 191, 243, 51, 51, 51, 51, 51, 51, 8, 8, 5, 192, 36, 0, 0, 0, 0, 0, 0, 8, 10, 5, 192, 36, 0, 0, 0, 0, 0, 0, 8, 12, 5, 191, 240, 0, 0, 0, 0, 0, 0, 8, 14, 5, 191, 240, 0, 0, 0, 0, 0, 0, 8, 16, 9, 0, 8, 18, 8, 24, 8, 20, 2, 16, 119, 120, 52, 103, 51, 120, 82, 101], cols: {9: RowColMeta { offset: 72, length: 2 }, 10: RowColMeta { offset: 76, length: 10 }} }
2018/12/25 23:01:35.165 TRCE : <<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<< tikv::coprocessor::codec::table::cut_row = Ok(RowColsDict { value: [8, 4, 5, 191, 241, 153, 153, 153, 153, 153, 154, 8, 6, 5, 191, 243, 51, 51, 51, 51, 51, 51, 8, 8, 5, 192, 36, 0, 0, 0, 0, 0, 0, 8, 10, 5, 192, 36, 0, 0, 0, 0, 0, 0, 8, 12, 5, 191, 240, 0, 0, 0, 0, 0, 0, 8, 14, 5, 191, 240, 0, 0, 0, 0, 0, 0, 8, 16, 9, 0, 8, 18, 8, 24, 8, 20, 2, 16, 119, 120, 52, 103, 51, 120, 82, 101], cols: {9: RowColMeta { offset: 72, length: 2 }, 10: RowColMeta { offset: 76, length: 10 }} })
2018/12/25 23:01:35.165 TRCE : >>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> tikv::coprocessor::codec::table::decode_handle(encoded: [116, 128, 0, 0, 0, 0, 0, 0, 31, 95, 114, 128, 0, 0, 0, 0, 0, 0, 2])
2018/12/25 23:01:35.165 TRCE : <<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<< tikv::coprocessor::codec::table::decode_handle = Ok(2)
2018/12/25 23:01:35.165 TRCE : >>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> tikv::coprocessor::dag::executor::Row::origin(handle: 2, data: RowColsDict { value: [8, 4, 5, 191, 241, 153, 153, 153, 153, 153, 154, 8, 6, 5, 191, 243, 51, 51, 51, 51, 51, 51, 8, 8, 5, 192, 36, 0, 0, 0, 0, 0, 0, 8, 10, 5, 192, 36, 0, 0, 0, 0, 0, 0, 8, 12, 5, 191, 240, 0, 0, 0, 0, 0, 0, 8, 14, 5, 191, 240, 0, 0, 0, 0, 0, 0, 8, 16, 9, 0, 8, 18, 8, 24, 8, 20, 2, 16, 119, 120, 52, 103, 51, 120, 82, 101], cols: {9: RowColMeta { offset: 72, length: 2 }, 10: RowColMeta { offset: 76, length: 10 }} }, cols: [column_id: 9 tp: 8 collation: 63 columnLen: 20 decimal: 0 flag: 0 pk_handle: false, column_id: 10 tp: 15 collation: 46 columnLen: 8 decimal: 0 flag: 0 pk_handle: false])
2018/12/25 23:01:35.165 TRCE : >>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> tikv::coprocessor::dag::executor::OriginCols::new(handle: 2, data: RowColsDict { value: [8, 4, 5, 191, 241, 153, 153, 153, 153, 153, 154, 8, 6, 5, 191, 243, 51, 51, 51, 51, 51, 51, 8, 8, 5, 192, 36, 0, 0, 0, 0, 0, 0, 8, 10, 5, 192, 36, 0, 0, 0, 0, 0, 0, 8, 12, 5, 191, 240, 0, 0, 0, 0, 0, 0, 8, 14, 5, 191, 240, 0, 0, 0, 0, 0, 0, 8, 16, 9, 0, 8, 18, 8, 24, 8, 20, 2, 16, 119, 120, 52, 103, 51, 120, 82, 101], cols: {9: RowColMeta { offset: 72, length: 2 }, 10: RowColMeta { offset: 76, length: 10 }} }, cols: [column_id: 9 tp: 8 collation: 63 columnLen: 20 decimal: 0 flag: 0 pk_handle: false, column_id: 10 tp: 15 collation: 46 columnLen: 8 decimal: 0 flag: 0 pk_handle: false])
2018/12/25 23:01:35.165 TRCE : <<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<< tikv::coprocessor::dag::executor::OriginCols::new = OriginCols { handle: 2, data: RowColsDict { value: [8, 4, 5, 191, 241, 153, 153, 153, 153, 153, 154, 8, 6, 5, 191, 243, 51, 51, 51, 51, 51, 51, 8, 8, 5, 192, 36, 0, 0, 0, 0, 0, 0, 8, 10, 5, 192, 36, 0, 0, 0, 0, 0, 0, 8, 12, 5, 191, 240, 0, 0, 0, 0, 0, 0, 8, 14, 5, 191, 240, 0, 0, 0, 0, 0, 0, 8, 16, 9, 0, 8, 18, 8, 24, 8, 20, 2, 16, 119, 120, 52, 103, 51, 120, 82, 101], cols: {9: RowColMeta { offset: 72, length: 2 }, 10: RowColMeta { offset: 76, length: 10 }} }, cols: [column_id: 9 tp: 8 collation: 63 columnLen: 20 decimal: 0 flag: 0 pk_handle: false, column_id: 10 tp: 15 collation: 46 columnLen: 8 decimal: 0 flag: 0 pk_handle: false] }
2018/12/25 23:01:35.165 TRCE : <<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<< tikv::coprocessor::dag::executor::Row::origin = Origin(OriginCols { handle: 2, data: RowColsDict { value: [8, 4, 5, 191, 241, 153, 153, 153, 153, 153, 154, 8, 6, 5, 191, 243, 51, 51, 51, 51, 51, 51, 8, 8, 5, 192, 36, 0, 0, 0, 0, 0, 0, 8, 10, 5, 192, 36, 0, 0, 0, 0, 0, 0, 8, 12, 5, 191, 240, 0, 0, 0, 0, 0, 0, 8, 14, 5, 191, 240, 0, 0, 0, 0, 0, 0, 8, 16, 9, 0, 8, 18, 8, 24, 8, 20, 2, 16, 119, 120, 52, 103, 51, 120, 82, 101], cols: {9: RowColMeta { offset: 72, length: 2 }, 10: RowColMeta { offset: 76, length: 10 }} }, cols: [column_id: 9 tp: 8 collation: 63 columnLen: 20 decimal: 0 flag: 0 pk_handle: false, column_id: 10 tp: 15 collation: 46 columnLen: 8 decimal: 0 flag: 0 pk_handle: false] })
2018/12/25 23:01:35.165 TRCE : <<<<<<<<<<<<<<<<<<<<<<<<<<<< tikv::coprocessor::dag::executor::table_scan::TableScanExecutor < S >::get_row_from_range_scanner = Ok(Some(Origin(OriginCols { handle: 2, data: RowColsDict { value: [8, 4, 5, 191, 241, 153, 153, 153, 153, 153, 154, 8, 6, 5, 191, 243, 51, 51, 51, 51, 51, 51, 8, 8, 5, 192, 36, 0, 0, 0, 0, 0, 0, 8, 10, 5, 192, 36, 0, 0, 0, 0, 0, 0, 8, 12, 5, 191, 240, 0, 0, 0, 0, 0, 0, 8, 14, 5, 191, 240, 0, 0, 0, 0, 0, 0, 8, 16, 9, 0, 8, 18, 8, 24, 8, 20, 2, 16, 119, 120, 52, 103, 51, 120, 82, 101], cols: {9: RowColMeta { offset: 72, length: 2 }, 10: RowColMeta { offset: 76, length: 10 }} }, cols: [column_id: 9 tp: 8 collation: 63 columnLen: 20 decimal: 0 flag: 0 pk_handle: false, column_id: 10 tp: 15 collation: 46 columnLen: 8 decimal: 0 flag: 0 pk_handle: false] })))
2018/12/25 23:01:35.166 TRCE : <<<<<<<<<<<<<<<<<<<<<<<< tikv::coprocessor::dag::executor::table_scan::TableScanExecutor < S >::next = Ok(Some(Origin(OriginCols { handle: 2, data: RowColsDict { value: [8, 4, 5, 191, 241, 153, 153, 153, 153, 153, 154, 8, 6, 5, 191, 243, 51, 51, 51, 51, 51, 51, 8, 8, 5, 192, 36, 0, 0, 0, 0, 0, 0, 8, 10, 5, 192, 36, 0, 0, 0, 0, 0, 0, 8, 12, 5, 191, 240, 0, 0, 0, 0, 0, 0, 8, 14, 5, 191, 240, 0, 0, 0, 0, 0, 0, 8, 16, 9, 0, 8, 18, 8, 24, 8, 20, 2, 16, 119, 120, 52, 103, 51, 120, 82, 101], cols: {9: RowColMeta { offset: 72, length: 2 }, 10: RowColMeta { offset: 76, length: 10 }} }, cols: [column_id: 9 tp: 8 collation: 63 columnLen: 20 decimal: 0 flag: 0 pk_handle: false, column_id: 10 tp: 15 collation: 46 columnLen: 8 decimal: 0 flag: 0 pk_handle: false] })))
2018/12/25 23:01:35.166 TRCE : >>>>>>>>>>>>>>>>>>>>>>>> tikv::coprocessor::dag::executor::Row::take_origin()
2018/12/25 23:01:35.166 TRCE : <<<<<<<<<<<<<<<<<<<<<<<< tikv::coprocessor::dag::executor::Row::take_origin = OriginCols { handle: 2, data: RowColsDict { value: [8, 4, 5, 191, 241, 153, 153, 153, 153, 153, 154, 8, 6, 5, 191, 243, 51, 51, 51, 51, 51, 51, 8, 8, 5, 192, 36, 0, 0, 0, 0, 0, 0, 8, 10, 5, 192, 36, 0, 0, 0, 0, 0, 0, 8, 12, 5, 191, 240, 0, 0, 0, 0, 0, 0, 8, 14, 5, 191, 240, 0, 0, 0, 0, 0, 0, 8, 16, 9, 0, 8, 18, 8, 24, 8, 20, 2, 16, 119, 120, 52, 103, 51, 120, 82, 101], cols: {9: RowColMeta { offset: 72, length: 2 }, 10: RowColMeta { offset: 76, length: 10 }} }, cols: [column_id: 9 tp: 8 collation: 63 columnLen: 20 decimal: 0 flag: 0 pk_handle: false, column_id: 10 tp: 15 collation: 46 columnLen: 8 decimal: 0 flag: 0 pk_handle: false] }
2018/12/25 23:01:35.166 TRCE : >>>>>>>>>>>>>>>>>>>>>>>> tikv::coprocessor::dag::executor::OriginCols::inflate_cols_with_offsets(ctx: EvalContext { cfg: EvalConfig { tz: Tz::Name(Asia/Shanghai), ignore_truncate: false, truncate_as_warning: true, overflow_as_warning: true, in_insert_stmt: false, in_update_or_delete_stmt: false, in_select_stmt: true, pad_char_to_full_length: false, divided_by_zero_as_warning: false, max_warning_cnt: 64, sql_mode: 0, strict_sql_mode: false }, warnings: EvalWarnings { max_warning_cnt: 64, warning_cnt: 0, warnings: [] } }, offsets: [1])
2018/12/25 23:01:35.166 TRCE : >>>>>>>>>>>>>>>>>>>>>>>>>>>> tikv::coprocessor::codec::table::RowColsDict::get(key: 10)
2018/12/25 23:01:35.166 TRCE : <<<<<<<<<<<<<<<<<<<<<<<<<<<< tikv::coprocessor::codec::table::RowColsDict::get = Some([2, 16, 119, 120, 52, 103, 51, 120, 82, 101])
2018/12/25 23:01:35.166 TRCE : >>>>>>>>>>>>>>>>>>>>>>>>>>>> tikv::coprocessor::codec::table::decode_col_value(data: [2, 16, 119, 120, 52, 103, 51, 120, 82, 101], ctx: EvalContext { cfg: EvalConfig { tz: Tz::Name(Asia/Shanghai), ignore_truncate: false, truncate_as_warning: true, overflow_as_warning: true, in_insert_stmt: false, in_update_or_delete_stmt: false, in_select_stmt: true, pad_char_to_full_length: false, divided_by_zero_as_warning: false, max_warning_cnt: 64, sql_mode: 0, strict_sql_mode: false }, warnings: EvalWarnings { max_warning_cnt: 64, warning_cnt: 0, warnings: [] } }, col: column_id: 10 tp: 15 collation: 46 columnLen: 8 decimal: 0 flag: 0 pk_handle: false)
2018/12/25 23:01:35.166 TRCE : >>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> tikv::coprocessor::codec::table::unflatten(ctx: EvalContext { cfg: EvalConfig { tz: Tz::Name(Asia/Shanghai), ignore_truncate: false, truncate_as_warning: true, overflow_as_warning: true, in_insert_stmt: false, in_update_or_delete_stmt: false, in_select_stmt: true, pad_char_to_full_length: false, divided_by_zero_as_warning: false, max_warning_cnt: 64, sql_mode: 0, strict_sql_mode: false }, warnings: EvalWarnings { max_warning_cnt: 64, warning_cnt: 0, warnings: [] } }, datum: Bytes([119, 120, 52, 103, 51, 120, 82, 101]), field_type: column_id: 10 tp: 15 collation: 46 columnLen: 8 decimal: 0 flag: 0 pk_handle: false)
2018/12/25 23:01:35.167 TRCE : <<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<< tikv::coprocessor::codec::table::unflatten = Ok(Bytes([119, 120, 52, 103, 51, 120, 82, 101]))
2018/12/25 23:01:35.167 TRCE : <<<<<<<<<<<<<<<<<<<<<<<<<<<< tikv::coprocessor::codec::table::decode_col_value = Ok(Bytes([119, 120, 52, 103, 51, 120, 82, 101]))
2018/12/25 23:01:35.167 TRCE : <<<<<<<<<<<<<<<<<<<<<<<< tikv::coprocessor::dag::executor::OriginCols::inflate_cols_with_offsets = Ok([Null, Bytes([119, 120, 52, 103, 51, 120, 82, 101])])
2018/12/25 23:01:35.167 TRCE : >>>>>>>>>>>>>>>>>>>>>>>> tikv::coprocessor::dag::expr::Expression::eval(ctx: EvalContext { cfg: EvalConfig { tz: Tz::Name(Asia/Shanghai), ignore_truncate: false, truncate_as_warning: true, overflow_as_warning: true, in_insert_stmt: false, in_update_or_delete_stmt: false, in_select_stmt: true, pad_char_to_full_length: false, divided_by_zero_as_warning: false, max_warning_cnt: 64, sql_mode: 0, strict_sql_mode: false }, warnings: EvalWarnings { max_warning_cnt: 64, warning_cnt: 0, warnings: [] } }, row: [Null, Bytes([119, 120, 52, 103, 51, 120, 82, 101])])
2018/12/25 23:01:35.167 TRCE : >>>>>>>>>>>>>>>>>>>>>>>>>>>> tikv::coprocessor::dag::expr::scalar_function::ScalarFunc::eval(ctx: EvalContext { cfg: EvalConfig { tz: Tz::Name(Asia/Shanghai), ignore_truncate: false, truncate_as_warning: true, overflow_as_warning: true, in_insert_stmt: false, in_update_or_delete_stmt: false, in_select_stmt: true, pad_char_to_full_length: false, divided_by_zero_as_warning: false, max_warning_cnt: 64, sql_mode: 0, strict_sql_mode: false }, warnings: EvalWarnings { max_warning_cnt: 64, warning_cnt: 0, warnings: [] } }, row: [Null, Bytes([119, 120, 52, 103, 51, 120, 82, 101])])
2018/12/25 23:01:35.167 TRCE : >>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> tikv::coprocessor::dag::expr::builtin_compare::ScalarFunc::in_string(ctx: EvalContext { cfg: EvalConfig { tz: Tz::Name(Asia/Shanghai), ignore_truncate: false, truncate_as_warning: true, overflow_as_warning: true, in_insert_stmt: false, in_update_or_delete_stmt: false, in_select_stmt: true, pad_char_to_full_length: false, divided_by_zero_as_warning: false, max_warning_cnt: 64, sql_mode: 0, strict_sql_mode: false }, warnings: EvalWarnings { max_warning_cnt: 64, warning_cnt: 0, warnings: [] } }, row: [Null, Bytes([119, 120, 52, 103, 51, 120, 82, 101])])
2018/12/25 23:01:35.167 TRCE : >>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> tikv::coprocessor::dag::expr::Expression::eval_string(ctx: EvalContext { cfg: EvalConfig { tz: Tz::Name(Asia/Shanghai), ignore_truncate: false, truncate_as_warning: true, overflow_as_warning: true, in_insert_stmt: false, in_update_or_delete_stmt: false, in_select_stmt: true, pad_char_to_full_length: false, divided_by_zero_as_warning: false, max_warning_cnt: 64, sql_mode: 0, strict_sql_mode: false }, warnings: EvalWarnings { max_warning_cnt: 64, warning_cnt: 0, warnings: [] } }, row: [Null, Bytes([119, 120, 52, 103, 51, 120, 82, 101])])
2018/12/25 23:01:35.167 TRCE : >>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> tikv::coprocessor::dag::expr::column::Column::eval_string(ctx: EvalContext { cfg: EvalConfig { tz: Tz::Name(Asia/Shanghai), ignore_truncate: false, truncate_as_warning: true, overflow_as_warning: true, in_insert_stmt: false, in_update_or_delete_stmt: false, in_select_stmt: true, pad_char_to_full_length: false, divided_by_zero_as_warning: false, max_warning_cnt: 64, sql_mode: 0, strict_sql_mode: false }, warnings: EvalWarnings { max_warning_cnt: 64, warning_cnt: 0, warnings: [] } }, row: [Null, Bytes([119, 120, 52, 103, 51, 120, 82, 101])])
2018/12/25 23:01:35.167 TRCE : >>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> tikv::coprocessor::dag::expr::constant::Datum::as_string()
2018/12/25 23:01:35.168 TRCE : <<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<< tikv::coprocessor::dag::expr::constant::Datum::as_string = Ok(Some([119, 120, 52, 103, 51, 120, 82, 101]))
2018/12/25 23:01:35.168 TRCE : <<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<< tikv::coprocessor::dag::expr::column::Column::eval_string = Ok(Some([119, 120, 52, 103, 51, 120, 82, 101]))
2018/12/25 23:01:35.168 TRCE : <<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<< tikv::coprocessor::dag::expr::Expression::eval_string = Ok(Some([119, 120, 52, 103, 51, 120, 82, 101]))
2018/12/25 23:01:35.168 TRCE : >>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> tikv::coprocessor::dag::expr::Expression::eval_string(ctx: EvalContext { cfg: EvalConfig { tz: Tz::Name(Asia/Shanghai), ignore_truncate: false, truncate_as_warning: true, overflow_as_warning: true, in_insert_stmt: false, in_update_or_delete_stmt: false, in_select_stmt: true, pad_char_to_full_length: false, divided_by_zero_as_warning: false, max_warning_cnt: 64, sql_mode: 0, strict_sql_mode: false }, warnings: EvalWarnings { max_warning_cnt: 64, warning_cnt: 0, warnings: [] } }, row: [Null, Bytes([119, 120, 52, 103, 51, 120, 82, 101])])
2018/12/25 23:01:35.168 TRCE : >>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> tikv::coprocessor::dag::expr::constant::Constant::eval_string()
2018/12/25 23:01:35.168 TRCE : >>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> tikv::coprocessor::dag::expr::constant::Datum::as_string()
2018/12/25 23:01:35.168 TRCE : <<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<< tikv::coprocessor::dag::expr::constant::Datum::as_string = Ok(Some([119, 120, 52, 103, 51, 120, 103, 103]))
2018/12/25 23:01:35.169 TRCE : <<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<< tikv::coprocessor::dag::expr::constant::Constant::eval_string = Ok(Some([119, 120, 52, 103, 51, 120, 103, 103]))
2018/12/25 23:01:35.169 TRCE : <<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<< tikv::coprocessor::dag::expr::Expression::eval_string = Ok(Some([119, 120, 52, 103, 51, 120, 103, 103]))
2018/12/25 23:01:35.169 TRCE : >>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> tikv::coprocessor::dag::expr::Expression::eval_string(ctx: EvalContext { cfg: EvalConfig { tz: Tz::Name(Asia/Shanghai), ignore_truncate: false, truncate_as_warning: true, overflow_as_warning: true, in_insert_stmt: false, in_update_or_delete_stmt: false, in_select_stmt: true, pad_char_to_full_length: false, divided_by_zero_as_warning: false, max_warning_cnt: 64, sql_mode: 0, strict_sql_mode: false }, warnings: EvalWarnings { max_warning_cnt: 64, warning_cnt: 0, warnings: [] } }, row: [Null, Bytes([119, 120, 52, 103, 51, 120, 82, 101])])
2018/12/25 23:01:35.169 TRCE : >>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> tikv::coprocessor::dag::expr::constant::Constant::eval_string()
2018/12/25 23:01:35.169 TRCE : >>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> tikv::coprocessor::dag::expr::constant::Datum::as_string()
2018/12/25 23:01:35.169 TRCE : <<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<< tikv::coprocessor::dag::expr::constant::Datum::as_string = Ok(Some([119, 120, 52, 103, 51, 120, 103, 114]))
2018/12/25 23:01:35.169 TRCE : <<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<< tikv::coprocessor::dag::expr::constant::Constant::eval_string = Ok(Some([119, 120, 52, 103, 51, 120, 103, 114]))
2018/12/25 23:01:35.169 TRCE : <<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<< tikv::coprocessor::dag::expr::Expression::eval_string = Ok(Some([119, 120, 52, 103, 51, 120, 103, 114]))
2018/12/25 23:01:35.169 TRCE : >>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> tikv::coprocessor::dag::expr::Expression::eval_string(ctx: EvalContext { cfg: EvalConfig { tz: Tz::Name(Asia/Shanghai), ignore_truncate: false, truncate_as_warning: true, overflow_as_warning: true, in_insert_stmt: false, in_update_or_delete_stmt: false, in_select_stmt: true, pad_char_to_full_length: false, divided_by_zero_as_warning: false, max_warning_cnt: 64, sql_mode: 0, strict_sql_mode: false }, warnings: EvalWarnings { max_warning_cnt: 64, warning_cnt: 0, warnings: [] } }, row: [Null, Bytes([119, 120, 52, 103, 51, 120, 82, 101])])
2018/12/25 23:01:35.170 TRCE : >>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> tikv::coprocessor::dag::expr::constant::Constant::eval_string()
2018/12/25 23:01:35.170 TRCE : >>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> tikv::coprocessor::dag::expr::constant::Datum::as_string()
2018/12/25 23:01:35.170 TRCE : <<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<< tikv::coprocessor::dag::expr::constant::Datum::as_string = Ok(Some([119, 120, 52, 103, 51, 120, 103, 116]))
2018/12/25 23:01:35.170 TRCE : <<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<< tikv::coprocessor::dag::expr::constant::Constant::eval_string = Ok(Some([119, 120, 52, 103, 51, 120, 103, 116]))
2018/12/25 23:01:35.170 TRCE : <<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<< tikv::coprocessor::dag::expr::Expression::eval_string = Ok(Some([119, 120, 52, 103, 51, 120, 103, 116]))
2018/12/25 23:01:35.171 TRCE : >>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> tikv::coprocessor::dag::expr::Expression::eval_string(ctx: EvalContext { cfg: EvalConfig { tz: Tz::Name(Asia/Shanghai), ignore_truncate: false, truncate_as_warning: true, overflow_as_warning: true, in_insert_stmt: false, in_update_or_delete_stmt: false, in_select_stmt: true, pad_char_to_full_length: false, divided_by_zero_as_warning: false, max_warning_cnt: 64, sql_mode: 0, strict_sql_mode: false }, warnings: EvalWarnings { max_warning_cnt: 64, warning_cnt: 0, warnings: [] } }, row: [Null, Bytes([119, 120, 52, 103, 51, 120, 82, 101])])
2018/12/25 23:01:35.171 TRCE : >>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> tikv::coprocessor::dag::expr::constant::Constant::eval_string()
2018/12/25 23:01:35.171 TRCE : >>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> tikv::coprocessor::dag::expr::constant::Datum::as_string()
2018/12/25 23:01:35.171 TRCE : <<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<< tikv::coprocessor::dag::expr::constant::Datum::as_string = Ok(Some([119, 120, 52, 103, 51, 120, 103, 117]))
2018/12/25 23:01:35.172 TRCE : <<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<< tikv::coprocessor::dag::expr::constant::Constant::eval_string = Ok(Some([119, 120, 52, 103, 51, 120, 103, 117]))
2018/12/25 23:01:35.172 TRCE : <<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<< tikv::coprocessor::dag::expr::Expression::eval_string = Ok(Some([119, 120, 52, 103, 51, 120, 103, 117]))
2018/12/25 23:01:35.172 TRCE : >>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> tikv::coprocessor::dag::expr::Expression::eval_string(ctx: EvalContext { cfg: EvalConfig { tz: Tz::Name(Asia/Shanghai), ignore_truncate: false, truncate_as_warning: true, overflow_as_warning: true, in_insert_stmt: false, in_update_or_delete_stmt: false, in_select_stmt: true, pad_char_to_full_length: false, divided_by_zero_as_warning: false, max_warning_cnt: 64, sql_mode: 0, strict_sql_mode: false }, warnings: EvalWarnings { max_warning_cnt: 64, warning_cnt: 0, warnings: [] } }, row: [Null, Bytes([119, 120, 52, 103, 51, 120, 82, 101])])
2018/12/25 23:01:35.172 TRCE : >>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> tikv::coprocessor::dag::expr::constant::Constant::eval_string()
2018/12/25 23:01:35.173 TRCE : >>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> tikv::coprocessor::dag::expr::constant::Datum::as_string()
2018/12/25 23:01:35.173 TRCE : <<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<< tikv::coprocessor::dag::expr::constant::Datum::as_string = Ok(Some([119, 120, 52, 103, 51, 120, 103, 118]))
2018/12/25 23:01:35.173 TRCE : <<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<< tikv::coprocessor::dag::expr::constant::Constant::eval_string = Ok(Some([119, 120, 52, 103, 51, 120, 103, 118]))
2018/12/25 23:01:35.173 TRCE : <<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<< tikv::coprocessor::dag::expr::Expression::eval_string = Ok(Some([119, 120, 52, 103, 51, 120, 103, 118]))
2018/12/25 23:01:35.174 TRCE : >>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> tikv::coprocessor::dag::expr::Expression::eval_string(ctx: EvalContext { cfg: EvalConfig { tz: Tz::Name(Asia/Shanghai), ignore_truncate: false, truncate_as_warning: true, overflow_as_warning: true, in_insert_stmt: false, in_update_or_delete_stmt: false, in_select_stmt: true, pad_char_to_full_length: false, divided_by_zero_as_warning: false, max_warning_cnt: 64, sql_mode: 0, strict_sql_mode: false }, warnings: EvalWarnings { max_warning_cnt: 64, warning_cnt: 0, warnings: [] } }, row: [Null, Bytes([119, 120, 52, 103, 51, 120, 82, 101])])
2018/12/25 23:01:35.174 TRCE : >>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> tikv::coprocessor::dag::expr::constant::Constant::eval_string()
2018/12/25 23:01:35.174 TRCE : >>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> tikv::coprocessor::dag::expr::constant::Datum::as_string()
2018/12/25 23:01:35.174 TRCE : <<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<< tikv::coprocessor::dag::expr::constant::Datum::as_string = Ok(Some([119, 120, 52, 103, 51, 120, 103, 119]))
2018/12/25 23:01:35.174 TRCE : <<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<< tikv::coprocessor::dag::expr::constant::Constant::eval_string = Ok(Some([119, 120, 52, 103, 51, 120, 103, 119]))
2018/12/25 23:01:35.174 TRCE : <<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<< tikv::coprocessor::dag::expr::Expression::eval_string = Ok(Some([119, 120, 52, 103, 51, 120, 103, 119]))
2018/12/25 23:01:35.175 TRCE : >>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> tikv::coprocessor::dag::expr::Expression::eval_string(ctx: EvalContext { cfg: EvalConfig { tz: Tz::Name(Asia/Shanghai), ignore_truncate: false, truncate_as_warning: true, overflow_as_warning: true, in_insert_stmt: false, in_update_or_delete_stmt: false, in_select_stmt: true, pad_char_to_full_length: false, divided_by_zero_as_warning: false, max_warning_cnt: 64, sql_mode: 0, strict_sql_mode: false }, warnings: EvalWarnings { max_warning_cnt: 64, warning_cnt: 0, warnings: [] } }, row: [Null, Bytes([119, 120, 52, 103, 51, 120, 82, 101])])
2018/12/25 23:01:35.175 TRCE : >>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> tikv::coprocessor::dag::expr::constant::Constant::eval_string()
2018/12/25 23:01:35.175 TRCE : >>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> tikv::coprocessor::dag::expr::constant::Datum::as_string()
2018/12/25 23:01:35.175 TRCE : <<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<< tikv::coprocessor::dag::expr::constant::Datum::as_string = Ok(Some([119, 120, 52, 103, 51, 120, 103, 120]))
2018/12/25 23:01:35.175 TRCE : <<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<< tikv::coprocessor::dag::expr::constant::Constant::eval_string = Ok(Some([119, 120, 52, 103, 51, 120, 103, 120]))
2018/12/25 23:01:35.176 TRCE : <<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<< tikv::coprocessor::dag::expr::Expression::eval_string = Ok(Some([119, 120, 52, 103, 51, 120, 103, 120]))
2018/12/25 23:01:35.176 TRCE : >>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> tikv::coprocessor::dag::expr::Expression::eval_string(ctx: EvalContext { cfg: EvalConfig { tz: Tz::Name(Asia/Shanghai), ignore_truncate: false, truncate_as_warning: true, overflow_as_warning: true, in_insert_stmt: false, in_update_or_delete_stmt: false, in_select_stmt: true, pad_char_to_full_length: false, divided_by_zero_as_warning: false, max_warning_cnt: 64, sql_mode: 0, strict_sql_mode: false }, warnings: EvalWarnings { max_warning_cnt: 64, warning_cnt: 0, warnings: [] } }, row: [Null, Bytes([119, 120, 52, 103, 51, 120, 82, 101])])
2018/12/25 23:01:35.176 TRCE : >>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> tikv::coprocessor::dag::expr::constant::Constant::eval_string()
2018/12/25 23:01:35.176 TRCE : >>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> tikv::coprocessor::dag::expr::constant::Datum::as_string()
2018/12/25 23:01:35.176 TRCE : <<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<< tikv::coprocessor::dag::expr::constant::Datum::as_string = Ok(Some([119, 120, 52, 103, 51, 120, 103, 121]))
2018/12/25 23:01:35.176 TRCE : <<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<< tikv::coprocessor::dag::expr::constant::Constant::eval_string = Ok(Some([119, 120, 52, 103, 51, 120, 103, 121]))
2018/12/25 23:01:35.177 TRCE : <<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<< tikv::coprocessor::dag::expr::Expression::eval_string = Ok(Some([119, 120, 52, 103, 51, 120, 103, 121]))
2018/12/25 23:01:35.177 TRCE : >>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> tikv::coprocessor::dag::expr::Expression::eval_string(ctx: EvalContext { cfg: EvalConfig { tz: Tz::Name(Asia/Shanghai), ignore_truncate: false, truncate_as_warning: true, overflow_as_warning: true, in_insert_stmt: false, in_update_or_delete_stmt: false, in_select_stmt: true, pad_char_to_full_length: false, divided_by_zero_as_warning: false, max_warning_cnt: 64, sql_mode: 0, strict_sql_mode: false }, warnings: EvalWarnings { max_warning_cnt: 64, warning_cnt: 0, warnings: [] } }, row: [Null, Bytes([119, 120, 52, 103, 51, 120, 82, 101])])
2018/12/25 23:01:35.177 TRCE : >>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> tikv::coprocessor::dag::expr::constant::Constant::eval_string()
2018/12/25 23:01:35.177 TRCE : >>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> tikv::coprocessor::dag::expr::constant::Datum::as_string()
2018/12/25 23:01:35.177 TRCE : <<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<< tikv::coprocessor::dag::expr::constant::Datum::as_string = Ok(Some([119, 120, 52, 103, 51, 120, 103, 122]))
2018/12/25 23:01:35.177 TRCE : <<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<< tikv::coprocessor::dag::expr::constant::Constant::eval_string = Ok(Some([119, 120, 52, 103, 51, 120, 103, 122]))
2018/12/25 23:01:35.177 TRCE : <<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<< tikv::coprocessor::dag::expr::Expression::eval_string = Ok(Some([119, 120, 52, 103, 51, 120, 103, 122]))
2018/12/25 23:01:35.177 TRCE : >>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> tikv::coprocessor::dag::expr::Expression::eval_string(ctx: EvalContext { cfg: EvalConfig { tz: Tz::Name(Asia/Shanghai), ignore_truncate: false, truncate_as_warning: true, overflow_as_warning: true, in_insert_stmt: false, in_update_or_delete_stmt: false, in_select_stmt: true, pad_char_to_full_length: false, divided_by_zero_as_warning: false, max_warning_cnt: 64, sql_mode: 0, strict_sql_mode: false }, warnings: EvalWarnings { max_warning_cnt: 64, warning_cnt: 0, warnings: [] } }, row: [Null, Bytes([119, 120, 52, 103, 51, 120, 82, 101])])
2018/12/25 23:01:35.178 TRCE : >>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> tikv::coprocessor::dag::expr::constant::Constant::eval_string()
2018/12/25 23:01:35.178 TRCE : >>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> tikv::coprocessor::dag::expr::constant::Datum::as_string()
2018/12/25 23:01:35.178 TRCE : <<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<< tikv::coprocessor::dag::expr::constant::Datum::as_string = Ok(Some([119, 120, 52, 103, 51, 120, 109, 118]))
2018/12/25 23:01:35.178 TRCE : <<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<< tikv::coprocessor::dag::expr::constant::Constant::eval_string = Ok(Some([119, 120, 52, 103, 51, 120, 109, 118]))
2018/12/25 23:01:35.178 TRCE : <<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<< tikv::coprocessor::dag::expr::Expression::eval_string = Ok(Some([119, 120, 52, 103, 51, 120, 109, 118]))
2018/12/25 23:01:35.178 TRCE : >>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> tikv::coprocessor::dag::expr::Expression::eval_string(ctx: EvalContext { cfg: EvalConfig { tz: Tz::Name(Asia/Shanghai), ignore_truncate: false, truncate_as_warning: true, overflow_as_warning: true, in_insert_stmt: false, in_update_or_delete_stmt: false, in_select_stmt: true, pad_char_to_full_length: false, divided_by_zero_as_warning: false, max_warning_cnt: 64, sql_mode: 0, strict_sql_mode: false }, warnings: EvalWarnings { max_warning_cnt: 64, warning_cnt: 0, warnings: [] } }, row: [Null, Bytes([119, 120, 52, 103, 51, 120, 82, 101])])
2018/12/25 23:01:35.178 TRCE : >>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> tikv::coprocessor::dag::expr::constant::Constant::eval_string()
2018/12/25 23:01:35.178 TRCE : >>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> tikv::coprocessor::dag::expr::constant::Datum::as_string()
2018/12/25 23:01:35.179 TRCE : <<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<< tikv::coprocessor::dag::expr::constant::Datum::as_string = Ok(Some([119, 120, 52, 103, 51, 120, 109, 120]))
2018/12/25 23:01:35.179 TRCE : <<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<< tikv::coprocessor::dag::expr::constant::Constant::eval_string = Ok(Some([119, 120, 52, 103, 51, 120, 109, 120]))
2018/12/25 23:01:35.179 TRCE : <<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<< tikv::coprocessor::dag::expr::Expression::eval_string = Ok(Some([119, 120, 52, 103, 51, 120, 109, 120]))
2018/12/25 23:01:35.179 TRCE : >>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> tikv::coprocessor::dag::expr::Expression::eval_string(ctx: EvalContext { cfg: EvalConfig { tz: Tz::Name(Asia/Shanghai), ignore_truncate: false, truncate_as_warning: true, overflow_as_warning: true, in_insert_stmt: false, in_update_or_delete_stmt: false, in_select_stmt: true, pad_char_to_full_length: false, divided_by_zero_as_warning: false, max_warning_cnt: 64, sql_mode: 0, strict_sql_mode: false }, warnings: EvalWarnings { max_warning_cnt: 64, warning_cnt: 0, warnings: [] } }, row: [Null, Bytes([119, 120, 52, 103, 51, 120, 82, 101])])
2018/12/25 23:01:35.179 TRCE : >>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> tikv::coprocessor::dag::expr::constant::Constant::eval_string()
2018/12/25 23:01:35.179 TRCE : >>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> tikv::coprocessor::dag::expr::constant::Datum::as_string()
2018/12/25 23:01:35.179 TRCE : <<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<< tikv::coprocessor::dag::expr::constant::Datum::as_string = Ok(Some([119, 120, 52, 103, 51, 120, 109, 121]))
2018/12/25 23:01:35.179 TRCE : <<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<< tikv::coprocessor::dag::expr::constant::Constant::eval_string = Ok(Some([119, 120, 52, 103, 51, 120, 109, 121]))
2018/12/25 23:01:35.179 TRCE : <<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<< tikv::coprocessor::dag::expr::Expression::eval_string = Ok(Some([119, 120, 52, 103, 51, 120, 109, 121]))
2018/12/25 23:01:35.180 TRCE : >>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> tikv::coprocessor::dag::expr::Expression::eval_string(ctx: EvalContext { cfg: EvalConfig { tz: Tz::Name(Asia/Shanghai), ignore_truncate: false, truncate_as_warning: true, overflow_as_warning: true, in_insert_stmt: false, in_update_or_delete_stmt: false, in_select_stmt: true, pad_char_to_full_length: false, divided_by_zero_as_warning: false, max_warning_cnt: 64, sql_mode: 0, strict_sql_mode: false }, warnings: EvalWarnings { max_warning_cnt: 64, warning_cnt: 0, warnings: [] } }, row: [Null, Bytes([119, 120, 52, 103, 51, 120, 82, 101])])
2018/12/25 23:01:35.180 TRCE : >>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> tikv::coprocessor::dag::expr::constant::Constant::eval_string()
2018/12/25 23:01:35.180 TRCE : >>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> tikv::coprocessor::dag::expr::constant::Datum::as_string()
2018/12/25 23:01:35.180 TRCE : <<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<< tikv::coprocessor::dag::expr::constant::Datum::as_string = Ok(Some([119, 120, 52, 103, 51, 120, 109, 122]))
2018/12/25 23:01:35.180 TRCE : <<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<< tikv::coprocessor::dag::expr::constant::Constant::eval_string = Ok(Some([119, 120, 52, 103, 51, 120, 109, 122]))
2018/12/25 23:01:35.180 TRCE : <<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<< tikv::coprocessor::dag::expr::Expression::eval_string = Ok(Some([119, 120, 52, 103, 51, 120, 109, 122]))
2018/12/25 23:01:35.180 TRCE : >>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> tikv::coprocessor::dag::expr::Expression::eval_string(ctx: EvalContext { cfg: EvalConfig { tz: Tz::Name(Asia/Shanghai), ignore_truncate: false, truncate_as_warning: true, overflow_as_warning: true, in_insert_stmt: false, in_update_or_delete_stmt: false, in_select_stmt: true, pad_char_to_full_length: false, divided_by_zero_as_warning: false, max_warning_cnt: 64, sql_mode: 0, strict_sql_mode: false }, warnings: EvalWarnings { max_warning_cnt: 64, warning_cnt: 0, warnings: [] } }, row: [Null, Bytes([119, 120, 52, 103, 51, 120, 82, 101])])
2018/12/25 23:01:35.180 TRCE : >>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> tikv::coprocessor::dag::expr::constant::Constant::eval_string()
2018/12/25 23:01:35.180 TRCE : >>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> tikv::coprocessor::dag::expr::constant::Datum::as_string()
2018/12/25 23:01:35.181 TRCE : <<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<< tikv::coprocessor::dag::expr::constant::Datum::as_string = Ok(Some([119, 120, 52, 103, 51, 120, 110, 122]))
2018/12/25 23:01:35.181 TRCE : <<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<< tikv::coprocessor::dag::expr::constant::Constant::eval_string = Ok(Some([119, 120, 52, 103, 51, 120, 110, 122]))
2018/12/25 23:01:35.181 TRCE : <<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<< tikv::coprocessor::dag::expr::Expression::eval_string = Ok(Some([119, 120, 52, 103, 51, 120, 110, 122]))
2018/12/25 23:01:35.181 TRCE : >>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> tikv::coprocessor::dag::expr::Expression::eval_string(ctx: EvalContext { cfg: EvalConfig { tz: Tz::Name(Asia/Shanghai), ignore_truncate: false, truncate_as_warning: true, overflow_as_warning: true, in_insert_stmt: false, in_update_or_delete_stmt: false, in_select_stmt: true, pad_char_to_full_length: false, divided_by_zero_as_warning: false, max_warning_cnt: 64, sql_mode: 0, strict_sql_mode: false }, warnings: EvalWarnings { max_warning_cnt: 64, warning_cnt: 0, warnings: [] } }, row: [Null, Bytes([119, 120, 52, 103, 51, 120, 82, 101])])
2018/12/25 23:01:35.181 TRCE : >>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> tikv::coprocessor::dag::expr::constant::Constant::eval_string()
2018/12/25 23:01:35.181 TRCE : >>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> tikv::coprocessor::dag::expr::constant::Datum::as_string()
2018/12/25 23:01:35.181 TRCE : <<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<< tikv::coprocessor::dag::expr::constant::Datum::as_string = Ok(Some([119, 120, 52, 103, 51, 120, 112, 99]))
2018/12/25 23:01:35.181 TRCE : <<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<< tikv::coprocessor::dag::expr::constant::Constant::eval_string = Ok(Some([119, 120, 52, 103, 51, 120, 112, 99]))
2018/12/25 23:01:35.182 TRCE : <<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<< tikv::coprocessor::dag::expr::Expression::eval_string = Ok(Some([119, 120, 52, 103, 51, 120, 112, 99]))
2018/12/25 23:01:35.182 TRCE : >>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> tikv::coprocessor::dag::expr::Expression::eval_string(ctx: EvalContext { cfg: EvalConfig { tz: Tz::Name(Asia/Shanghai), ignore_truncate: false, truncate_as_warning: true, overflow_as_warning: true, in_insert_stmt: false, in_update_or_delete_stmt: false, in_select_stmt: true, pad_char_to_full_length: false, divided_by_zero_as_warning: false, max_warning_cnt: 64, sql_mode: 0, strict_sql_mode: false }, warnings: EvalWarnings { max_warning_cnt: 64, warning_cnt: 0, warnings: [] } }, row: [Null, Bytes([119, 120, 52, 103, 51, 120, 82, 101])])
2018/12/25 23:01:35.182 TRCE : >>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> tikv::coprocessor::dag::expr::constant::Constant::eval_string()
2018/12/25 23:01:35.182 TRCE : >>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> tikv::coprocessor::dag::expr::constant::Datum::as_string()
2018/12/25 23:01:35.182 TRCE : <<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<< tikv::coprocessor::dag::expr::constant::Datum::as_string = Ok(Some([119, 120, 52, 103, 51, 120, 112, 101]))
2018/12/25 23:01:35.182 TRCE : <<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<< tikv::coprocessor::dag::expr::constant::Constant::eval_string = Ok(Some([119, 120, 52, 103, 51, 120, 112, 101]))
2018/12/25 23:01:35.182 TRCE : <<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<< tikv::coprocessor::dag::expr::Expression::eval_string = Ok(Some([119, 120, 52, 103, 51, 120, 112, 101]))
2018/12/25 23:01:35.182 TRCE : >>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> tikv::coprocessor::dag::expr::Expression::eval_string(ctx: EvalContext { cfg: EvalConfig { tz: Tz::Name(Asia/Shanghai), ignore_truncate: false, truncate_as_warning: true, overflow_as_warning: true, in_insert_stmt: false, in_update_or_delete_stmt: false, in_select_stmt: true, pad_char_to_full_length: false, divided_by_zero_as_warning: false, max_warning_cnt: 64, sql_mode: 0, strict_sql_mode: false }, warnings: EvalWarnings { max_warning_cnt: 64, warning_cnt: 0, warnings: [] } }, row: [Null, Bytes([119, 120, 52, 103, 51, 120, 82, 101])])
2018/12/25 23:01:35.182 TRCE : >>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> tikv::coprocessor::dag::expr::constant::Constant::eval_string()
2018/12/25 23:01:35.183 TRCE : >>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> tikv::coprocessor::dag::expr::constant::Datum::as_string()
2018/12/25 23:01:35.183 TRCE : <<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<< tikv::coprocessor::dag::expr::constant::Datum::as_string = Ok(Some([119, 120, 52, 103, 51, 120, 112, 102]))
2018/12/25 23:01:35.183 TRCE : <<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<< tikv::coprocessor::dag::expr::constant::Constant::eval_string = Ok(Some([119, 120, 52, 103, 51, 120, 112, 102]))
2018/12/25 23:01:35.183 TRCE : <<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<< tikv::coprocessor::dag::expr::Expression::eval_string = Ok(Some([119, 120, 52, 103, 51, 120, 112, 102]))
2018/12/25 23:01:35.183 TRCE : >>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> tikv::coprocessor::dag::expr::Expression::eval_string(ctx: EvalContext { cfg: EvalConfig { tz: Tz::Name(Asia/Shanghai), ignore_truncate: false, truncate_as_warning: true, overflow_as_warning: true, in_insert_stmt: false, in_update_or_delete_stmt: false, in_select_stmt: true, pad_char_to_full_length: false, divided_by_zero_as_warning: false, max_warning_cnt: 64, sql_mode: 0, strict_sql_mode: false }, warnings: EvalWarnings { max_warning_cnt: 64, warning_cnt: 0, warnings: [] } }, row: [Null, Bytes([119, 120, 52, 103, 51, 120, 82, 101])])
2018/12/25 23:01:35.183 TRCE : >>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> tikv::coprocessor::dag::expr::constant::Constant::eval_string()
2018/12/25 23:01:35.183 TRCE : >>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> tikv::coprocessor::dag::expr::constant::Datum::as_string()
2018/12/25 23:01:35.183 TRCE : <<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<< tikv::coprocessor::dag::expr::constant::Datum::as_string = Ok(Some([119, 120, 52, 103, 51, 120, 112, 103]))
2018/12/25 23:01:35.184 TRCE : <<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<< tikv::coprocessor::dag::expr::constant::Constant::eval_string = Ok(Some([119, 120, 52, 103, 51, 120, 112, 103]))
2018/12/25 23:01:35.184 TRCE : <<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<< tikv::coprocessor::dag::expr::Expression::eval_string = Ok(Some([119, 120, 52, 103, 51, 120, 112, 103]))
2018/12/25 23:01:35.184 TRCE : >>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> tikv::coprocessor::dag::expr::Expression::eval_string(ctx: EvalContext { cfg: EvalConfig { tz: Tz::Name(Asia/Shanghai), ignore_truncate: false, truncate_as_warning: true, overflow_as_warning: true, in_insert_stmt: false, in_update_or_delete_stmt: false, in_select_stmt: true, pad_char_to_full_length: false, divided_by_zero_as_warning: false, max_warning_cnt: 64, sql_mode: 0, strict_sql_mode: false }, warnings: EvalWarnings { max_warning_cnt: 64, warning_cnt: 0, warnings: [] } }, row: [Null, Bytes([119, 120, 52, 103, 51, 120, 82, 101])])
2018/12/25 23:01:35.184 TRCE : >>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> tikv::coprocessor::dag::expr::constant::Constant::eval_string()
2018/12/25 23:01:35.184 TRCE : >>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> tikv::coprocessor::dag::expr::constant::Datum::as_string()
2018/12/25 23:01:35.184 TRCE : <<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<< tikv::coprocessor::dag::expr::constant::Datum::as_string = Ok(Some([119, 120, 52, 103, 51, 120, 112, 107]))
2018/12/25 23:01:35.184 TRCE : <<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<< tikv::coprocessor::dag::expr::constant::Constant::eval_string = Ok(Some([119, 120, 52, 103, 51, 120, 112, 107]))
2018/12/25 23:01:35.184 TRCE : <<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<< tikv::coprocessor::dag::expr::Expression::eval_string = Ok(Some([119, 120, 52, 103, 51, 120, 112, 107]))
2018/12/25 23:01:35.184 TRCE : >>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> tikv::coprocessor::dag::expr::Expression::eval_string(ctx: EvalContext { cfg: EvalConfig { tz: Tz::Name(Asia/Shanghai), ignore_truncate: false, truncate_as_warning: true, overflow_as_warning: true, in_insert_stmt: false, in_update_or_delete_stmt: false, in_select_stmt: true, pad_char_to_full_length: false, divided_by_zero_as_warning: false, max_warning_cnt: 64, sql_mode: 0, strict_sql_mode: false }, warnings: EvalWarnings { max_warning_cnt: 64, warning_cnt: 0, warnings: [] } }, row: [Null, Bytes([119, 120, 52, 103, 51, 120, 82, 101])])
2018/12/25 23:01:35.185 TRCE : >>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> tikv::coprocessor::dag::expr::constant::Constant::eval_string()
2018/12/25 23:01:35.185 TRCE : >>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> tikv::coprocessor::dag::expr::constant::Datum::as_string()
2018/12/25 23:01:35.185 TRCE : <<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<< tikv::coprocessor::dag::expr::constant::Datum::as_string = Ok(Some([119, 120, 52, 103, 51, 120, 112, 109]))
2018/12/25 23:01:35.185 TRCE : <<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<< tikv::coprocessor::dag::expr::constant::Constant::eval_string = Ok(Some([119, 120, 52, 103, 51, 120, 112, 109]))
2018/12/25 23:01:35.185 TRCE : <<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<< tikv::coprocessor::dag::expr::Expression::eval_string = Ok(Some([119, 120, 52, 103, 51, 120, 112, 109]))
…………
…………
…………
```
|
1.0
|
in_string can be very slow - ## Bug Report
In my PC, perform `IN(90 strings)` for 1000 records takes 0.2s without index.
Tracing:
```
2018/12/25 23:01:35.163 TRCE : >>>> tikv::coprocessor::dag::dag::DAGContext::handle_request()
2018/12/25 23:01:35.163 TRCE : >>>>>>>> tikv::coprocessor::dag::executor::aggregation::HashAggExecutor::next()
2018/12/25 23:01:35.163 TRCE : >>>>>>>>>>>> tikv::coprocessor::dag::executor::aggregation::HashAggExecutor::aggregate()
2018/12/25 23:01:35.163 TRCE : >>>>>>>>>>>>>>>> tikv::coprocessor::dag::executor::aggregation::AggExecutor::next()
2018/12/25 23:01:35.163 TRCE : >>>>>>>>>>>>>>>>>>>> tikv::coprocessor::dag::executor::selection::SelectionExecutor::next()
2018/12/25 23:01:35.163 TRCE : >>>>>>>>>>>>>>>>>>>>>>>> tikv::coprocessor::dag::executor::table_scan::TableScanExecutor < S >::next()
2018/12/25 23:01:35.163 TRCE : >>>>>>>>>>>>>>>>>>>>>>>>>>>> tikv::coprocessor::dag::executor::table_scan::TableScanExecutor < S >::get_row_from_range_scanner()
2018/12/25 23:01:35.163 TRCE : <<<<<<<<<<<<<<<<<<<<<<<<<<<< tikv::coprocessor::dag::executor::table_scan::TableScanExecutor < S >::get_row_from_range_scanner = Ok(None)
2018/12/25 23:01:35.164 TRCE : >>>>>>>>>>>>>>>>>>>>>>>>>>>> tikv::coprocessor::dag::executor::table_scan::TableScanExecutor < S >::new_scanner(range: start: "t\200\000\000\000\000\000\000\037_r\000\000\000\000\000\000\000\000" end: "t\200\000\000\000\000\000\000\037_r\377\377\377\377\377\377\377\377\000")
2018/12/25 23:01:35.164 TRCE : >>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> tikv::coprocessor::dag::executor::scanner::Scanner < S >::new(store: SnapshotStore { snapshot: RegionSnapshot, start_ts: 405209112914165761, isolation_level: SI, fill_cache: true }, scan_on: Table, desc: false, key_only: false, range: start: "t\200\000\000\000\000\000\000\037_r\000\000\000\000\000\000\000\000" end: "t\200\000\000\000\000\000\000\037_r\377\377\377\377\377\377\377\377\000")
2018/12/25 23:01:35.164 TRCE : <<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<< tikv::coprocessor::dag::executor::scanner::Scanner < S >::new = Ok(Scanner { desc: false, scan_on: Table, key_only: false, range: start: "t\200\000\000\000\000\000\000\037_r\000\000\000\000\000\000\000\000" end: "t\200\000\000\000\000\000\000\037_r\377\377\377\377\377\377\377\377\000" })
2018/12/25 23:01:35.164 TRCE : <<<<<<<<<<<<<<<<<<<<<<<<<<<< tikv::coprocessor::dag::executor::table_scan::TableScanExecutor < S >::new_scanner = Ok(Scanner { desc: false, scan_on: Table, key_only: false, range: start: "t\200\000\000\000\000\000\000\037_r\000\000\000\000\000\000\000\000" end: "t\200\000\000\000\000\000\000\037_r\377\377\377\377\377\377\377\377\000" })
2018/12/25 23:01:35.164 TRCE : >>>>>>>>>>>>>>>>>>>>>>>>>>>> tikv::coprocessor::dag::executor::table_scan::TableScanExecutor < S >::get_row_from_range_scanner()
2018/12/25 23:01:35.164 TRCE : >>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> tikv::coprocessor::dag::executor::scanner::Scanner < S >::next_row()
2018/12/25 23:01:35.164 TRCE : <<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<< tikv::coprocessor::dag::executor::scanner::Scanner < S >::next_row = Ok(Some(([116, 128, 0, 0, 0, 0, 0, 0, 31, 95, 114, 128, 0, 0, 0, 0, 0, 0, 2], [8, 4, 5, 191, 241, 153, 153, 153, 153, 153, 154, 8, 6, 5, 191, 243, 51, 51, 51, 51, 51, 51, 8, 8, 5, 192, 36, 0, 0, 0, 0, 0, 0, 8, 10, 5, 192, 36, 0, 0, 0, 0, 0, 0, 8, 12, 5, 191, 240, 0, 0, 0, 0, 0, 0, 8, 14, 5, 191, 240, 0, 0, 0, 0, 0, 0, 8, 16, 9, 0, 8, 18, 8, 24, 8, 20, 2, 16, 119, 120, 52, 103, 51, 120, 82, 101])))
2018/12/25 23:01:35.164 TRCE : >>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> tikv::coprocessor::codec::table::cut_row(data: [8, 4, 5, 191, 241, 153, 153, 153, 153, 153, 154, 8, 6, 5, 191, 243, 51, 51, 51, 51, 51, 51, 8, 8, 5, 192, 36, 0, 0, 0, 0, 0, 0, 8, 10, 5, 192, 36, 0, 0, 0, 0, 0, 0, 8, 12, 5, 191, 240, 0, 0, 0, 0, 0, 0, 8, 14, 5, 191, 240, 0, 0, 0, 0, 0, 0, 8, 16, 9, 0, 8, 18, 8, 24, 8, 20, 2, 16, 119, 120, 52, 103, 51, 120, 82, 101], cols: {9, 10})
2018/12/25 23:01:35.164 TRCE : >>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> tikv::coprocessor::codec::table::RowColsDict::new(cols: {9: RowColMeta { offset: 72, length: 2 }, 10: RowColMeta { offset: 76, length: 10 }}, value: [8, 4, 5, 191, 241, 153, 153, 153, 153, 153, 154, 8, 6, 5, 191, 243, 51, 51, 51, 51, 51, 51, 8, 8, 5, 192, 36, 0, 0, 0, 0, 0, 0, 8, 10, 5, 192, 36, 0, 0, 0, 0, 0, 0, 8, 12, 5, 191, 240, 0, 0, 0, 0, 0, 0, 8, 14, 5, 191, 240, 0, 0, 0, 0, 0, 0, 8, 16, 9, 0, 8, 18, 8, 24, 8, 20, 2, 16, 119, 120, 52, 103, 51, 120, 82, 101])
2018/12/25 23:01:35.165 TRCE : <<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<< tikv::coprocessor::codec::table::RowColsDict::new = RowColsDict { value: [8, 4, 5, 191, 241, 153, 153, 153, 153, 153, 154, 8, 6, 5, 191, 243, 51, 51, 51, 51, 51, 51, 8, 8, 5, 192, 36, 0, 0, 0, 0, 0, 0, 8, 10, 5, 192, 36, 0, 0, 0, 0, 0, 0, 8, 12, 5, 191, 240, 0, 0, 0, 0, 0, 0, 8, 14, 5, 191, 240, 0, 0, 0, 0, 0, 0, 8, 16, 9, 0, 8, 18, 8, 24, 8, 20, 2, 16, 119, 120, 52, 103, 51, 120, 82, 101], cols: {9: RowColMeta { offset: 72, length: 2 }, 10: RowColMeta { offset: 76, length: 10 }} }
2018/12/25 23:01:35.165 TRCE : <<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<< tikv::coprocessor::codec::table::cut_row = Ok(RowColsDict { value: [8, 4, 5, 191, 241, 153, 153, 153, 153, 153, 154, 8, 6, 5, 191, 243, 51, 51, 51, 51, 51, 51, 8, 8, 5, 192, 36, 0, 0, 0, 0, 0, 0, 8, 10, 5, 192, 36, 0, 0, 0, 0, 0, 0, 8, 12, 5, 191, 240, 0, 0, 0, 0, 0, 0, 8, 14, 5, 191, 240, 0, 0, 0, 0, 0, 0, 8, 16, 9, 0, 8, 18, 8, 24, 8, 20, 2, 16, 119, 120, 52, 103, 51, 120, 82, 101], cols: {9: RowColMeta { offset: 72, length: 2 }, 10: RowColMeta { offset: 76, length: 10 }} })
2018/12/25 23:01:35.165 TRCE : >>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> tikv::coprocessor::codec::table::decode_handle(encoded: [116, 128, 0, 0, 0, 0, 0, 0, 31, 95, 114, 128, 0, 0, 0, 0, 0, 0, 2])
2018/12/25 23:01:35.165 TRCE : <<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<< tikv::coprocessor::codec::table::decode_handle = Ok(2)
2018/12/25 23:01:35.165 TRCE : >>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> tikv::coprocessor::dag::executor::Row::origin(handle: 2, data: RowColsDict { value: [8, 4, 5, 191, 241, 153, 153, 153, 153, 153, 154, 8, 6, 5, 191, 243, 51, 51, 51, 51, 51, 51, 8, 8, 5, 192, 36, 0, 0, 0, 0, 0, 0, 8, 10, 5, 192, 36, 0, 0, 0, 0, 0, 0, 8, 12, 5, 191, 240, 0, 0, 0, 0, 0, 0, 8, 14, 5, 191, 240, 0, 0, 0, 0, 0, 0, 8, 16, 9, 0, 8, 18, 8, 24, 8, 20, 2, 16, 119, 120, 52, 103, 51, 120, 82, 101], cols: {9: RowColMeta { offset: 72, length: 2 }, 10: RowColMeta { offset: 76, length: 10 }} }, cols: [column_id: 9 tp: 8 collation: 63 columnLen: 20 decimal: 0 flag: 0 pk_handle: false, column_id: 10 tp: 15 collation: 46 columnLen: 8 decimal: 0 flag: 0 pk_handle: false])
2018/12/25 23:01:35.165 TRCE : >>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> tikv::coprocessor::dag::executor::OriginCols::new(handle: 2, data: RowColsDict { value: [8, 4, 5, 191, 241, 153, 153, 153, 153, 153, 154, 8, 6, 5, 191, 243, 51, 51, 51, 51, 51, 51, 8, 8, 5, 192, 36, 0, 0, 0, 0, 0, 0, 8, 10, 5, 192, 36, 0, 0, 0, 0, 0, 0, 8, 12, 5, 191, 240, 0, 0, 0, 0, 0, 0, 8, 14, 5, 191, 240, 0, 0, 0, 0, 0, 0, 8, 16, 9, 0, 8, 18, 8, 24, 8, 20, 2, 16, 119, 120, 52, 103, 51, 120, 82, 101], cols: {9: RowColMeta { offset: 72, length: 2 }, 10: RowColMeta { offset: 76, length: 10 }} }, cols: [column_id: 9 tp: 8 collation: 63 columnLen: 20 decimal: 0 flag: 0 pk_handle: false, column_id: 10 tp: 15 collation: 46 columnLen: 8 decimal: 0 flag: 0 pk_handle: false])
2018/12/25 23:01:35.165 TRCE : <<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<< tikv::coprocessor::dag::executor::OriginCols::new = OriginCols { handle: 2, data: RowColsDict { value: [8, 4, 5, 191, 241, 153, 153, 153, 153, 153, 154, 8, 6, 5, 191, 243, 51, 51, 51, 51, 51, 51, 8, 8, 5, 192, 36, 0, 0, 0, 0, 0, 0, 8, 10, 5, 192, 36, 0, 0, 0, 0, 0, 0, 8, 12, 5, 191, 240, 0, 0, 0, 0, 0, 0, 8, 14, 5, 191, 240, 0, 0, 0, 0, 0, 0, 8, 16, 9, 0, 8, 18, 8, 24, 8, 20, 2, 16, 119, 120, 52, 103, 51, 120, 82, 101], cols: {9: RowColMeta { offset: 72, length: 2 }, 10: RowColMeta { offset: 76, length: 10 }} }, cols: [column_id: 9 tp: 8 collation: 63 columnLen: 20 decimal: 0 flag: 0 pk_handle: false, column_id: 10 tp: 15 collation: 46 columnLen: 8 decimal: 0 flag: 0 pk_handle: false] }
2018/12/25 23:01:35.165 TRCE : <<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<< tikv::coprocessor::dag::executor::Row::origin = Origin(OriginCols { handle: 2, data: RowColsDict { value: [8, 4, 5, 191, 241, 153, 153, 153, 153, 153, 154, 8, 6, 5, 191, 243, 51, 51, 51, 51, 51, 51, 8, 8, 5, 192, 36, 0, 0, 0, 0, 0, 0, 8, 10, 5, 192, 36, 0, 0, 0, 0, 0, 0, 8, 12, 5, 191, 240, 0, 0, 0, 0, 0, 0, 8, 14, 5, 191, 240, 0, 0, 0, 0, 0, 0, 8, 16, 9, 0, 8, 18, 8, 24, 8, 20, 2, 16, 119, 120, 52, 103, 51, 120, 82, 101], cols: {9: RowColMeta { offset: 72, length: 2 }, 10: RowColMeta { offset: 76, length: 10 }} }, cols: [column_id: 9 tp: 8 collation: 63 columnLen: 20 decimal: 0 flag: 0 pk_handle: false, column_id: 10 tp: 15 collation: 46 columnLen: 8 decimal: 0 flag: 0 pk_handle: false] })
2018/12/25 23:01:35.165 TRCE : <<<<<<<<<<<<<<<<<<<<<<<<<<<< tikv::coprocessor::dag::executor::table_scan::TableScanExecutor < S >::get_row_from_range_scanner = Ok(Some(Origin(OriginCols { handle: 2, data: RowColsDict { value: [8, 4, 5, 191, 241, 153, 153, 153, 153, 153, 154, 8, 6, 5, 191, 243, 51, 51, 51, 51, 51, 51, 8, 8, 5, 192, 36, 0, 0, 0, 0, 0, 0, 8, 10, 5, 192, 36, 0, 0, 0, 0, 0, 0, 8, 12, 5, 191, 240, 0, 0, 0, 0, 0, 0, 8, 14, 5, 191, 240, 0, 0, 0, 0, 0, 0, 8, 16, 9, 0, 8, 18, 8, 24, 8, 20, 2, 16, 119, 120, 52, 103, 51, 120, 82, 101], cols: {9: RowColMeta { offset: 72, length: 2 }, 10: RowColMeta { offset: 76, length: 10 }} }, cols: [column_id: 9 tp: 8 collation: 63 columnLen: 20 decimal: 0 flag: 0 pk_handle: false, column_id: 10 tp: 15 collation: 46 columnLen: 8 decimal: 0 flag: 0 pk_handle: false] })))
2018/12/25 23:01:35.166 TRCE : <<<<<<<<<<<<<<<<<<<<<<<< tikv::coprocessor::dag::executor::table_scan::TableScanExecutor < S >::next = Ok(Some(Origin(OriginCols { handle: 2, data: RowColsDict { value: [8, 4, 5, 191, 241, 153, 153, 153, 153, 153, 154, 8, 6, 5, 191, 243, 51, 51, 51, 51, 51, 51, 8, 8, 5, 192, 36, 0, 0, 0, 0, 0, 0, 8, 10, 5, 192, 36, 0, 0, 0, 0, 0, 0, 8, 12, 5, 191, 240, 0, 0, 0, 0, 0, 0, 8, 14, 5, 191, 240, 0, 0, 0, 0, 0, 0, 8, 16, 9, 0, 8, 18, 8, 24, 8, 20, 2, 16, 119, 120, 52, 103, 51, 120, 82, 101], cols: {9: RowColMeta { offset: 72, length: 2 }, 10: RowColMeta { offset: 76, length: 10 }} }, cols: [column_id: 9 tp: 8 collation: 63 columnLen: 20 decimal: 0 flag: 0 pk_handle: false, column_id: 10 tp: 15 collation: 46 columnLen: 8 decimal: 0 flag: 0 pk_handle: false] })))
2018/12/25 23:01:35.166 TRCE : >>>>>>>>>>>>>>>>>>>>>>>> tikv::coprocessor::dag::executor::Row::take_origin()
2018/12/25 23:01:35.166 TRCE : <<<<<<<<<<<<<<<<<<<<<<<< tikv::coprocessor::dag::executor::Row::take_origin = OriginCols { handle: 2, data: RowColsDict { value: [8, 4, 5, 191, 241, 153, 153, 153, 153, 153, 154, 8, 6, 5, 191, 243, 51, 51, 51, 51, 51, 51, 8, 8, 5, 192, 36, 0, 0, 0, 0, 0, 0, 8, 10, 5, 192, 36, 0, 0, 0, 0, 0, 0, 8, 12, 5, 191, 240, 0, 0, 0, 0, 0, 0, 8, 14, 5, 191, 240, 0, 0, 0, 0, 0, 0, 8, 16, 9, 0, 8, 18, 8, 24, 8, 20, 2, 16, 119, 120, 52, 103, 51, 120, 82, 101], cols: {9: RowColMeta { offset: 72, length: 2 }, 10: RowColMeta { offset: 76, length: 10 }} }, cols: [column_id: 9 tp: 8 collation: 63 columnLen: 20 decimal: 0 flag: 0 pk_handle: false, column_id: 10 tp: 15 collation: 46 columnLen: 8 decimal: 0 flag: 0 pk_handle: false] }
2018/12/25 23:01:35.166 TRCE : >>>>>>>>>>>>>>>>>>>>>>>> tikv::coprocessor::dag::executor::OriginCols::inflate_cols_with_offsets(ctx: EvalContext { cfg: EvalConfig { tz: Tz::Name(Asia/Shanghai), ignore_truncate: false, truncate_as_warning: true, overflow_as_warning: true, in_insert_stmt: false, in_update_or_delete_stmt: false, in_select_stmt: true, pad_char_to_full_length: false, divided_by_zero_as_warning: false, max_warning_cnt: 64, sql_mode: 0, strict_sql_mode: false }, warnings: EvalWarnings { max_warning_cnt: 64, warning_cnt: 0, warnings: [] } }, offsets: [1])
2018/12/25 23:01:35.166 TRCE : >>>>>>>>>>>>>>>>>>>>>>>>>>>> tikv::coprocessor::codec::table::RowColsDict::get(key: 10)
2018/12/25 23:01:35.166 TRCE : <<<<<<<<<<<<<<<<<<<<<<<<<<<< tikv::coprocessor::codec::table::RowColsDict::get = Some([2, 16, 119, 120, 52, 103, 51, 120, 82, 101])
2018/12/25 23:01:35.166 TRCE : >>>>>>>>>>>>>>>>>>>>>>>>>>>> tikv::coprocessor::codec::table::decode_col_value(data: [2, 16, 119, 120, 52, 103, 51, 120, 82, 101], ctx: EvalContext { cfg: EvalConfig { tz: Tz::Name(Asia/Shanghai), ignore_truncate: false, truncate_as_warning: true, overflow_as_warning: true, in_insert_stmt: false, in_update_or_delete_stmt: false, in_select_stmt: true, pad_char_to_full_length: false, divided_by_zero_as_warning: false, max_warning_cnt: 64, sql_mode: 0, strict_sql_mode: false }, warnings: EvalWarnings { max_warning_cnt: 64, warning_cnt: 0, warnings: [] } }, col: column_id: 10 tp: 15 collation: 46 columnLen: 8 decimal: 0 flag: 0 pk_handle: false)
2018/12/25 23:01:35.166 TRCE : >>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> tikv::coprocessor::codec::table::unflatten(ctx: EvalContext { cfg: EvalConfig { tz: Tz::Name(Asia/Shanghai), ignore_truncate: false, truncate_as_warning: true, overflow_as_warning: true, in_insert_stmt: false, in_update_or_delete_stmt: false, in_select_stmt: true, pad_char_to_full_length: false, divided_by_zero_as_warning: false, max_warning_cnt: 64, sql_mode: 0, strict_sql_mode: false }, warnings: EvalWarnings { max_warning_cnt: 64, warning_cnt: 0, warnings: [] } }, datum: Bytes([119, 120, 52, 103, 51, 120, 82, 101]), field_type: column_id: 10 tp: 15 collation: 46 columnLen: 8 decimal: 0 flag: 0 pk_handle: false)
2018/12/25 23:01:35.167 TRCE : <<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<< tikv::coprocessor::codec::table::unflatten = Ok(Bytes([119, 120, 52, 103, 51, 120, 82, 101]))
2018/12/25 23:01:35.167 TRCE : <<<<<<<<<<<<<<<<<<<<<<<<<<<< tikv::coprocessor::codec::table::decode_col_value = Ok(Bytes([119, 120, 52, 103, 51, 120, 82, 101]))
2018/12/25 23:01:35.167 TRCE : <<<<<<<<<<<<<<<<<<<<<<<< tikv::coprocessor::dag::executor::OriginCols::inflate_cols_with_offsets = Ok([Null, Bytes([119, 120, 52, 103, 51, 120, 82, 101])])
2018/12/25 23:01:35.167 TRCE : >>>>>>>>>>>>>>>>>>>>>>>> tikv::coprocessor::dag::expr::Expression::eval(ctx: EvalContext { cfg: EvalConfig { tz: Tz::Name(Asia/Shanghai), ignore_truncate: false, truncate_as_warning: true, overflow_as_warning: true, in_insert_stmt: false, in_update_or_delete_stmt: false, in_select_stmt: true, pad_char_to_full_length: false, divided_by_zero_as_warning: false, max_warning_cnt: 64, sql_mode: 0, strict_sql_mode: false }, warnings: EvalWarnings { max_warning_cnt: 64, warning_cnt: 0, warnings: [] } }, row: [Null, Bytes([119, 120, 52, 103, 51, 120, 82, 101])])
2018/12/25 23:01:35.167 TRCE : >>>>>>>>>>>>>>>>>>>>>>>>>>>> tikv::coprocessor::dag::expr::scalar_function::ScalarFunc::eval(ctx: EvalContext { cfg: EvalConfig { tz: Tz::Name(Asia/Shanghai), ignore_truncate: false, truncate_as_warning: true, overflow_as_warning: true, in_insert_stmt: false, in_update_or_delete_stmt: false, in_select_stmt: true, pad_char_to_full_length: false, divided_by_zero_as_warning: false, max_warning_cnt: 64, sql_mode: 0, strict_sql_mode: false }, warnings: EvalWarnings { max_warning_cnt: 64, warning_cnt: 0, warnings: [] } }, row: [Null, Bytes([119, 120, 52, 103, 51, 120, 82, 101])])
2018/12/25 23:01:35.167 TRCE : >>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> tikv::coprocessor::dag::expr::builtin_compare::ScalarFunc::in_string(ctx: EvalContext { cfg: EvalConfig { tz: Tz::Name(Asia/Shanghai), ignore_truncate: false, truncate_as_warning: true, overflow_as_warning: true, in_insert_stmt: false, in_update_or_delete_stmt: false, in_select_stmt: true, pad_char_to_full_length: false, divided_by_zero_as_warning: false, max_warning_cnt: 64, sql_mode: 0, strict_sql_mode: false }, warnings: EvalWarnings { max_warning_cnt: 64, warning_cnt: 0, warnings: [] } }, row: [Null, Bytes([119, 120, 52, 103, 51, 120, 82, 101])])
2018/12/25 23:01:35.167 TRCE : >>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> tikv::coprocessor::dag::expr::Expression::eval_string(ctx: EvalContext { cfg: EvalConfig { tz: Tz::Name(Asia/Shanghai), ignore_truncate: false, truncate_as_warning: true, overflow_as_warning: true, in_insert_stmt: false, in_update_or_delete_stmt: false, in_select_stmt: true, pad_char_to_full_length: false, divided_by_zero_as_warning: false, max_warning_cnt: 64, sql_mode: 0, strict_sql_mode: false }, warnings: EvalWarnings { max_warning_cnt: 64, warning_cnt: 0, warnings: [] } }, row: [Null, Bytes([119, 120, 52, 103, 51, 120, 82, 101])])
2018/12/25 23:01:35.167 TRCE : >>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> tikv::coprocessor::dag::expr::column::Column::eval_string(ctx: EvalContext { cfg: EvalConfig { tz: Tz::Name(Asia/Shanghai), ignore_truncate: false, truncate_as_warning: true, overflow_as_warning: true, in_insert_stmt: false, in_update_or_delete_stmt: false, in_select_stmt: true, pad_char_to_full_length: false, divided_by_zero_as_warning: false, max_warning_cnt: 64, sql_mode: 0, strict_sql_mode: false }, warnings: EvalWarnings { max_warning_cnt: 64, warning_cnt: 0, warnings: [] } }, row: [Null, Bytes([119, 120, 52, 103, 51, 120, 82, 101])])
2018/12/25 23:01:35.167 TRCE : >>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> tikv::coprocessor::dag::expr::constant::Datum::as_string()
2018/12/25 23:01:35.168 TRCE : <<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<< tikv::coprocessor::dag::expr::constant::Datum::as_string = Ok(Some([119, 120, 52, 103, 51, 120, 82, 101]))
2018/12/25 23:01:35.168 TRCE : <<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<< tikv::coprocessor::dag::expr::column::Column::eval_string = Ok(Some([119, 120, 52, 103, 51, 120, 82, 101]))
2018/12/25 23:01:35.168 TRCE : <<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<< tikv::coprocessor::dag::expr::Expression::eval_string = Ok(Some([119, 120, 52, 103, 51, 120, 82, 101]))
2018/12/25 23:01:35.168 TRCE : >>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> tikv::coprocessor::dag::expr::Expression::eval_string(ctx: EvalContext { cfg: EvalConfig { tz: Tz::Name(Asia/Shanghai), ignore_truncate: false, truncate_as_warning: true, overflow_as_warning: true, in_insert_stmt: false, in_update_or_delete_stmt: false, in_select_stmt: true, pad_char_to_full_length: false, divided_by_zero_as_warning: false, max_warning_cnt: 64, sql_mode: 0, strict_sql_mode: false }, warnings: EvalWarnings { max_warning_cnt: 64, warning_cnt: 0, warnings: [] } }, row: [Null, Bytes([119, 120, 52, 103, 51, 120, 82, 101])])
2018/12/25 23:01:35.168 TRCE : >>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> tikv::coprocessor::dag::expr::constant::Constant::eval_string()
2018/12/25 23:01:35.168 TRCE : >>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> tikv::coprocessor::dag::expr::constant::Datum::as_string()
2018/12/25 23:01:35.168 TRCE : <<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<< tikv::coprocessor::dag::expr::constant::Datum::as_string = Ok(Some([119, 120, 52, 103, 51, 120, 103, 103]))
2018/12/25 23:01:35.169 TRCE : <<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<< tikv::coprocessor::dag::expr::constant::Constant::eval_string = Ok(Some([119, 120, 52, 103, 51, 120, 103, 103]))
2018/12/25 23:01:35.169 TRCE : <<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<< tikv::coprocessor::dag::expr::Expression::eval_string = Ok(Some([119, 120, 52, 103, 51, 120, 103, 103]))
2018/12/25 23:01:35.169 TRCE : >>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> tikv::coprocessor::dag::expr::Expression::eval_string(ctx: EvalContext { cfg: EvalConfig { tz: Tz::Name(Asia/Shanghai), ignore_truncate: false, truncate_as_warning: true, overflow_as_warning: true, in_insert_stmt: false, in_update_or_delete_stmt: false, in_select_stmt: true, pad_char_to_full_length: false, divided_by_zero_as_warning: false, max_warning_cnt: 64, sql_mode: 0, strict_sql_mode: false }, warnings: EvalWarnings { max_warning_cnt: 64, warning_cnt: 0, warnings: [] } }, row: [Null, Bytes([119, 120, 52, 103, 51, 120, 82, 101])])
2018/12/25 23:01:35.169 TRCE : >>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> tikv::coprocessor::dag::expr::constant::Constant::eval_string()
2018/12/25 23:01:35.169 TRCE : >>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> tikv::coprocessor::dag::expr::constant::Datum::as_string()
2018/12/25 23:01:35.169 TRCE : <<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<< tikv::coprocessor::dag::expr::constant::Datum::as_string = Ok(Some([119, 120, 52, 103, 51, 120, 103, 114]))
2018/12/25 23:01:35.169 TRCE : <<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<< tikv::coprocessor::dag::expr::constant::Constant::eval_string = Ok(Some([119, 120, 52, 103, 51, 120, 103, 114]))
2018/12/25 23:01:35.169 TRCE : <<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<< tikv::coprocessor::dag::expr::Expression::eval_string = Ok(Some([119, 120, 52, 103, 51, 120, 103, 114]))
2018/12/25 23:01:35.169 TRCE : >>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> tikv::coprocessor::dag::expr::Expression::eval_string(ctx: EvalContext { cfg: EvalConfig { tz: Tz::Name(Asia/Shanghai), ignore_truncate: false, truncate_as_warning: true, overflow_as_warning: true, in_insert_stmt: false, in_update_or_delete_stmt: false, in_select_stmt: true, pad_char_to_full_length: false, divided_by_zero_as_warning: false, max_warning_cnt: 64, sql_mode: 0, strict_sql_mode: false }, warnings: EvalWarnings { max_warning_cnt: 64, warning_cnt: 0, warnings: [] } }, row: [Null, Bytes([119, 120, 52, 103, 51, 120, 82, 101])])
2018/12/25 23:01:35.170 TRCE : >>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> tikv::coprocessor::dag::expr::constant::Constant::eval_string()
2018/12/25 23:01:35.170 TRCE : >>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> tikv::coprocessor::dag::expr::constant::Datum::as_string()
2018/12/25 23:01:35.170 TRCE : <<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<< tikv::coprocessor::dag::expr::constant::Datum::as_string = Ok(Some([119, 120, 52, 103, 51, 120, 103, 116]))
2018/12/25 23:01:35.170 TRCE : <<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<< tikv::coprocessor::dag::expr::constant::Constant::eval_string = Ok(Some([119, 120, 52, 103, 51, 120, 103, 116]))
2018/12/25 23:01:35.170 TRCE : <<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<< tikv::coprocessor::dag::expr::Expression::eval_string = Ok(Some([119, 120, 52, 103, 51, 120, 103, 116]))
2018/12/25 23:01:35.171 TRCE : >>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> tikv::coprocessor::dag::expr::Expression::eval_string(ctx: EvalContext { cfg: EvalConfig { tz: Tz::Name(Asia/Shanghai), ignore_truncate: false, truncate_as_warning: true, overflow_as_warning: true, in_insert_stmt: false, in_update_or_delete_stmt: false, in_select_stmt: true, pad_char_to_full_length: false, divided_by_zero_as_warning: false, max_warning_cnt: 64, sql_mode: 0, strict_sql_mode: false }, warnings: EvalWarnings { max_warning_cnt: 64, warning_cnt: 0, warnings: [] } }, row: [Null, Bytes([119, 120, 52, 103, 51, 120, 82, 101])])
2018/12/25 23:01:35.171 TRCE : >>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> tikv::coprocessor::dag::expr::constant::Constant::eval_string()
2018/12/25 23:01:35.171 TRCE : >>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> tikv::coprocessor::dag::expr::constant::Datum::as_string()
2018/12/25 23:01:35.171 TRCE : <<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<< tikv::coprocessor::dag::expr::constant::Datum::as_string = Ok(Some([119, 120, 52, 103, 51, 120, 103, 117]))
2018/12/25 23:01:35.172 TRCE : <<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<< tikv::coprocessor::dag::expr::constant::Constant::eval_string = Ok(Some([119, 120, 52, 103, 51, 120, 103, 117]))
2018/12/25 23:01:35.172 TRCE : <<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<< tikv::coprocessor::dag::expr::Expression::eval_string = Ok(Some([119, 120, 52, 103, 51, 120, 103, 117]))
2018/12/25 23:01:35.172 TRCE : >>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> tikv::coprocessor::dag::expr::Expression::eval_string(ctx: EvalContext { cfg: EvalConfig { tz: Tz::Name(Asia/Shanghai), ignore_truncate: false, truncate_as_warning: true, overflow_as_warning: true, in_insert_stmt: false, in_update_or_delete_stmt: false, in_select_stmt: true, pad_char_to_full_length: false, divided_by_zero_as_warning: false, max_warning_cnt: 64, sql_mode: 0, strict_sql_mode: false }, warnings: EvalWarnings { max_warning_cnt: 64, warning_cnt: 0, warnings: [] } }, row: [Null, Bytes([119, 120, 52, 103, 51, 120, 82, 101])])
2018/12/25 23:01:35.172 TRCE : >>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> tikv::coprocessor::dag::expr::constant::Constant::eval_string()
2018/12/25 23:01:35.173 TRCE : >>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> tikv::coprocessor::dag::expr::constant::Datum::as_string()
2018/12/25 23:01:35.173 TRCE : <<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<< tikv::coprocessor::dag::expr::constant::Datum::as_string = Ok(Some([119, 120, 52, 103, 51, 120, 103, 118]))
2018/12/25 23:01:35.173 TRCE : <<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<< tikv::coprocessor::dag::expr::constant::Constant::eval_string = Ok(Some([119, 120, 52, 103, 51, 120, 103, 118]))
2018/12/25 23:01:35.173 TRCE : <<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<< tikv::coprocessor::dag::expr::Expression::eval_string = Ok(Some([119, 120, 52, 103, 51, 120, 103, 118]))
2018/12/25 23:01:35.174 TRCE : >>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> tikv::coprocessor::dag::expr::Expression::eval_string(ctx: EvalContext { cfg: EvalConfig { tz: Tz::Name(Asia/Shanghai), ignore_truncate: false, truncate_as_warning: true, overflow_as_warning: true, in_insert_stmt: false, in_update_or_delete_stmt: false, in_select_stmt: true, pad_char_to_full_length: false, divided_by_zero_as_warning: false, max_warning_cnt: 64, sql_mode: 0, strict_sql_mode: false }, warnings: EvalWarnings { max_warning_cnt: 64, warning_cnt: 0, warnings: [] } }, row: [Null, Bytes([119, 120, 52, 103, 51, 120, 82, 101])])
2018/12/25 23:01:35.174 TRCE : >>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> tikv::coprocessor::dag::expr::constant::Constant::eval_string()
2018/12/25 23:01:35.174 TRCE : >>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> tikv::coprocessor::dag::expr::constant::Datum::as_string()
2018/12/25 23:01:35.174 TRCE : <<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<< tikv::coprocessor::dag::expr::constant::Datum::as_string = Ok(Some([119, 120, 52, 103, 51, 120, 103, 119]))
2018/12/25 23:01:35.174 TRCE : <<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<< tikv::coprocessor::dag::expr::constant::Constant::eval_string = Ok(Some([119, 120, 52, 103, 51, 120, 103, 119]))
2018/12/25 23:01:35.174 TRCE : <<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<< tikv::coprocessor::dag::expr::Expression::eval_string = Ok(Some([119, 120, 52, 103, 51, 120, 103, 119]))
2018/12/25 23:01:35.175 TRCE : >>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> tikv::coprocessor::dag::expr::Expression::eval_string(ctx: EvalContext { cfg: EvalConfig { tz: Tz::Name(Asia/Shanghai), ignore_truncate: false, truncate_as_warning: true, overflow_as_warning: true, in_insert_stmt: false, in_update_or_delete_stmt: false, in_select_stmt: true, pad_char_to_full_length: false, divided_by_zero_as_warning: false, max_warning_cnt: 64, sql_mode: 0, strict_sql_mode: false }, warnings: EvalWarnings { max_warning_cnt: 64, warning_cnt: 0, warnings: [] } }, row: [Null, Bytes([119, 120, 52, 103, 51, 120, 82, 101])])
2018/12/25 23:01:35.175 TRCE : >>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> tikv::coprocessor::dag::expr::constant::Constant::eval_string()
2018/12/25 23:01:35.175 TRCE : >>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> tikv::coprocessor::dag::expr::constant::Datum::as_string()
2018/12/25 23:01:35.175 TRCE : <<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<< tikv::coprocessor::dag::expr::constant::Datum::as_string = Ok(Some([119, 120, 52, 103, 51, 120, 103, 120]))
2018/12/25 23:01:35.175 TRCE : <<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<< tikv::coprocessor::dag::expr::constant::Constant::eval_string = Ok(Some([119, 120, 52, 103, 51, 120, 103, 120]))
2018/12/25 23:01:35.176 TRCE : <<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<< tikv::coprocessor::dag::expr::Expression::eval_string = Ok(Some([119, 120, 52, 103, 51, 120, 103, 120]))
2018/12/25 23:01:35.176 TRCE : >>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> tikv::coprocessor::dag::expr::Expression::eval_string(ctx: EvalContext { cfg: EvalConfig { tz: Tz::Name(Asia/Shanghai), ignore_truncate: false, truncate_as_warning: true, overflow_as_warning: true, in_insert_stmt: false, in_update_or_delete_stmt: false, in_select_stmt: true, pad_char_to_full_length: false, divided_by_zero_as_warning: false, max_warning_cnt: 64, sql_mode: 0, strict_sql_mode: false }, warnings: EvalWarnings { max_warning_cnt: 64, warning_cnt: 0, warnings: [] } }, row: [Null, Bytes([119, 120, 52, 103, 51, 120, 82, 101])])
2018/12/25 23:01:35.176 TRCE : >>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> tikv::coprocessor::dag::expr::constant::Constant::eval_string()
2018/12/25 23:01:35.176 TRCE : >>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> tikv::coprocessor::dag::expr::constant::Datum::as_string()
2018/12/25 23:01:35.176 TRCE : <<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<< tikv::coprocessor::dag::expr::constant::Datum::as_string = Ok(Some([119, 120, 52, 103, 51, 120, 103, 121]))
2018/12/25 23:01:35.176 TRCE : <<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<< tikv::coprocessor::dag::expr::constant::Constant::eval_string = Ok(Some([119, 120, 52, 103, 51, 120, 103, 121]))
2018/12/25 23:01:35.177 TRCE : <<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<< tikv::coprocessor::dag::expr::Expression::eval_string = Ok(Some([119, 120, 52, 103, 51, 120, 103, 121]))
2018/12/25 23:01:35.177 TRCE : >>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> tikv::coprocessor::dag::expr::Expression::eval_string(ctx: EvalContext { cfg: EvalConfig { tz: Tz::Name(Asia/Shanghai), ignore_truncate: false, truncate_as_warning: true, overflow_as_warning: true, in_insert_stmt: false, in_update_or_delete_stmt: false, in_select_stmt: true, pad_char_to_full_length: false, divided_by_zero_as_warning: false, max_warning_cnt: 64, sql_mode: 0, strict_sql_mode: false }, warnings: EvalWarnings { max_warning_cnt: 64, warning_cnt: 0, warnings: [] } }, row: [Null, Bytes([119, 120, 52, 103, 51, 120, 82, 101])])
2018/12/25 23:01:35.177 TRCE : >>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> tikv::coprocessor::dag::expr::constant::Constant::eval_string()
2018/12/25 23:01:35.177 TRCE : >>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> tikv::coprocessor::dag::expr::constant::Datum::as_string()
2018/12/25 23:01:35.177 TRCE : <<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<< tikv::coprocessor::dag::expr::constant::Datum::as_string = Ok(Some([119, 120, 52, 103, 51, 120, 103, 122]))
2018/12/25 23:01:35.177 TRCE : <<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<< tikv::coprocessor::dag::expr::constant::Constant::eval_string = Ok(Some([119, 120, 52, 103, 51, 120, 103, 122]))
2018/12/25 23:01:35.177 TRCE : <<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<< tikv::coprocessor::dag::expr::Expression::eval_string = Ok(Some([119, 120, 52, 103, 51, 120, 103, 122]))
2018/12/25 23:01:35.177 TRCE : >>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> tikv::coprocessor::dag::expr::Expression::eval_string(ctx: EvalContext { cfg: EvalConfig { tz: Tz::Name(Asia/Shanghai), ignore_truncate: false, truncate_as_warning: true, overflow_as_warning: true, in_insert_stmt: false, in_update_or_delete_stmt: false, in_select_stmt: true, pad_char_to_full_length: false, divided_by_zero_as_warning: false, max_warning_cnt: 64, sql_mode: 0, strict_sql_mode: false }, warnings: EvalWarnings { max_warning_cnt: 64, warning_cnt: 0, warnings: [] } }, row: [Null, Bytes([119, 120, 52, 103, 51, 120, 82, 101])])
2018/12/25 23:01:35.178 TRCE : >>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> tikv::coprocessor::dag::expr::constant::Constant::eval_string()
2018/12/25 23:01:35.178 TRCE : >>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> tikv::coprocessor::dag::expr::constant::Datum::as_string()
2018/12/25 23:01:35.178 TRCE : <<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<< tikv::coprocessor::dag::expr::constant::Datum::as_string = Ok(Some([119, 120, 52, 103, 51, 120, 109, 118]))
2018/12/25 23:01:35.178 TRCE : <<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<< tikv::coprocessor::dag::expr::constant::Constant::eval_string = Ok(Some([119, 120, 52, 103, 51, 120, 109, 118]))
2018/12/25 23:01:35.178 TRCE : <<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<< tikv::coprocessor::dag::expr::Expression::eval_string = Ok(Some([119, 120, 52, 103, 51, 120, 109, 118]))
2018/12/25 23:01:35.178 TRCE : >>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> tikv::coprocessor::dag::expr::Expression::eval_string(ctx: EvalContext { cfg: EvalConfig { tz: Tz::Name(Asia/Shanghai), ignore_truncate: false, truncate_as_warning: true, overflow_as_warning: true, in_insert_stmt: false, in_update_or_delete_stmt: false, in_select_stmt: true, pad_char_to_full_length: false, divided_by_zero_as_warning: false, max_warning_cnt: 64, sql_mode: 0, strict_sql_mode: false }, warnings: EvalWarnings { max_warning_cnt: 64, warning_cnt: 0, warnings: [] } }, row: [Null, Bytes([119, 120, 52, 103, 51, 120, 82, 101])])
2018/12/25 23:01:35.178 TRCE : >>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> tikv::coprocessor::dag::expr::constant::Constant::eval_string()
2018/12/25 23:01:35.178 TRCE : >>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> tikv::coprocessor::dag::expr::constant::Datum::as_string()
2018/12/25 23:01:35.179 TRCE : <<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<< tikv::coprocessor::dag::expr::constant::Datum::as_string = Ok(Some([119, 120, 52, 103, 51, 120, 109, 120]))
2018/12/25 23:01:35.179 TRCE : <<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<< tikv::coprocessor::dag::expr::constant::Constant::eval_string = Ok(Some([119, 120, 52, 103, 51, 120, 109, 120]))
2018/12/25 23:01:35.179 TRCE : <<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<< tikv::coprocessor::dag::expr::Expression::eval_string = Ok(Some([119, 120, 52, 103, 51, 120, 109, 120]))
2018/12/25 23:01:35.179 TRCE : >>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> tikv::coprocessor::dag::expr::Expression::eval_string(ctx: EvalContext { cfg: EvalConfig { tz: Tz::Name(Asia/Shanghai), ignore_truncate: false, truncate_as_warning: true, overflow_as_warning: true, in_insert_stmt: false, in_update_or_delete_stmt: false, in_select_stmt: true, pad_char_to_full_length: false, divided_by_zero_as_warning: false, max_warning_cnt: 64, sql_mode: 0, strict_sql_mode: false }, warnings: EvalWarnings { max_warning_cnt: 64, warning_cnt: 0, warnings: [] } }, row: [Null, Bytes([119, 120, 52, 103, 51, 120, 82, 101])])
2018/12/25 23:01:35.179 TRCE : >>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> tikv::coprocessor::dag::expr::constant::Constant::eval_string()
2018/12/25 23:01:35.179 TRCE : >>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> tikv::coprocessor::dag::expr::constant::Datum::as_string()
2018/12/25 23:01:35.179 TRCE : <<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<< tikv::coprocessor::dag::expr::constant::Datum::as_string = Ok(Some([119, 120, 52, 103, 51, 120, 109, 121]))
2018/12/25 23:01:35.179 TRCE : <<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<< tikv::coprocessor::dag::expr::constant::Constant::eval_string = Ok(Some([119, 120, 52, 103, 51, 120, 109, 121]))
2018/12/25 23:01:35.179 TRCE : <<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<< tikv::coprocessor::dag::expr::Expression::eval_string = Ok(Some([119, 120, 52, 103, 51, 120, 109, 121]))
2018/12/25 23:01:35.180 TRCE : >>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> tikv::coprocessor::dag::expr::Expression::eval_string(ctx: EvalContext { cfg: EvalConfig { tz: Tz::Name(Asia/Shanghai), ignore_truncate: false, truncate_as_warning: true, overflow_as_warning: true, in_insert_stmt: false, in_update_or_delete_stmt: false, in_select_stmt: true, pad_char_to_full_length: false, divided_by_zero_as_warning: false, max_warning_cnt: 64, sql_mode: 0, strict_sql_mode: false }, warnings: EvalWarnings { max_warning_cnt: 64, warning_cnt: 0, warnings: [] } }, row: [Null, Bytes([119, 120, 52, 103, 51, 120, 82, 101])])
2018/12/25 23:01:35.180 TRCE : >>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> tikv::coprocessor::dag::expr::constant::Constant::eval_string()
2018/12/25 23:01:35.180 TRCE : >>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> tikv::coprocessor::dag::expr::constant::Datum::as_string()
2018/12/25 23:01:35.180 TRCE : <<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<< tikv::coprocessor::dag::expr::constant::Datum::as_string = Ok(Some([119, 120, 52, 103, 51, 120, 109, 122]))
2018/12/25 23:01:35.180 TRCE : <<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<< tikv::coprocessor::dag::expr::constant::Constant::eval_string = Ok(Some([119, 120, 52, 103, 51, 120, 109, 122]))
2018/12/25 23:01:35.180 TRCE : <<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<< tikv::coprocessor::dag::expr::Expression::eval_string = Ok(Some([119, 120, 52, 103, 51, 120, 109, 122]))
2018/12/25 23:01:35.180 TRCE : >>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> tikv::coprocessor::dag::expr::Expression::eval_string(ctx: EvalContext { cfg: EvalConfig { tz: Tz::Name(Asia/Shanghai), ignore_truncate: false, truncate_as_warning: true, overflow_as_warning: true, in_insert_stmt: false, in_update_or_delete_stmt: false, in_select_stmt: true, pad_char_to_full_length: false, divided_by_zero_as_warning: false, max_warning_cnt: 64, sql_mode: 0, strict_sql_mode: false }, warnings: EvalWarnings { max_warning_cnt: 64, warning_cnt: 0, warnings: [] } }, row: [Null, Bytes([119, 120, 52, 103, 51, 120, 82, 101])])
2018/12/25 23:01:35.180 TRCE : >>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> tikv::coprocessor::dag::expr::constant::Constant::eval_string()
2018/12/25 23:01:35.180 TRCE : >>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> tikv::coprocessor::dag::expr::constant::Datum::as_string()
2018/12/25 23:01:35.181 TRCE : <<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<< tikv::coprocessor::dag::expr::constant::Datum::as_string = Ok(Some([119, 120, 52, 103, 51, 120, 110, 122]))
2018/12/25 23:01:35.181 TRCE : <<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<< tikv::coprocessor::dag::expr::constant::Constant::eval_string = Ok(Some([119, 120, 52, 103, 51, 120, 110, 122]))
2018/12/25 23:01:35.181 TRCE : <<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<< tikv::coprocessor::dag::expr::Expression::eval_string = Ok(Some([119, 120, 52, 103, 51, 120, 110, 122]))
2018/12/25 23:01:35.181 TRCE : >>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> tikv::coprocessor::dag::expr::Expression::eval_string(ctx: EvalContext { cfg: EvalConfig { tz: Tz::Name(Asia/Shanghai), ignore_truncate: false, truncate_as_warning: true, overflow_as_warning: true, in_insert_stmt: false, in_update_or_delete_stmt: false, in_select_stmt: true, pad_char_to_full_length: false, divided_by_zero_as_warning: false, max_warning_cnt: 64, sql_mode: 0, strict_sql_mode: false }, warnings: EvalWarnings { max_warning_cnt: 64, warning_cnt: 0, warnings: [] } }, row: [Null, Bytes([119, 120, 52, 103, 51, 120, 82, 101])])
2018/12/25 23:01:35.181 TRCE : >>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> tikv::coprocessor::dag::expr::constant::Constant::eval_string()
2018/12/25 23:01:35.181 TRCE : >>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> tikv::coprocessor::dag::expr::constant::Datum::as_string()
2018/12/25 23:01:35.181 TRCE : <<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<< tikv::coprocessor::dag::expr::constant::Datum::as_string = Ok(Some([119, 120, 52, 103, 51, 120, 112, 99]))
2018/12/25 23:01:35.181 TRCE : <<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<< tikv::coprocessor::dag::expr::constant::Constant::eval_string = Ok(Some([119, 120, 52, 103, 51, 120, 112, 99]))
2018/12/25 23:01:35.182 TRCE : <<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<< tikv::coprocessor::dag::expr::Expression::eval_string = Ok(Some([119, 120, 52, 103, 51, 120, 112, 99]))
2018/12/25 23:01:35.182 TRCE : >>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> tikv::coprocessor::dag::expr::Expression::eval_string(ctx: EvalContext { cfg: EvalConfig { tz: Tz::Name(Asia/Shanghai), ignore_truncate: false, truncate_as_warning: true, overflow_as_warning: true, in_insert_stmt: false, in_update_or_delete_stmt: false, in_select_stmt: true, pad_char_to_full_length: false, divided_by_zero_as_warning: false, max_warning_cnt: 64, sql_mode: 0, strict_sql_mode: false }, warnings: EvalWarnings { max_warning_cnt: 64, warning_cnt: 0, warnings: [] } }, row: [Null, Bytes([119, 120, 52, 103, 51, 120, 82, 101])])
2018/12/25 23:01:35.182 TRCE : >>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> tikv::coprocessor::dag::expr::constant::Constant::eval_string()
2018/12/25 23:01:35.182 TRCE : >>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> tikv::coprocessor::dag::expr::constant::Datum::as_string()
2018/12/25 23:01:35.182 TRCE : <<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<< tikv::coprocessor::dag::expr::constant::Datum::as_string = Ok(Some([119, 120, 52, 103, 51, 120, 112, 101]))
2018/12/25 23:01:35.182 TRCE : <<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<< tikv::coprocessor::dag::expr::constant::Constant::eval_string = Ok(Some([119, 120, 52, 103, 51, 120, 112, 101]))
2018/12/25 23:01:35.182 TRCE : <<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<< tikv::coprocessor::dag::expr::Expression::eval_string = Ok(Some([119, 120, 52, 103, 51, 120, 112, 101]))
2018/12/25 23:01:35.182 TRCE : >>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> tikv::coprocessor::dag::expr::Expression::eval_string(ctx: EvalContext { cfg: EvalConfig { tz: Tz::Name(Asia/Shanghai), ignore_truncate: false, truncate_as_warning: true, overflow_as_warning: true, in_insert_stmt: false, in_update_or_delete_stmt: false, in_select_stmt: true, pad_char_to_full_length: false, divided_by_zero_as_warning: false, max_warning_cnt: 64, sql_mode: 0, strict_sql_mode: false }, warnings: EvalWarnings { max_warning_cnt: 64, warning_cnt: 0, warnings: [] } }, row: [Null, Bytes([119, 120, 52, 103, 51, 120, 82, 101])])
2018/12/25 23:01:35.182 TRCE : >>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> tikv::coprocessor::dag::expr::constant::Constant::eval_string()
2018/12/25 23:01:35.183 TRCE : >>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> tikv::coprocessor::dag::expr::constant::Datum::as_string()
2018/12/25 23:01:35.183 TRCE : <<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<< tikv::coprocessor::dag::expr::constant::Datum::as_string = Ok(Some([119, 120, 52, 103, 51, 120, 112, 102]))
2018/12/25 23:01:35.183 TRCE : <<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<< tikv::coprocessor::dag::expr::constant::Constant::eval_string = Ok(Some([119, 120, 52, 103, 51, 120, 112, 102]))
2018/12/25 23:01:35.183 TRCE : <<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<< tikv::coprocessor::dag::expr::Expression::eval_string = Ok(Some([119, 120, 52, 103, 51, 120, 112, 102]))
2018/12/25 23:01:35.183 TRCE : >>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> tikv::coprocessor::dag::expr::Expression::eval_string(ctx: EvalContext { cfg: EvalConfig { tz: Tz::Name(Asia/Shanghai), ignore_truncate: false, truncate_as_warning: true, overflow_as_warning: true, in_insert_stmt: false, in_update_or_delete_stmt: false, in_select_stmt: true, pad_char_to_full_length: false, divided_by_zero_as_warning: false, max_warning_cnt: 64, sql_mode: 0, strict_sql_mode: false }, warnings: EvalWarnings { max_warning_cnt: 64, warning_cnt: 0, warnings: [] } }, row: [Null, Bytes([119, 120, 52, 103, 51, 120, 82, 101])])
2018/12/25 23:01:35.183 TRCE : >>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> tikv::coprocessor::dag::expr::constant::Constant::eval_string()
2018/12/25 23:01:35.183 TRCE : >>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> tikv::coprocessor::dag::expr::constant::Datum::as_string()
2018/12/25 23:01:35.183 TRCE : <<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<< tikv::coprocessor::dag::expr::constant::Datum::as_string = Ok(Some([119, 120, 52, 103, 51, 120, 112, 103]))
2018/12/25 23:01:35.184 TRCE : <<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<< tikv::coprocessor::dag::expr::constant::Constant::eval_string = Ok(Some([119, 120, 52, 103, 51, 120, 112, 103]))
2018/12/25 23:01:35.184 TRCE : <<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<< tikv::coprocessor::dag::expr::Expression::eval_string = Ok(Some([119, 120, 52, 103, 51, 120, 112, 103]))
2018/12/25 23:01:35.184 TRCE : >>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> tikv::coprocessor::dag::expr::Expression::eval_string(ctx: EvalContext { cfg: EvalConfig { tz: Tz::Name(Asia/Shanghai), ignore_truncate: false, truncate_as_warning: true, overflow_as_warning: true, in_insert_stmt: false, in_update_or_delete_stmt: false, in_select_stmt: true, pad_char_to_full_length: false, divided_by_zero_as_warning: false, max_warning_cnt: 64, sql_mode: 0, strict_sql_mode: false }, warnings: EvalWarnings { max_warning_cnt: 64, warning_cnt: 0, warnings: [] } }, row: [Null, Bytes([119, 120, 52, 103, 51, 120, 82, 101])])
2018/12/25 23:01:35.184 TRCE : >>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> tikv::coprocessor::dag::expr::constant::Constant::eval_string()
2018/12/25 23:01:35.184 TRCE : >>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> tikv::coprocessor::dag::expr::constant::Datum::as_string()
2018/12/25 23:01:35.184 TRCE : <<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<< tikv::coprocessor::dag::expr::constant::Datum::as_string = Ok(Some([119, 120, 52, 103, 51, 120, 112, 107]))
2018/12/25 23:01:35.184 TRCE : <<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<< tikv::coprocessor::dag::expr::constant::Constant::eval_string = Ok(Some([119, 120, 52, 103, 51, 120, 112, 107]))
2018/12/25 23:01:35.184 TRCE : <<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<< tikv::coprocessor::dag::expr::Expression::eval_string = Ok(Some([119, 120, 52, 103, 51, 120, 112, 107]))
2018/12/25 23:01:35.184 TRCE : >>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> tikv::coprocessor::dag::expr::Expression::eval_string(ctx: EvalContext { cfg: EvalConfig { tz: Tz::Name(Asia/Shanghai), ignore_truncate: false, truncate_as_warning: true, overflow_as_warning: true, in_insert_stmt: false, in_update_or_delete_stmt: false, in_select_stmt: true, pad_char_to_full_length: false, divided_by_zero_as_warning: false, max_warning_cnt: 64, sql_mode: 0, strict_sql_mode: false }, warnings: EvalWarnings { max_warning_cnt: 64, warning_cnt: 0, warnings: [] } }, row: [Null, Bytes([119, 120, 52, 103, 51, 120, 82, 101])])
2018/12/25 23:01:35.185 TRCE : >>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> tikv::coprocessor::dag::expr::constant::Constant::eval_string()
2018/12/25 23:01:35.185 TRCE : >>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> tikv::coprocessor::dag::expr::constant::Datum::as_string()
2018/12/25 23:01:35.185 TRCE : <<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<< tikv::coprocessor::dag::expr::constant::Datum::as_string = Ok(Some([119, 120, 52, 103, 51, 120, 112, 109]))
2018/12/25 23:01:35.185 TRCE : <<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<< tikv::coprocessor::dag::expr::constant::Constant::eval_string = Ok(Some([119, 120, 52, 103, 51, 120, 112, 109]))
2018/12/25 23:01:35.185 TRCE : <<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<< tikv::coprocessor::dag::expr::Expression::eval_string = Ok(Some([119, 120, 52, 103, 51, 120, 112, 109]))
…………
…………
…………
```
|
process
|
in string can be very slow bug report in my pc perform in strings for records takes without index tracing trce tikv coprocessor dag dag dagcontext handle request trce tikv coprocessor dag executor aggregation hashaggexecutor next trce tikv coprocessor dag executor aggregation hashaggexecutor aggregate trce tikv coprocessor dag executor aggregation aggexecutor next trce tikv coprocessor dag executor selection selectionexecutor next trce tikv coprocessor dag executor table scan tablescanexecutor next trce tikv coprocessor dag executor table scan tablescanexecutor get row from range scanner trce get row from range scanner ok none trce tikv coprocessor dag executor table scan tablescanexecutor new scanner range start t r end t r trce tikv coprocessor dag executor scanner scanner new store snapshotstore snapshot regionsnapshot start ts isolation level si fill cache true scan on table desc false key only false range start t r end t r trce new ok scanner desc false scan on table key only false range start t r end t r trce new scanner ok scanner desc false scan on table key only false range start t r end t r trce tikv coprocessor dag executor table scan tablescanexecutor get row from range scanner trce tikv coprocessor dag executor scanner scanner next row trce next row ok some trce tikv coprocessor codec table cut row data cols trce tikv coprocessor codec table rowcolsdict new cols rowcolmeta offset length rowcolmeta offset length value trce tikv coprocessor codec table rowcolsdict new rowcolsdict value cols rowcolmeta offset length rowcolmeta offset length trce tikv coprocessor codec table cut row ok rowcolsdict value cols rowcolmeta offset length rowcolmeta offset length trce tikv coprocessor codec table decode handle encoded trce tikv coprocessor codec table decode handle ok trce tikv coprocessor dag executor row origin handle data rowcolsdict value cols rowcolmeta offset length rowcolmeta offset length cols trce tikv coprocessor dag executor origincols new handle data rowcolsdict value cols rowcolmeta offset length rowcolmeta offset length cols trce tikv coprocessor dag executor origincols new origincols handle data rowcolsdict value cols rowcolmeta offset length rowcolmeta offset length cols trce tikv coprocessor dag executor row origin origin origincols handle data rowcolsdict value cols rowcolmeta offset length rowcolmeta offset length cols trce get row from range scanner ok some origin origincols handle data rowcolsdict value cols rowcolmeta offset length rowcolmeta offset length cols trce next ok some origin origincols handle data rowcolsdict value cols rowcolmeta offset length rowcolmeta offset length cols trce tikv coprocessor dag executor row take origin trce tikv coprocessor dag executor row take origin origincols handle data rowcolsdict value cols rowcolmeta offset length rowcolmeta offset length cols trce tikv coprocessor dag executor origincols inflate cols with offsets ctx evalcontext cfg evalconfig tz tz name asia shanghai ignore truncate false truncate as warning true overflow as warning true in insert stmt false in update or delete stmt false in select stmt true pad char to full length false divided by zero as warning false max warning cnt sql mode strict sql mode false warnings evalwarnings max warning cnt warning cnt warnings offsets trce tikv coprocessor codec table rowcolsdict get key trce tikv coprocessor codec table rowcolsdict get some trce tikv coprocessor codec table decode col value data ctx evalcontext cfg evalconfig tz tz name asia shanghai ignore truncate false truncate as warning true overflow as warning true in insert stmt false in update or delete stmt false in select stmt true pad char to full length false divided by zero as warning false max warning cnt sql mode strict sql mode false warnings evalwarnings max warning cnt warning cnt warnings col column id tp collation columnlen decimal flag pk handle false trce tikv coprocessor codec table unflatten ctx evalcontext cfg evalconfig tz tz name asia shanghai ignore truncate false truncate as warning true overflow as warning true in insert stmt false in update or delete stmt false in select stmt true pad char to full length false divided by zero as warning false max warning cnt sql mode strict sql mode false warnings evalwarnings max warning cnt warning cnt warnings datum bytes field type column id tp collation columnlen decimal flag pk handle false trce tikv coprocessor codec table unflatten ok bytes trce tikv coprocessor codec table decode col value ok bytes trce tikv coprocessor dag executor origincols inflate cols with offsets ok trce tikv coprocessor dag expr expression eval ctx evalcontext cfg evalconfig tz tz name asia shanghai ignore truncate false truncate as warning true overflow as warning true in insert stmt false in update or delete stmt false in select stmt true pad char to full length false divided by zero as warning false max warning cnt sql mode strict sql mode false warnings evalwarnings max warning cnt warning cnt warnings row trce tikv coprocessor dag expr scalar function scalarfunc eval ctx evalcontext cfg evalconfig tz tz name asia shanghai ignore truncate false truncate as warning true overflow as warning true in insert stmt false in update or delete stmt false in select stmt true pad char to full length false divided by zero as warning false max warning cnt sql mode strict sql mode false warnings evalwarnings max warning cnt warning cnt warnings row trce tikv coprocessor dag expr builtin compare scalarfunc in string ctx evalcontext cfg evalconfig tz tz name asia shanghai ignore truncate false truncate as warning true overflow as warning true in insert stmt false in update or delete stmt false in select stmt true pad char to full length false divided by zero as warning false max warning cnt sql mode strict sql mode false warnings evalwarnings max warning cnt warning cnt warnings row trce tikv coprocessor dag expr expression eval string ctx evalcontext cfg evalconfig tz tz name asia shanghai ignore truncate false truncate as warning true overflow as warning true in insert stmt false in update or delete stmt false in select stmt true pad char to full length false divided by zero as warning false max warning cnt sql mode strict sql mode false warnings evalwarnings max warning cnt warning cnt warnings row trce tikv coprocessor dag expr column column eval string ctx evalcontext cfg evalconfig tz tz name asia shanghai ignore truncate false truncate as warning true overflow as warning true in insert stmt false in update or delete stmt false in select stmt true pad char to full length false divided by zero as warning false max warning cnt sql mode strict sql mode false warnings evalwarnings max warning cnt warning cnt warnings row trce tikv coprocessor dag expr constant datum as string trce tikv coprocessor dag expr constant datum as string ok some trce tikv coprocessor dag expr column column eval string ok some trce tikv coprocessor dag expr expression eval string ok some trce tikv coprocessor dag expr expression eval string ctx evalcontext cfg evalconfig tz tz name asia shanghai ignore truncate false truncate as warning true overflow as warning true in insert stmt false in update or delete stmt false in select stmt true pad char to full length false divided by zero as warning false max warning cnt sql mode strict sql mode false warnings evalwarnings max warning cnt warning cnt warnings row trce tikv coprocessor dag expr constant constant eval string trce tikv coprocessor dag expr constant datum as string trce tikv coprocessor dag expr constant datum as string ok some trce tikv coprocessor dag expr constant constant eval string ok some trce tikv coprocessor dag expr expression eval string ok some trce tikv coprocessor dag expr expression eval string ctx evalcontext cfg evalconfig tz tz name asia shanghai ignore truncate false truncate as warning true overflow as warning true in insert stmt false in update or delete stmt false in select stmt true pad char to full length false divided by zero as warning false max warning cnt sql mode strict sql mode false warnings evalwarnings max warning cnt warning cnt warnings row trce tikv coprocessor dag expr constant constant eval string trce tikv coprocessor dag expr constant datum as string trce tikv coprocessor dag expr constant datum as string ok some trce tikv coprocessor dag expr constant constant eval string ok some trce tikv coprocessor dag expr expression eval string ok some trce tikv coprocessor dag expr expression eval string ctx evalcontext cfg evalconfig tz tz name asia shanghai ignore truncate false truncate as warning true overflow as warning true in insert stmt false in update or delete stmt false in select stmt true pad char to full length false divided by zero as warning false max warning cnt sql mode strict sql mode false warnings evalwarnings max warning cnt warning cnt warnings row trce tikv coprocessor dag expr constant constant eval string trce tikv coprocessor dag expr constant datum as string trce tikv coprocessor dag expr constant datum as string ok some trce tikv coprocessor dag expr constant constant eval string ok some trce tikv coprocessor dag expr expression eval string ok some trce tikv coprocessor dag expr expression eval string ctx evalcontext cfg evalconfig tz tz name asia shanghai ignore truncate false truncate as warning true overflow as warning true in insert stmt false in update or delete stmt false in select stmt true pad char to full length false divided by zero as warning false max warning cnt sql mode strict sql mode false warnings evalwarnings max warning cnt warning cnt warnings row trce tikv coprocessor dag expr constant constant eval string trce tikv coprocessor dag expr constant datum as string trce tikv coprocessor dag expr constant datum as string ok some trce tikv coprocessor dag expr constant constant eval string ok some trce tikv coprocessor dag expr expression eval string ok some trce tikv coprocessor dag expr expression eval string ctx evalcontext cfg evalconfig tz tz name asia shanghai ignore truncate false truncate as warning true overflow as warning true in insert stmt false in update or delete stmt false in select stmt true pad char to full length false divided by zero as warning false max warning cnt sql mode strict sql mode false warnings evalwarnings max warning cnt warning cnt warnings row trce tikv coprocessor dag expr constant constant eval string trce tikv coprocessor dag expr constant datum as string trce tikv coprocessor dag expr constant datum as string ok some trce tikv coprocessor dag expr constant constant eval string ok some trce tikv coprocessor dag expr expression eval string ok some trce tikv coprocessor dag expr expression eval string ctx evalcontext cfg evalconfig tz tz name asia shanghai ignore truncate false truncate as warning true overflow as warning true in insert stmt false in update or delete stmt false in select stmt true pad char to full length false divided by zero as warning false max warning cnt sql mode strict sql mode false warnings evalwarnings max warning cnt warning cnt warnings row trce tikv coprocessor dag expr constant constant eval string trce tikv coprocessor dag expr constant datum as string trce tikv coprocessor dag expr constant datum as string ok some trce tikv coprocessor dag expr constant constant eval string ok some trce tikv coprocessor dag expr expression eval string ok some trce tikv coprocessor dag expr expression eval string ctx evalcontext cfg evalconfig tz tz name asia shanghai ignore truncate false truncate as warning true overflow as warning true in insert stmt false in update or delete stmt false in select stmt true pad char to full length false divided by zero as warning false max warning cnt sql mode strict sql mode false warnings evalwarnings max warning cnt warning cnt warnings row trce tikv coprocessor dag expr constant constant eval string trce tikv coprocessor dag expr constant datum as string trce tikv coprocessor dag expr constant datum as string ok some trce tikv coprocessor dag expr constant constant eval string ok some trce tikv coprocessor dag expr expression eval string ok some trce tikv coprocessor dag expr expression eval string ctx evalcontext cfg evalconfig tz tz name asia shanghai ignore truncate false truncate as warning true overflow as warning true in insert stmt false in update or delete stmt false in select stmt true pad char to full length false divided by zero as warning false max warning cnt sql mode strict sql mode false warnings evalwarnings max warning cnt warning cnt warnings row trce tikv coprocessor dag expr constant constant eval string trce tikv coprocessor dag expr constant datum as string trce tikv coprocessor dag expr constant datum as string ok some trce tikv coprocessor dag expr constant constant eval string ok some trce tikv coprocessor dag expr expression eval string ok some trce tikv coprocessor dag expr expression eval string ctx evalcontext cfg evalconfig tz tz name asia shanghai ignore truncate false truncate as warning true overflow as warning true in insert stmt false in update or delete stmt false in select stmt true pad char to full length false divided by zero as warning false max warning cnt sql mode strict sql mode false warnings evalwarnings max warning cnt warning cnt warnings row trce tikv coprocessor dag expr constant constant eval string trce tikv coprocessor dag expr constant datum as string trce tikv coprocessor dag expr constant datum as string ok some trce tikv coprocessor dag expr constant constant eval string ok some trce tikv coprocessor dag expr expression eval string ok some trce tikv coprocessor dag expr expression eval string ctx evalcontext cfg evalconfig tz tz name asia shanghai ignore truncate false truncate as warning true overflow as warning true in insert stmt false in update or delete stmt false in select stmt true pad char to full length false divided by zero as warning false max warning cnt sql mode strict sql mode false warnings evalwarnings max warning cnt warning cnt warnings row trce tikv coprocessor dag expr constant constant eval string trce tikv coprocessor dag expr constant datum as string trce tikv coprocessor dag expr constant datum as string ok some trce tikv coprocessor dag expr constant constant eval string ok some trce tikv coprocessor dag expr expression eval string ok some trce tikv coprocessor dag expr expression eval string ctx evalcontext cfg evalconfig tz tz name asia shanghai ignore truncate false truncate as warning true overflow as warning true in insert stmt false in update or delete stmt false in select stmt true pad char to full length false divided by zero as warning false max warning cnt sql mode strict sql mode false warnings evalwarnings max warning cnt warning cnt warnings row trce tikv coprocessor dag expr constant constant eval string trce tikv coprocessor dag expr constant datum as string trce tikv coprocessor dag expr constant datum as string ok some trce tikv coprocessor dag expr constant constant eval string ok some trce tikv coprocessor dag expr expression eval string ok some trce tikv coprocessor dag expr expression eval string ctx evalcontext cfg evalconfig tz tz name asia shanghai ignore truncate false truncate as warning true overflow as warning true in insert stmt false in update or delete stmt false in select stmt true pad char to full length false divided by zero as warning false max warning cnt sql mode strict sql mode false warnings evalwarnings max warning cnt warning cnt warnings row trce tikv coprocessor dag expr constant constant eval string trce tikv coprocessor dag expr constant datum as string trce tikv coprocessor dag expr constant datum as string ok some trce tikv coprocessor dag expr constant constant eval string ok some trce tikv coprocessor dag expr expression eval string ok some trce tikv coprocessor dag expr expression eval string ctx evalcontext cfg evalconfig tz tz name asia shanghai ignore truncate false truncate as warning true overflow as warning true in insert stmt false in update or delete stmt false in select stmt true pad char to full length false divided by zero as warning false max warning cnt sql mode strict sql mode false warnings evalwarnings max warning cnt warning cnt warnings row trce tikv coprocessor dag expr constant constant eval string trce tikv coprocessor dag expr constant datum as string trce tikv coprocessor dag expr constant datum as string ok some trce tikv coprocessor dag expr constant constant eval string ok some trce tikv coprocessor dag expr expression eval string ok some trce tikv coprocessor dag expr expression eval string ctx evalcontext cfg evalconfig tz tz name asia shanghai ignore truncate false truncate as warning true overflow as warning true in insert stmt false in update or delete stmt false in select stmt true pad char to full length false divided by zero as warning false max warning cnt sql mode strict sql mode false warnings evalwarnings max warning cnt warning cnt warnings row trce tikv coprocessor dag expr constant constant eval string trce tikv coprocessor dag expr constant datum as string trce tikv coprocessor dag expr constant datum as string ok some trce tikv coprocessor dag expr constant constant eval string ok some trce tikv coprocessor dag expr expression eval string ok some trce tikv coprocessor dag expr expression eval string ctx evalcontext cfg evalconfig tz tz name asia shanghai ignore truncate false truncate as warning true overflow as warning true in insert stmt false in update or delete stmt false in select stmt true pad char to full length false divided by zero as warning false max warning cnt sql mode strict sql mode false warnings evalwarnings max warning cnt warning cnt warnings row trce tikv coprocessor dag expr constant constant eval string trce tikv coprocessor dag expr constant datum as string trce tikv coprocessor dag expr constant datum as string ok some trce tikv coprocessor dag expr constant constant eval string ok some trce tikv coprocessor dag expr expression eval string ok some trce tikv coprocessor dag expr expression eval string ctx evalcontext cfg evalconfig tz tz name asia shanghai ignore truncate false truncate as warning true overflow as warning true in insert stmt false in update or delete stmt false in select stmt true pad char to full length false divided by zero as warning false max warning cnt sql mode strict sql mode false warnings evalwarnings max warning cnt warning cnt warnings row trce tikv coprocessor dag expr constant constant eval string trce tikv coprocessor dag expr constant datum as string trce tikv coprocessor dag expr constant datum as string ok some trce tikv coprocessor dag expr constant constant eval string ok some trce tikv coprocessor dag expr expression eval string ok some trce tikv coprocessor dag expr expression eval string ctx evalcontext cfg evalconfig tz tz name asia shanghai ignore truncate false truncate as warning true overflow as warning true in insert stmt false in update or delete stmt false in select stmt true pad char to full length false divided by zero as warning false max warning cnt sql mode strict sql mode false warnings evalwarnings max warning cnt warning cnt warnings row trce tikv coprocessor dag expr constant constant eval string trce tikv coprocessor dag expr constant datum as string trce tikv coprocessor dag expr constant datum as string ok some trce tikv coprocessor dag expr constant constant eval string ok some trce tikv coprocessor dag expr expression eval string ok some trce tikv coprocessor dag expr expression eval string ctx evalcontext cfg evalconfig tz tz name asia shanghai ignore truncate false truncate as warning true overflow as warning true in insert stmt false in update or delete stmt false in select stmt true pad char to full length false divided by zero as warning false max warning cnt sql mode strict sql mode false warnings evalwarnings max warning cnt warning cnt warnings row trce tikv coprocessor dag expr constant constant eval string trce tikv coprocessor dag expr constant datum as string trce tikv coprocessor dag expr constant datum as string ok some trce tikv coprocessor dag expr constant constant eval string ok some trce tikv coprocessor dag expr expression eval string ok some trce tikv coprocessor dag expr expression eval string ctx evalcontext cfg evalconfig tz tz name asia shanghai ignore truncate false truncate as warning true overflow as warning true in insert stmt false in update or delete stmt false in select stmt true pad char to full length false divided by zero as warning false max warning cnt sql mode strict sql mode false warnings evalwarnings max warning cnt warning cnt warnings row trce tikv coprocessor dag expr constant constant eval string trce tikv coprocessor dag expr constant datum as string trce tikv coprocessor dag expr constant datum as string ok some trce tikv coprocessor dag expr constant constant eval string ok some trce tikv coprocessor dag expr expression eval string ok some trce tikv coprocessor dag expr expression eval string ctx evalcontext cfg evalconfig tz tz name asia shanghai ignore truncate false truncate as warning true overflow as warning true in insert stmt false in update or delete stmt false in select stmt true pad char to full length false divided by zero as warning false max warning cnt sql mode strict sql mode false warnings evalwarnings max warning cnt warning cnt warnings row trce tikv coprocessor dag expr constant constant eval string trce tikv coprocessor dag expr constant datum as string trce tikv coprocessor dag expr constant datum as string ok some trce tikv coprocessor dag expr constant constant eval string ok some trce tikv coprocessor dag expr expression eval string ok some ………… ………… …………
| 1
|
144,505
| 13,108,686,447
|
IssuesEvent
|
2020-08-04 17:17:06
|
ctm/mb2-doc
|
https://api.github.com/repos/ctm/mb2-doc
|
closed
|
rounding difference between mb2 and Rich's numbers
|
chore documentation easy high priority
|
After cleaning up the errors that caused the prize pool sizes to be incorrect in three cases (#345), I see that there are still differences in amounts awarded, probably due to rounding (or lack thereof).
Here's what Rich sent me as the official values:
> 1 Russell "ABVidale" Fox T$4,060
> 2 John "muscatel" Grout T$3,086
> 3 John "da pickle" Pickels T$2,274
> 4 Gillian "Tegwin" Groves T$1,786
> 5 Christopher "tombayz" Mecklin T$1,462
> 6 Kenny "HoserSimpson" Shei T$1,137
> 7 Bryan "bjuliano" Juliano T$812
> 8 Tanya "MissT74" Peck-Devenport T$487
> 9 Shari "pokerchimp" Silk T$325
> 10 Michael "mjoseph" Brennan T$325
> 11 Barry "MrRaise" Kornspan T$243
> 12 John "JRX" Reed T$243
Here are the new adjusted values:
> ABVidale | 1 | 4060 | 2020-08-01 16:59:18
> muscatel | 2 | 3085 | 2020-08-01 16:59:18
> jusfoolin | 3 | 2274 | 2020-08-01 16:52:58
> ⛵tegwin | 4 | 1786 | 2020-08-01 16:45:58
> 🐢tombayz | 5 | 1461 | 2020-08-01 16:44:38
> HoserSimpson | 6 | 1137 | 2020-08-01 16:31:21
> bjuliano | 7 | 812 | 2020-08-01 16:30:06
> MissT74 | 8 | 487 | 2020-08-01 16:25:06
> 🐵 pokerchimp | 9 | 325 | 2020-08-01 16:11:55
> mjoseph | 10 | 325 | 2020-08-01 16:08:52
> MrRaise | 11 | 244 | 2020-08-01 16:03:18
> JRX | 12 | 244 | 2020-08-01 16:03:18
`Muscatel` and `tombayz` each get FM1 less and `MrRaise` and `JRX` each get FM1 more. I can dump the floating values that mb2 uses as it creates the integer values. Most likely I'll then have a hypothesis as to what is happening. I'll then ask Rich.
|
1.0
|
rounding difference between mb2 and Rich's numbers - After cleaning up the errors that caused the prize pool sizes to be incorrect in three cases (#345), I see that there are still differences in amounts awarded, probably due to rounding (or lack thereof).
Here's what Rich sent me as the official values:
> 1 Russell "ABVidale" Fox T$4,060
> 2 John "muscatel" Grout T$3,086
> 3 John "da pickle" Pickels T$2,274
> 4 Gillian "Tegwin" Groves T$1,786
> 5 Christopher "tombayz" Mecklin T$1,462
> 6 Kenny "HoserSimpson" Shei T$1,137
> 7 Bryan "bjuliano" Juliano T$812
> 8 Tanya "MissT74" Peck-Devenport T$487
> 9 Shari "pokerchimp" Silk T$325
> 10 Michael "mjoseph" Brennan T$325
> 11 Barry "MrRaise" Kornspan T$243
> 12 John "JRX" Reed T$243
Here are the new adjusted values:
> ABVidale | 1 | 4060 | 2020-08-01 16:59:18
> muscatel | 2 | 3085 | 2020-08-01 16:59:18
> jusfoolin | 3 | 2274 | 2020-08-01 16:52:58
> ⛵tegwin | 4 | 1786 | 2020-08-01 16:45:58
> 🐢tombayz | 5 | 1461 | 2020-08-01 16:44:38
> HoserSimpson | 6 | 1137 | 2020-08-01 16:31:21
> bjuliano | 7 | 812 | 2020-08-01 16:30:06
> MissT74 | 8 | 487 | 2020-08-01 16:25:06
> 🐵 pokerchimp | 9 | 325 | 2020-08-01 16:11:55
> mjoseph | 10 | 325 | 2020-08-01 16:08:52
> MrRaise | 11 | 244 | 2020-08-01 16:03:18
> JRX | 12 | 244 | 2020-08-01 16:03:18
`Muscatel` and `tombayz` each get FM1 less and `MrRaise` and `JRX` each get FM1 more. I can dump the floating values that mb2 uses as it creates the integer values. Most likely I'll then have a hypothesis as to what is happening. I'll then ask Rich.
|
non_process
|
rounding difference between and rich s numbers after cleaning up the errors that caused the prize pool sizes to be incorrect in three cases i see that there are still differences in amounts awarded probably due to rounding or lack thereof here s what rich sent me as the official values russell abvidale fox t john muscatel grout t john da pickle pickels t gillian tegwin groves t christopher tombayz mecklin t kenny hosersimpson shei t bryan bjuliano juliano t tanya peck devenport t shari pokerchimp silk t michael mjoseph brennan t barry mrraise kornspan t john jrx reed t here are the new adjusted values abvidale muscatel jusfoolin ⛵tegwin 🐢tombayz hosersimpson bjuliano 🐵 pokerchimp mjoseph mrraise jrx muscatel and tombayz each get less and mrraise and jrx each get more i can dump the floating values that uses as it creates the integer values most likely i ll then have a hypothesis as to what is happening i ll then ask rich
| 0
|
13,668
| 16,388,862,163
|
IssuesEvent
|
2021-05-17 13:53:51
|
Bedrohung-der-Bienen/Transformationsfelder-Digitalisierung
|
https://api.github.com/repos/Bedrohung-der-Bienen/Transformationsfelder-Digitalisierung
|
closed
|
Auf der Anmeldeseite sollte sich ein Login Button befinden.
|
backend bootstrap frontend javascript login process
|
# Szenario: Benutzer soll sich durch ein Login Button anmelden können
- **Gegeben** Der Benutzer ist auf der Startseite angelangt
- **Wenn** sich der Benutzer anmelden möchte
- **Dann** klickt er auf den Login Tab in der Navigation
- **Und** es öffnet sich eine Anmeldeseite
- **Und** der Benutzer meldet sich mit seinem E-Mail und Passwort an
- **Und** klickt auf dem Login Button, um sich anzumelden
Der Benutzer muss die Möglichkeit haben sich anmelden zu können, um Kommentare hinzuzufügen, eine Pflanze bewerten zu können und seine Favoritenliste einsehen zu können mit dem Login Button werden die Eingabedaten überprüft und wenn die Eingabedaten mit den Daten in der Datenbank übereinstimmen wird der Benutzer weitergeleitet.
-----
**Als** Benutzer,
**möchte ich** einen Login Button klicken können
**damit** ich mich anmelden kann.
**Szenario 1:** Der Benutzer klickt auf Login, um sich anzumelden, dabei gibt er sein Benutzername und das Passwort ein und meldet sich durch den Button an.
|
1.0
|
Auf der Anmeldeseite sollte sich ein Login Button befinden. - # Szenario: Benutzer soll sich durch ein Login Button anmelden können
- **Gegeben** Der Benutzer ist auf der Startseite angelangt
- **Wenn** sich der Benutzer anmelden möchte
- **Dann** klickt er auf den Login Tab in der Navigation
- **Und** es öffnet sich eine Anmeldeseite
- **Und** der Benutzer meldet sich mit seinem E-Mail und Passwort an
- **Und** klickt auf dem Login Button, um sich anzumelden
Der Benutzer muss die Möglichkeit haben sich anmelden zu können, um Kommentare hinzuzufügen, eine Pflanze bewerten zu können und seine Favoritenliste einsehen zu können mit dem Login Button werden die Eingabedaten überprüft und wenn die Eingabedaten mit den Daten in der Datenbank übereinstimmen wird der Benutzer weitergeleitet.
-----
**Als** Benutzer,
**möchte ich** einen Login Button klicken können
**damit** ich mich anmelden kann.
**Szenario 1:** Der Benutzer klickt auf Login, um sich anzumelden, dabei gibt er sein Benutzername und das Passwort ein und meldet sich durch den Button an.
|
process
|
auf der anmeldeseite sollte sich ein login button befinden szenario benutzer soll sich durch ein login button anmelden können gegeben der benutzer ist auf der startseite angelangt wenn sich der benutzer anmelden möchte dann klickt er auf den login tab in der navigation und es öffnet sich eine anmeldeseite und der benutzer meldet sich mit seinem e mail und passwort an und klickt auf dem login button um sich anzumelden der benutzer muss die möglichkeit haben sich anmelden zu können um kommentare hinzuzufügen eine pflanze bewerten zu können und seine favoritenliste einsehen zu können mit dem login button werden die eingabedaten überprüft und wenn die eingabedaten mit den daten in der datenbank übereinstimmen wird der benutzer weitergeleitet als benutzer möchte ich einen login button klicken können damit ich mich anmelden kann szenario der benutzer klickt auf login um sich anzumelden dabei gibt er sein benutzername und das passwort ein und meldet sich durch den button an
| 1
|
299,096
| 22,589,041,703
|
IssuesEvent
|
2022-06-28 17:54:47
|
criblio/appscope
|
https://api.github.com/repos/criblio/appscope
|
closed
|
Tweak docs per PM final 1.1.0 questions
|
documentation
|
Per @nicktank :
- Former references to "LogStream" in the config file should read "Cribl Stream or Cribl Edge."
- Tweak the Nginx example in the config file.
- Save the muting bit for later.
|
1.0
|
Tweak docs per PM final 1.1.0 questions - Per @nicktank :
- Former references to "LogStream" in the config file should read "Cribl Stream or Cribl Edge."
- Tweak the Nginx example in the config file.
- Save the muting bit for later.
|
non_process
|
tweak docs per pm final questions per nicktank former references to logstream in the config file should read cribl stream or cribl edge tweak the nginx example in the config file save the muting bit for later
| 0
|
243,263
| 26,277,993,997
|
IssuesEvent
|
2023-01-07 01:41:19
|
bharathirajatut/vulnerable-node
|
https://api.github.com/repos/bharathirajatut/vulnerable-node
|
opened
|
CVE-2017-20162 (Medium) detected in ms-0.7.2.tgz, ms-0.7.1.tgz
|
security vulnerability
|
## CVE-2017-20162 - Medium Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Libraries - <b>ms-0.7.2.tgz</b>, <b>ms-0.7.1.tgz</b></p></summary>
<p>
<details><summary><b>ms-0.7.2.tgz</b></p></summary>
<p>Tiny milisecond conversion utility</p>
<p>Library home page: <a href="https://registry.npmjs.org/ms/-/ms-0.7.2.tgz">https://registry.npmjs.org/ms/-/ms-0.7.2.tgz</a></p>
<p>Path to dependency file: /package.json</p>
<p>Path to vulnerable library: /node_modules/serve-favicon/node_modules/ms/package.json</p>
<p>
Dependency Hierarchy:
- serve-favicon-2.3.2.tgz (Root Library)
- :x: **ms-0.7.2.tgz** (Vulnerable Library)
</details>
<details><summary><b>ms-0.7.1.tgz</b></p></summary>
<p>Tiny ms conversion utility</p>
<p>Library home page: <a href="https://registry.npmjs.org/ms/-/ms-0.7.1.tgz">https://registry.npmjs.org/ms/-/ms-0.7.1.tgz</a></p>
<p>Path to dependency file: /package.json</p>
<p>Path to vulnerable library: /node_modules/ms/package.json</p>
<p>
Dependency Hierarchy:
- debug-2.2.0.tgz (Root Library)
- :x: **ms-0.7.1.tgz** (Vulnerable Library)
</details>
<p>Found in base branch: <b>master</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
A vulnerability, which was classified as problematic, has been found in vercel ms up to 1.x. This issue affects the function parse of the file index.js. The manipulation of the argument str leads to inefficient regular expression complexity. The attack may be initiated remotely. The exploit has been disclosed to the public and may be used. Upgrading to version 2.0.0 is able to address this issue. The name of the patch is caae2988ba2a37765d055c4eee63d383320ee662. It is recommended to upgrade the affected component. The associated identifier of this vulnerability is VDB-217451.
<p>Publish Date: 2023-01-05
<p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2017-20162>CVE-2017-20162</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>4.3</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: Low
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: None
- Availability Impact: Low
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Release Date: 2023-01-05</p>
<p>Fix Resolution (ms): 2.0.0</p>
<p>Direct dependency fix Resolution (serve-favicon): 2.4.3</p><p>Fix Resolution (ms): 2.0.0</p>
<p>Direct dependency fix Resolution (debug): 2.6.7</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with Mend [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
|
True
|
CVE-2017-20162 (Medium) detected in ms-0.7.2.tgz, ms-0.7.1.tgz - ## CVE-2017-20162 - Medium Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Libraries - <b>ms-0.7.2.tgz</b>, <b>ms-0.7.1.tgz</b></p></summary>
<p>
<details><summary><b>ms-0.7.2.tgz</b></p></summary>
<p>Tiny milisecond conversion utility</p>
<p>Library home page: <a href="https://registry.npmjs.org/ms/-/ms-0.7.2.tgz">https://registry.npmjs.org/ms/-/ms-0.7.2.tgz</a></p>
<p>Path to dependency file: /package.json</p>
<p>Path to vulnerable library: /node_modules/serve-favicon/node_modules/ms/package.json</p>
<p>
Dependency Hierarchy:
- serve-favicon-2.3.2.tgz (Root Library)
- :x: **ms-0.7.2.tgz** (Vulnerable Library)
</details>
<details><summary><b>ms-0.7.1.tgz</b></p></summary>
<p>Tiny ms conversion utility</p>
<p>Library home page: <a href="https://registry.npmjs.org/ms/-/ms-0.7.1.tgz">https://registry.npmjs.org/ms/-/ms-0.7.1.tgz</a></p>
<p>Path to dependency file: /package.json</p>
<p>Path to vulnerable library: /node_modules/ms/package.json</p>
<p>
Dependency Hierarchy:
- debug-2.2.0.tgz (Root Library)
- :x: **ms-0.7.1.tgz** (Vulnerable Library)
</details>
<p>Found in base branch: <b>master</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
A vulnerability, which was classified as problematic, has been found in vercel ms up to 1.x. This issue affects the function parse of the file index.js. The manipulation of the argument str leads to inefficient regular expression complexity. The attack may be initiated remotely. The exploit has been disclosed to the public and may be used. Upgrading to version 2.0.0 is able to address this issue. The name of the patch is caae2988ba2a37765d055c4eee63d383320ee662. It is recommended to upgrade the affected component. The associated identifier of this vulnerability is VDB-217451.
<p>Publish Date: 2023-01-05
<p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2017-20162>CVE-2017-20162</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>4.3</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: Low
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: None
- Availability Impact: Low
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Release Date: 2023-01-05</p>
<p>Fix Resolution (ms): 2.0.0</p>
<p>Direct dependency fix Resolution (serve-favicon): 2.4.3</p><p>Fix Resolution (ms): 2.0.0</p>
<p>Direct dependency fix Resolution (debug): 2.6.7</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with Mend [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
|
non_process
|
cve medium detected in ms tgz ms tgz cve medium severity vulnerability vulnerable libraries ms tgz ms tgz ms tgz tiny milisecond conversion utility library home page a href path to dependency file package json path to vulnerable library node modules serve favicon node modules ms package json dependency hierarchy serve favicon tgz root library x ms tgz vulnerable library ms tgz tiny ms conversion utility library home page a href path to dependency file package json path to vulnerable library node modules ms package json dependency hierarchy debug tgz root library x ms tgz vulnerable library found in base branch master vulnerability details a vulnerability which was classified as problematic has been found in vercel ms up to x this issue affects the function parse of the file index js the manipulation of the argument str leads to inefficient regular expression complexity the attack may be initiated remotely the exploit has been disclosed to the public and may be used upgrading to version is able to address this issue the name of the patch is it is recommended to upgrade the affected component the associated identifier of this vulnerability is vdb publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required low user interaction none scope unchanged impact metrics confidentiality impact none integrity impact none availability impact low for more information on scores click a href suggested fix type upgrade version release date fix resolution ms direct dependency fix resolution serve favicon fix resolution ms direct dependency fix resolution debug step up your open source security game with mend
| 0
|
279,611
| 24,238,808,477
|
IssuesEvent
|
2022-09-27 03:43:13
|
CeresDB/ceresdb
|
https://api.github.com/repos/CeresDB/ceresdb
|
closed
|
Implement `ignore` interceptor in integration test framework
|
enhancement A-test
|
**Description**
<!---
A clear and concise description of what the problem is. Ex. I'm always frustrated when [...]
(This section helps Arrow developers understand the context and *why* for this feature, in addition to the *what*)
-->
Some cases are not expected to run due to unstable features, temporary changes or anything else. It should be supported to just ignore them.
**Proposal**
Implement the `ignore` interceptor. Proposed syntax:
```
-- CERESDB ignore: REASON
```
Follows the K-V format. Value for this interceptor is the reason why this case got ignored. It won't take effort and is a comment in fact.
<!---
Maybe you have considered some ideas or solutions about this feature.
-->
**Additional context**
<!---
Add any other context or screenshots about the feature request here.
-->
TBD: should we count/record those ignored cases and report it in the end?
ref #154
|
1.0
|
Implement `ignore` interceptor in integration test framework - **Description**
<!---
A clear and concise description of what the problem is. Ex. I'm always frustrated when [...]
(This section helps Arrow developers understand the context and *why* for this feature, in addition to the *what*)
-->
Some cases are not expected to run due to unstable features, temporary changes or anything else. It should be supported to just ignore them.
**Proposal**
Implement the `ignore` interceptor. Proposed syntax:
```
-- CERESDB ignore: REASON
```
Follows the K-V format. Value for this interceptor is the reason why this case got ignored. It won't take effort and is a comment in fact.
<!---
Maybe you have considered some ideas or solutions about this feature.
-->
**Additional context**
<!---
Add any other context or screenshots about the feature request here.
-->
TBD: should we count/record those ignored cases and report it in the end?
ref #154
|
non_process
|
implement ignore interceptor in integration test framework description a clear and concise description of what the problem is ex i m always frustrated when this section helps arrow developers understand the context and why for this feature in addition to the what some cases are not expected to run due to unstable features temporary changes or anything else it should be supported to just ignore them proposal implement the ignore interceptor proposed syntax ceresdb ignore reason follows the k v format value for this interceptor is the reason why this case got ignored it won t take effort and is a comment in fact maybe you have considered some ideas or solutions about this feature additional context add any other context or screenshots about the feature request here tbd should we count record those ignored cases and report it in the end ref
| 0
|
15,803
| 19,989,397,575
|
IssuesEvent
|
2022-01-31 03:13:20
|
chellimiller/sassy-design-tokens
|
https://api.github.com/repos/chellimiller/sassy-design-tokens
|
opened
|
Set up GitHub packages
|
process
|
Set up repository to publish packages. Currently it's done via the command line which is less than ideal.
|
1.0
|
Set up GitHub packages - Set up repository to publish packages. Currently it's done via the command line which is less than ideal.
|
process
|
set up github packages set up repository to publish packages currently it s done via the command line which is less than ideal
| 1
|
12,684
| 3,285,248,061
|
IssuesEvent
|
2015-10-28 19:46:27
|
owncloud/client
|
https://api.github.com/repos/owncloud/client
|
reopened
|
Upgrade notification does not mention ownCloud
|
bug gold-ticket ReadyToTest
|
When I start the sync-client I receive a notification message:

could we add that the app who has the update is ownCloud client (or the branded name)
Other apps like Thunderbird or Firefox mention that they need attention.
@MorrisJobke @michaelstingl
|
1.0
|
Upgrade notification does not mention ownCloud - When I start the sync-client I receive a notification message:

could we add that the app who has the update is ownCloud client (or the branded name)
Other apps like Thunderbird or Firefox mention that they need attention.
@MorrisJobke @michaelstingl
|
non_process
|
upgrade notification does not mention owncloud when i start the sync client i receive a notification message could we add that the app who has the update is owncloud client or the branded name other apps like thunderbird or firefox mention that they need attention morrisjobke michaelstingl
| 0
|
69,927
| 3,316,311,295
|
IssuesEvent
|
2015-11-06 16:22:55
|
TeselaGen/ve
|
https://api.github.com/repos/TeselaGen/ve
|
closed
|
Sequence Provenance Tracking – keeping track of where DNA came from
|
Customer: DAS Phase I Priority: High Status: Active
|
This is a big one, and will require the inclusion of a system of provenance tracking so that any DNA sequence record created in the system will have parental identifiers that link it to its ancestors. Some characteristics of the provenance tracking system are:
* A record of the node (the sequence id) and the edge (parent-child link) is kept.
* Sequence records do not need to represent actual DNA stored in the lab (differentiation flag between real sequences and those that are only *in silico*)
* Must track all the parents from which the sequence was constructed
* A sequence can have no parents. In that case it becomes a top level node.
* The provenance graph must be an directed acyclic graph (DAG – no loops so your child can not be your parent)
See the [Requirements Document](https://docs.google.com/document/d/13ndQ5fuTFbORcBOmzbIjI_sKQiExsqmHvtOZWe2Bwgo/edit#) for more detail!
|
1.0
|
Sequence Provenance Tracking – keeping track of where DNA came from - This is a big one, and will require the inclusion of a system of provenance tracking so that any DNA sequence record created in the system will have parental identifiers that link it to its ancestors. Some characteristics of the provenance tracking system are:
* A record of the node (the sequence id) and the edge (parent-child link) is kept.
* Sequence records do not need to represent actual DNA stored in the lab (differentiation flag between real sequences and those that are only *in silico*)
* Must track all the parents from which the sequence was constructed
* A sequence can have no parents. In that case it becomes a top level node.
* The provenance graph must be an directed acyclic graph (DAG – no loops so your child can not be your parent)
See the [Requirements Document](https://docs.google.com/document/d/13ndQ5fuTFbORcBOmzbIjI_sKQiExsqmHvtOZWe2Bwgo/edit#) for more detail!
|
non_process
|
sequence provenance tracking – keeping track of where dna came from this is a big one and will require the inclusion of a system of provenance tracking so that any dna sequence record created in the system will have parental identifiers that link it to its ancestors some characteristics of the provenance tracking system are a record of the node the sequence id and the edge parent child link is kept sequence records do not need to represent actual dna stored in the lab differentiation flag between real sequences and those that are only in silico must track all the parents from which the sequence was constructed a sequence can have no parents in that case it becomes a top level node the provenance graph must be an directed acyclic graph dag – no loops so your child can not be your parent see the for more detail
| 0
|
19,050
| 13,187,242,995
|
IssuesEvent
|
2020-08-13 02:48:00
|
icecube-trac/tix3
|
https://api.github.com/repos/icecube-trac/tix3
|
opened
|
stable combo automatic advance (Trac #1781)
|
Incomplete Migration Migrated from Trac infrastructure task
|
<details>
<summary><em>Migrated from <a href="https://code.icecube.wisc.edu/ticket/1781">https://code.icecube.wisc.edu/ticket/1781</a>, reported by david.schultz and owned by nega</em></summary>
<p>
```json
{
"status": "closed",
"changetime": "2019-02-13T14:12:38",
"description": "If we want to promote stable combo as a thing, it should automatically update whenever the buildbots are green. This seems like something a script could do fairly easily.",
"reporter": "david.schultz",
"cc": "olivas",
"resolution": "wontfix",
"_ts": "1550067158057333",
"component": "infrastructure",
"summary": "stable combo automatic advance",
"priority": "normal",
"keywords": "",
"time": "2016-07-18T20:56:44",
"milestone": "",
"owner": "nega",
"type": "task"
}
```
</p>
</details>
|
1.0
|
stable combo automatic advance (Trac #1781) - <details>
<summary><em>Migrated from <a href="https://code.icecube.wisc.edu/ticket/1781">https://code.icecube.wisc.edu/ticket/1781</a>, reported by david.schultz and owned by nega</em></summary>
<p>
```json
{
"status": "closed",
"changetime": "2019-02-13T14:12:38",
"description": "If we want to promote stable combo as a thing, it should automatically update whenever the buildbots are green. This seems like something a script could do fairly easily.",
"reporter": "david.schultz",
"cc": "olivas",
"resolution": "wontfix",
"_ts": "1550067158057333",
"component": "infrastructure",
"summary": "stable combo automatic advance",
"priority": "normal",
"keywords": "",
"time": "2016-07-18T20:56:44",
"milestone": "",
"owner": "nega",
"type": "task"
}
```
</p>
</details>
|
non_process
|
stable combo automatic advance trac migrated from json status closed changetime description if we want to promote stable combo as a thing it should automatically update whenever the buildbots are green this seems like something a script could do fairly easily reporter david schultz cc olivas resolution wontfix ts component infrastructure summary stable combo automatic advance priority normal keywords time milestone owner nega type task
| 0
|
149,211
| 19,566,844,593
|
IssuesEvent
|
2022-01-04 02:31:46
|
fasttrack-solutions/jQuery-QueryBuilder
|
https://api.github.com/repos/fasttrack-solutions/jQuery-QueryBuilder
|
opened
|
WS-2019-0605 (Medium) detected in CSS::Sassv3.6.0
|
security vulnerability
|
## WS-2019-0605 - Medium Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>CSS::Sassv3.6.0</b></p></summary>
<p>
<p>Library home page: <a href=https://metacpan.org/pod/CSS::Sass>https://metacpan.org/pod/CSS::Sass</a></p>
<p>Found in base branch: <b>dev</b></p></p>
</details>
</p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Source Files (1)</summary>
<p></p>
<p>
<img src='https://s3.amazonaws.com/wss-public/bitbucketImages/xRedImage.png' width=19 height=20> <b>/node_modules/node-sass/src/libsass/src/lexer.cpp</b>
</p>
</details>
<p></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
In sass versions between 3.2.0 to 3.6.3 may read 1 byte outside an allocated buffer while parsing a specially crafted css rule.
<p>Publish Date: 2019-07-16
<p>URL: <a href=https://github.com/sass/libsass/commit/7a21c79e321927363a153dc5d7e9c492365faf9b>WS-2019-0605</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>6.2</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Local
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: None
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://osv.dev/vulnerability/OSV-2020-734">https://osv.dev/vulnerability/OSV-2020-734</a></p>
<p>Release Date: 2019-07-16</p>
<p>Fix Resolution: 3.6.4</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
|
True
|
WS-2019-0605 (Medium) detected in CSS::Sassv3.6.0 - ## WS-2019-0605 - Medium Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>CSS::Sassv3.6.0</b></p></summary>
<p>
<p>Library home page: <a href=https://metacpan.org/pod/CSS::Sass>https://metacpan.org/pod/CSS::Sass</a></p>
<p>Found in base branch: <b>dev</b></p></p>
</details>
</p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Source Files (1)</summary>
<p></p>
<p>
<img src='https://s3.amazonaws.com/wss-public/bitbucketImages/xRedImage.png' width=19 height=20> <b>/node_modules/node-sass/src/libsass/src/lexer.cpp</b>
</p>
</details>
<p></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
In sass versions between 3.2.0 to 3.6.3 may read 1 byte outside an allocated buffer while parsing a specially crafted css rule.
<p>Publish Date: 2019-07-16
<p>URL: <a href=https://github.com/sass/libsass/commit/7a21c79e321927363a153dc5d7e9c492365faf9b>WS-2019-0605</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>6.2</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Local
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: None
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://osv.dev/vulnerability/OSV-2020-734">https://osv.dev/vulnerability/OSV-2020-734</a></p>
<p>Release Date: 2019-07-16</p>
<p>Fix Resolution: 3.6.4</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
|
non_process
|
ws medium detected in css ws medium severity vulnerability vulnerable library css library home page a href found in base branch dev vulnerable source files node modules node sass src libsass src lexer cpp vulnerability details in sass versions between to may read byte outside an allocated buffer while parsing a specially crafted css rule publish date url a href cvss score details base score metrics exploitability metrics attack vector local attack complexity low privileges required none user interaction none scope unchanged impact metrics confidentiality impact none integrity impact none availability impact high for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution step up your open source security game with whitesource
| 0
|
8,722
| 11,859,818,908
|
IssuesEvent
|
2020-03-25 13:59:10
|
fraction/oasis
|
https://api.github.com/repos/fraction/oasis
|
closed
|
Version/commit question in issue template
|
good first issue process
|
## What's the problem you want solved?
Knowing the version/commit is useful for some debugging, but the majority of our issues target master.
## Is there a solution you'd like to recommend?
Remove it or make it optional, maybe? My intuition is that the bigger the template is, the fewer people will follow it, so I'd like to make sure that each question we ask has the same amount of importance.
## What version or commit of Oasis are you using?
latest :grimacing:
|
1.0
|
Version/commit question in issue template - ## What's the problem you want solved?
Knowing the version/commit is useful for some debugging, but the majority of our issues target master.
## Is there a solution you'd like to recommend?
Remove it or make it optional, maybe? My intuition is that the bigger the template is, the fewer people will follow it, so I'd like to make sure that each question we ask has the same amount of importance.
## What version or commit of Oasis are you using?
latest :grimacing:
|
process
|
version commit question in issue template what s the problem you want solved knowing the version commit is useful for some debugging but the majority of our issues target master is there a solution you d like to recommend remove it or make it optional maybe my intuition is that the bigger the template is the fewer people will follow it so i d like to make sure that each question we ask has the same amount of importance what version or commit of oasis are you using latest grimacing
| 1
|
9,554
| 12,515,794,718
|
IssuesEvent
|
2020-06-03 08:19:04
|
darktable-org/darktable
|
https://api.github.com/repos/darktable-org/darktable
|
closed
|
Severe display issue on some portrait images (generated cache issue from darktable)
|
bug: pending difficulty: hard priority: high scope: image processing
|
**Describe the bug**
Portrait images could sometimes be shown with pixel offset display. This result in strange image. See 2 examples below:
**To Reproduce**
A way to reproduce (but not reproducible all times) is to display images in culling mode (I use always 2 images at the same time) and after displaying landscapes images, showing new portrait images could result on this artefact.
But, I can reproduce near all the times this issue by rotate to the left a portrait image in lighttable (so with left rotate button in selected images module on the right panel.
This issue has been discussed with @AlicVB and @aurelienpierre yesterday and could be related to a Gdk issue when rotating the image. What's strange is I never reproduce the issue if rotating on the right.
By discussing with @AlicVB, we find that's probably be a cache generating issue has all portrait images displayed this way have cache images with the issue.
**Expected behavior**
Always have images displayed correctly on portrait ratio.
**Screenshots**


And a generated jpg picked up on cache (mipmap 3 here but could be picked up same way on all generated mipmaps cache):

**Platform (please complete the following information):**
- Darktable Version: one of the last master
- No OpenCL compatible card so CPU work only here.
|
1.0
|
Severe display issue on some portrait images (generated cache issue from darktable) - **Describe the bug**
Portrait images could sometimes be shown with pixel offset display. This result in strange image. See 2 examples below:
**To Reproduce**
A way to reproduce (but not reproducible all times) is to display images in culling mode (I use always 2 images at the same time) and after displaying landscapes images, showing new portrait images could result on this artefact.
But, I can reproduce near all the times this issue by rotate to the left a portrait image in lighttable (so with left rotate button in selected images module on the right panel.
This issue has been discussed with @AlicVB and @aurelienpierre yesterday and could be related to a Gdk issue when rotating the image. What's strange is I never reproduce the issue if rotating on the right.
By discussing with @AlicVB, we find that's probably be a cache generating issue has all portrait images displayed this way have cache images with the issue.
**Expected behavior**
Always have images displayed correctly on portrait ratio.
**Screenshots**


And a generated jpg picked up on cache (mipmap 3 here but could be picked up same way on all generated mipmaps cache):

**Platform (please complete the following information):**
- Darktable Version: one of the last master
- No OpenCL compatible card so CPU work only here.
|
process
|
severe display issue on some portrait images generated cache issue from darktable describe the bug portrait images could sometimes be shown with pixel offset display this result in strange image see examples below to reproduce a way to reproduce but not reproducible all times is to display images in culling mode i use always images at the same time and after displaying landscapes images showing new portrait images could result on this artefact but i can reproduce near all the times this issue by rotate to the left a portrait image in lighttable so with left rotate button in selected images module on the right panel this issue has been discussed with alicvb and aurelienpierre yesterday and could be related to a gdk issue when rotating the image what s strange is i never reproduce the issue if rotating on the right by discussing with alicvb we find that s probably be a cache generating issue has all portrait images displayed this way have cache images with the issue expected behavior always have images displayed correctly on portrait ratio screenshots and a generated jpg picked up on cache mipmap here but could be picked up same way on all generated mipmaps cache platform please complete the following information darktable version one of the last master no opencl compatible card so cpu work only here
| 1
|
6,326
| 9,359,588,424
|
IssuesEvent
|
2019-04-02 07:19:33
|
aiidateam/aiida_core
|
https://api.github.com/repos/aiidateam/aiida_core
|
opened
|
Formalize the mapping between nested process port namespaces to flat link labels
|
priority/critical-blocking requires discussion topic/naming-issues topic/processes topic/provenance type/bug
|
The `Process` and `ProcessNode` duality forces us to define a mapping between the nested port namespaces of the process and the flat link label hierarchy of the corresponding node in the provenance graph. A process can define a nested port namespace through its spec:
```
class NestedProcess(Process):
@classmethod
def define(cls, spec):
super(NestedProcess, cls).define(spec)
spec.input('name.spaced.input')
spec.output('name.spaced.output')
```
If we print the inputs port namespace, we get the following:
```
In [6]: print(NestedProcess.spec().inputs)
{
"_attrs": {
"default": [],
"dynamic": false,
"help": null,
"required": "True",
"valid_type": "None"
},
"name": {
"_attrs": {
"default": [],
"dynamic": false,
"help": null,
"required": "True",
"valid_type": "None"
},
"spaced": {
"_attrs": {
"default": [],
"dynamic": false,
"help": null,
"required": "True",
"valid_type": "None"
},
"input": {
"name": "input",
"non_db": "False",
"required": "True"
}
}
}
}
```
When constructing this process, or any process, the inputs are expected to map exactly onto this namespace.
So for example, `inputs = {'name': {'spaced': {'input': 1}}}` would be valid.
The same principles go for outputs and they should be registered in such a way that it maps one-to-one on the port namespace.
For the duration of the process, the inputs and outputs stay organized in this nested hierarchy.
However, finally the process needs to be represented in the provenance graph by a node, and the inputs and outputs should be linked up.
The link label, which is the only thing that can be used to represent the namespacing, is a string and therefore necessarily one-dimensional.
The nested namespace therefore needs to be mapped onto this flat space.
Currently, the namespaces are just concatenated with underscores.
So the input node in the example would get the link label `name_spaced_input`.
However, now it becomes indistinguishable from an input that was passed to a process with the name `name_spaced_input` in the top level namespace.
We should clearly define the rules for the mapping between nested namespaces and flat link labels.
This should also keep in mind that if there is any risk of overlapping labels when expanding namespaces.
For example, in the current situation, what would happen for the following input port namespace:
```
class NestedProcess(Process):
@classmethod
def define(cls, spec):
super(NestedProcess, cls).define(spec)
spec.input('name.spaced.input')
spec.input('name_spaced_output')
```
This is currently legal as far as the process spec is concerned.
The first is a doubly nested namespace and the latter is a single port in the top level namespace.
However, the link labels that will be generated will overlap, making it impossible to link both inputs to the node.
|
1.0
|
Formalize the mapping between nested process port namespaces to flat link labels - The `Process` and `ProcessNode` duality forces us to define a mapping between the nested port namespaces of the process and the flat link label hierarchy of the corresponding node in the provenance graph. A process can define a nested port namespace through its spec:
```
class NestedProcess(Process):
@classmethod
def define(cls, spec):
super(NestedProcess, cls).define(spec)
spec.input('name.spaced.input')
spec.output('name.spaced.output')
```
If we print the inputs port namespace, we get the following:
```
In [6]: print(NestedProcess.spec().inputs)
{
"_attrs": {
"default": [],
"dynamic": false,
"help": null,
"required": "True",
"valid_type": "None"
},
"name": {
"_attrs": {
"default": [],
"dynamic": false,
"help": null,
"required": "True",
"valid_type": "None"
},
"spaced": {
"_attrs": {
"default": [],
"dynamic": false,
"help": null,
"required": "True",
"valid_type": "None"
},
"input": {
"name": "input",
"non_db": "False",
"required": "True"
}
}
}
}
```
When constructing this process, or any process, the inputs are expected to map exactly onto this namespace.
So for example, `inputs = {'name': {'spaced': {'input': 1}}}` would be valid.
The same principles go for outputs and they should be registered in such a way that it maps one-to-one on the port namespace.
For the duration of the process, the inputs and outputs stay organized in this nested hierarchy.
However, finally the process needs to be represented in the provenance graph by a node, and the inputs and outputs should be linked up.
The link label, which is the only thing that can be used to represent the namespacing, is a string and therefore necessarily one-dimensional.
The nested namespace therefore needs to be mapped onto this flat space.
Currently, the namespaces are just concatenated with underscores.
So the input node in the example would get the link label `name_spaced_input`.
However, now it becomes indistinguishable from an input that was passed to a process with the name `name_spaced_input` in the top level namespace.
We should clearly define the rules for the mapping between nested namespaces and flat link labels.
This should also keep in mind that if there is any risk of overlapping labels when expanding namespaces.
For example, in the current situation, what would happen for the following input port namespace:
```
class NestedProcess(Process):
@classmethod
def define(cls, spec):
super(NestedProcess, cls).define(spec)
spec.input('name.spaced.input')
spec.input('name_spaced_output')
```
This is currently legal as far as the process spec is concerned.
The first is a doubly nested namespace and the latter is a single port in the top level namespace.
However, the link labels that will be generated will overlap, making it impossible to link both inputs to the node.
|
process
|
formalize the mapping between nested process port namespaces to flat link labels the process and processnode duality forces us to define a mapping between the nested port namespaces of the process and the flat link label hierarchy of the corresponding node in the provenance graph a process can define a nested port namespace through its spec class nestedprocess process classmethod def define cls spec super nestedprocess cls define spec spec input name spaced input spec output name spaced output if we print the inputs port namespace we get the following in print nestedprocess spec inputs attrs default dynamic false help null required true valid type none name attrs default dynamic false help null required true valid type none spaced attrs default dynamic false help null required true valid type none input name input non db false required true when constructing this process or any process the inputs are expected to map exactly onto this namespace so for example inputs name spaced input would be valid the same principles go for outputs and they should be registered in such a way that it maps one to one on the port namespace for the duration of the process the inputs and outputs stay organized in this nested hierarchy however finally the process needs to be represented in the provenance graph by a node and the inputs and outputs should be linked up the link label which is the only thing that can be used to represent the namespacing is a string and therefore necessarily one dimensional the nested namespace therefore needs to be mapped onto this flat space currently the namespaces are just concatenated with underscores so the input node in the example would get the link label name spaced input however now it becomes indistinguishable from an input that was passed to a process with the name name spaced input in the top level namespace we should clearly define the rules for the mapping between nested namespaces and flat link labels this should also keep in mind that if there is any risk of overlapping labels when expanding namespaces for example in the current situation what would happen for the following input port namespace class nestedprocess process classmethod def define cls spec super nestedprocess cls define spec spec input name spaced input spec input name spaced output this is currently legal as far as the process spec is concerned the first is a doubly nested namespace and the latter is a single port in the top level namespace however the link labels that will be generated will overlap making it impossible to link both inputs to the node
| 1
|
10,615
| 13,439,001,317
|
IssuesEvent
|
2020-09-07 19:43:23
|
timberio/vector
|
https://api.github.com/repos/timberio/vector
|
opened
|
New `strip` remap function
|
domain: mapping domain: processing type: feature
|
The `strip` remap function strips leading and trailing whitespace
## Example
Given this event:
```js
{
"message": "\t\tThis string has whitespace around it "
}
```
And this remap instruction set:
```
.message = strip(.message)
```
Would result in:
```js
{
"message": "This string has whitespace around it"
}
```
## Requirements
- [ ] Strips leading _and_ trailing whitespace.
- [ ] Splits on all whitespace as defined by [Unicode whitespace character](https://en.wikipedia.org/wiki/Unicode_character_property#Whitespace).
|
1.0
|
New `strip` remap function - The `strip` remap function strips leading and trailing whitespace
## Example
Given this event:
```js
{
"message": "\t\tThis string has whitespace around it "
}
```
And this remap instruction set:
```
.message = strip(.message)
```
Would result in:
```js
{
"message": "This string has whitespace around it"
}
```
## Requirements
- [ ] Strips leading _and_ trailing whitespace.
- [ ] Splits on all whitespace as defined by [Unicode whitespace character](https://en.wikipedia.org/wiki/Unicode_character_property#Whitespace).
|
process
|
new strip remap function the strip remap function strips leading and trailing whitespace example given this event js message t tthis string has whitespace around it and this remap instruction set message strip message would result in js message this string has whitespace around it requirements strips leading and trailing whitespace splits on all whitespace as defined by
| 1
|
13,191
| 15,614,048,913
|
IssuesEvent
|
2021-03-19 17:14:50
|
MicrosoftDocs/azure-devops-docs
|
https://api.github.com/repos/MicrosoftDocs/azure-devops-docs
|
closed
|
How to reference a VM in an Environment?
|
Pri2 devops-cicd-process/tech devops/prod doc-enhancement
|
The page on registering VMs into Environments speaks about the registration but nowhere do I see any examples of how my commands being run against my environment are supposed to reference the "current" VM they are supposed to run against. This sounds like some vital missing information.
If I register N virtual machines in an Environment, I expect to be able to reference them in my steps. In fact, I expect to be able to execute my steps on those virtual machines. At the moment, how to do this and whether this is even possible remains unclear.
The feature is very vaguely described, so it is possible I misunderstand how it is meant to be used and what I say above makes no sense. In that case, the feature should be described more clearly.
---
#### Document Details
⚠ *Do not edit this section. It is required for docs.microsoft.com ➟ GitHub issue linking.*
* ID: 91d0d31f-81ee-c024-db7e-daddbf525f71
* Version Independent ID: 330f1649-386c-d0aa-5f96-b8343a1480d3
* Content: [Environment - Virtual machine resource - Azure Pipelines](https://docs.microsoft.com/en-us/azure/devops/pipelines/process/environments-virtual-machines?view=azure-devops)
* Content Source: [docs/pipelines/process/environments-virtual-machines.md](https://github.com/MicrosoftDocs/azure-devops-docs/blob/master/docs/pipelines/process/environments-virtual-machines.md)
* Product: **devops**
* Technology: **devops-cicd-process**
* GitHub Login: @juliakm
* Microsoft Alias: **jukullam**
|
1.0
|
How to reference a VM in an Environment? - The page on registering VMs into Environments speaks about the registration but nowhere do I see any examples of how my commands being run against my environment are supposed to reference the "current" VM they are supposed to run against. This sounds like some vital missing information.
If I register N virtual machines in an Environment, I expect to be able to reference them in my steps. In fact, I expect to be able to execute my steps on those virtual machines. At the moment, how to do this and whether this is even possible remains unclear.
The feature is very vaguely described, so it is possible I misunderstand how it is meant to be used and what I say above makes no sense. In that case, the feature should be described more clearly.
---
#### Document Details
⚠ *Do not edit this section. It is required for docs.microsoft.com ➟ GitHub issue linking.*
* ID: 91d0d31f-81ee-c024-db7e-daddbf525f71
* Version Independent ID: 330f1649-386c-d0aa-5f96-b8343a1480d3
* Content: [Environment - Virtual machine resource - Azure Pipelines](https://docs.microsoft.com/en-us/azure/devops/pipelines/process/environments-virtual-machines?view=azure-devops)
* Content Source: [docs/pipelines/process/environments-virtual-machines.md](https://github.com/MicrosoftDocs/azure-devops-docs/blob/master/docs/pipelines/process/environments-virtual-machines.md)
* Product: **devops**
* Technology: **devops-cicd-process**
* GitHub Login: @juliakm
* Microsoft Alias: **jukullam**
|
process
|
how to reference a vm in an environment the page on registering vms into environments speaks about the registration but nowhere do i see any examples of how my commands being run against my environment are supposed to reference the current vm they are supposed to run against this sounds like some vital missing information if i register n virtual machines in an environment i expect to be able to reference them in my steps in fact i expect to be able to execute my steps on those virtual machines at the moment how to do this and whether this is even possible remains unclear the feature is very vaguely described so it is possible i misunderstand how it is meant to be used and what i say above makes no sense in that case the feature should be described more clearly document details ⚠ do not edit this section it is required for docs microsoft com ➟ github issue linking id version independent id content content source product devops technology devops cicd process github login juliakm microsoft alias jukullam
| 1
|
15,530
| 19,703,295,102
|
IssuesEvent
|
2022-01-12 18:54:12
|
googleapis/python-cloud-common
|
https://api.github.com/repos/googleapis/python-cloud-common
|
opened
|
Your .repo-metadata.json file has a problem 🤒
|
type: process repo-metadata: lint
|
You have a problem with your .repo-metadata.json file:
Result of scan 📈:
* api_shortname field missing from .repo-metadata.json
☝️ Once you correct these problems, you can close this issue.
Reach out to **go/github-automation** if you have any questions.
|
1.0
|
Your .repo-metadata.json file has a problem 🤒 - You have a problem with your .repo-metadata.json file:
Result of scan 📈:
* api_shortname field missing from .repo-metadata.json
☝️ Once you correct these problems, you can close this issue.
Reach out to **go/github-automation** if you have any questions.
|
process
|
your repo metadata json file has a problem 🤒 you have a problem with your repo metadata json file result of scan 📈 api shortname field missing from repo metadata json ☝️ once you correct these problems you can close this issue reach out to go github automation if you have any questions
| 1
|
672,106
| 22,787,598,709
|
IssuesEvent
|
2022-07-09 13:57:12
|
cleverage/responsive-video-background
|
https://api.github.com/repos/cleverage/responsive-video-background
|
opened
|
Use IntersectionObserver to play the video only when visible in the viewport?
|
type: enhancement 🧗♂️ priority: medium 🟡 feature: WebPerf 🏎 feature: UX (User eXperience) 🤹♂️
|
Would it better for responsiveness of the page, CPU, RAM, batteries?
|
1.0
|
Use IntersectionObserver to play the video only when visible in the viewport? - Would it better for responsiveness of the page, CPU, RAM, batteries?
|
non_process
|
use intersectionobserver to play the video only when visible in the viewport would it better for responsiveness of the page cpu ram batteries
| 0
|
5,200
| 7,974,440,583
|
IssuesEvent
|
2018-07-17 05:32:44
|
rubberduck-vba/Rubberduck
|
https://api.github.com/repos/rubberduck-vba/Rubberduck
|
closed
|
ParserError reports incorrect location
|
bug parse-tree-processing status-deferred user-interface
|
As per #1660 when the ParseError is inspected, the line and column number appear to be relative to the raw text file (which would seemingly include the VB Attributes). Could it be that Line/Column values are out of Sync in the Declarations section?
|
1.0
|
ParserError reports incorrect location - As per #1660 when the ParseError is inspected, the line and column number appear to be relative to the raw text file (which would seemingly include the VB Attributes). Could it be that Line/Column values are out of Sync in the Declarations section?
|
process
|
parsererror reports incorrect location as per when the parseerror is inspected the line and column number appear to be relative to the raw text file which would seemingly include the vb attributes could it be that line column values are out of sync in the declarations section
| 1
|
302,707
| 9,285,506,004
|
IssuesEvent
|
2019-03-21 07:26:18
|
kowala-tech/kcoin
|
https://api.github.com/repos/kowala-tech/kcoin
|
closed
|
Put back -race flag
|
low priority
|
We had to disable temporarily the `-race` flag in the tests because there's an issue with alpine linux. Would be good to bring it back
ref: a3b2bd42259b46698510be33cc2ec78103ae2639
|
1.0
|
Put back -race flag - We had to disable temporarily the `-race` flag in the tests because there's an issue with alpine linux. Would be good to bring it back
ref: a3b2bd42259b46698510be33cc2ec78103ae2639
|
non_process
|
put back race flag we had to disable temporarily the race flag in the tests because there s an issue with alpine linux would be good to bring it back ref
| 0
|
9,504
| 12,492,544,245
|
IssuesEvent
|
2020-06-01 07:25:59
|
lutraconsulting/qgis-crayfish-plugin
|
https://api.github.com/repos/lutraconsulting/qgis-crayfish-plugin
|
closed
|
Export vectors to points
|
enhancement feedback processing
|
a processing alg to export vector components to points:
- a column containing magnitude
- a column containing direction
- an option for alg for the re-sampling of the output points
|
1.0
|
Export vectors to points - a processing alg to export vector components to points:
- a column containing magnitude
- a column containing direction
- an option for alg for the re-sampling of the output points
|
process
|
export vectors to points a processing alg to export vector components to points a column containing magnitude a column containing direction an option for alg for the re sampling of the output points
| 1
|
438,891
| 30,667,853,888
|
IssuesEvent
|
2023-07-25 19:45:38
|
onnx/onnx-mlir
|
https://api.github.com/repos/onnx/onnx-mlir
|
opened
|
SupportedONNXOps-cpu.md missing unsupported attributes
|
documentation
|
Resize at https://github.com/onnx/onnx-mlir/blob/main/docs/SupportedONNXOps-cpu.md mentions a few unsupported modes but looking at the code, it appears only specific modes are supported. Documentation should be updated.
For example, based on the code I don't think `align_corners` would be valid for onnx-mlir but it exists in the ONNX Resize Op (https://github.com/onnx/onnx/blob/main/docs/Operators.md#Resize)
|
1.0
|
SupportedONNXOps-cpu.md missing unsupported attributes - Resize at https://github.com/onnx/onnx-mlir/blob/main/docs/SupportedONNXOps-cpu.md mentions a few unsupported modes but looking at the code, it appears only specific modes are supported. Documentation should be updated.
For example, based on the code I don't think `align_corners` would be valid for onnx-mlir but it exists in the ONNX Resize Op (https://github.com/onnx/onnx/blob/main/docs/Operators.md#Resize)
|
non_process
|
supportedonnxops cpu md missing unsupported attributes resize at mentions a few unsupported modes but looking at the code it appears only specific modes are supported documentation should be updated for example based on the code i don t think align corners would be valid for onnx mlir but it exists in the onnx resize op
| 0
|
8,399
| 11,567,219,779
|
IssuesEvent
|
2020-02-20 13:57:09
|
prisma/prisma-engines
|
https://api.github.com/repos/prisma/prisma-engines
|
opened
|
Introduce consistent version output amongst engines
|
kind/improvement process/candidate
|
The output of the version command is inconsistent right now:
```
./prisma --version
prisma HASH
./introspection-engine --version
HASH
./migration-engine --version
HASH
```
We should either always print the engine name or never. For the frontend use-cases not printing the engine name would be sufficient.
|
1.0
|
Introduce consistent version output amongst engines - The output of the version command is inconsistent right now:
```
./prisma --version
prisma HASH
./introspection-engine --version
HASH
./migration-engine --version
HASH
```
We should either always print the engine name or never. For the frontend use-cases not printing the engine name would be sufficient.
|
process
|
introduce consistent version output amongst engines the output of the version command is inconsistent right now prisma version prisma hash introspection engine version hash migration engine version hash we should either always print the engine name or never for the frontend use cases not printing the engine name would be sufficient
| 1
|
24,970
| 11,134,909,421
|
IssuesEvent
|
2019-12-20 13:06:25
|
BytecodeAgency/Coding-Standards
|
https://api.github.com/repos/BytecodeAgency/Coding-Standards
|
closed
|
WS-2015-0049 (Medium) detected in marked-0.3.19.js
|
security vulnerability
|
## WS-2015-0049 - Medium Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>marked-0.3.19.js</b></p></summary>
<p>A markdown parser built for speed</p>
<p>Library home page: <a href="https://cdnjs.cloudflare.com/ajax/libs/marked/0.3.19/marked.js">https://cdnjs.cloudflare.com/ajax/libs/marked/0.3.19/marked.js</a></p>
<p>Path to dependency file: /tmp/ws-scm/Coding-Standards/files/node_modules/marked/www/demo.html</p>
<p>Path to vulnerable library: /Coding-Standards/files/node_modules/marked/www/../lib/marked.js,/Coding-Standards/node_modules/marked/www/../lib/marked.js</p>
<p>
Dependency Hierarchy:
- :x: **marked-0.3.19.js** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/BytecodeAgency/Coding-Standards/commit/84fc0e4dc596dbe18c0add7b24b7c0e37366468d">84fc0e4dc596dbe18c0add7b24b7c0e37366468d</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
Versions 0.3.2 and earlier of marked are affected by a cross-site scripting vulnerability even when sanitize:true is set.
<p>Publish Date: 2019-03-17
<p>URL: <a href=https://github.com/markedjs/marked/commit/fc372d1c6293267722e33f2719d57cebd67b3da1>WS-2015-0049</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 2 Score Details (<b>5.0</b>)</summary>
<p>
Base Score Metrics not available</p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://www.npmjs.com/advisories/24/versions">https://www.npmjs.com/advisories/24/versions</a></p>
<p>Release Date: 2019-03-17</p>
<p>Fix Resolution: 0.3.3</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
|
True
|
WS-2015-0049 (Medium) detected in marked-0.3.19.js - ## WS-2015-0049 - Medium Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>marked-0.3.19.js</b></p></summary>
<p>A markdown parser built for speed</p>
<p>Library home page: <a href="https://cdnjs.cloudflare.com/ajax/libs/marked/0.3.19/marked.js">https://cdnjs.cloudflare.com/ajax/libs/marked/0.3.19/marked.js</a></p>
<p>Path to dependency file: /tmp/ws-scm/Coding-Standards/files/node_modules/marked/www/demo.html</p>
<p>Path to vulnerable library: /Coding-Standards/files/node_modules/marked/www/../lib/marked.js,/Coding-Standards/node_modules/marked/www/../lib/marked.js</p>
<p>
Dependency Hierarchy:
- :x: **marked-0.3.19.js** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/BytecodeAgency/Coding-Standards/commit/84fc0e4dc596dbe18c0add7b24b7c0e37366468d">84fc0e4dc596dbe18c0add7b24b7c0e37366468d</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
Versions 0.3.2 and earlier of marked are affected by a cross-site scripting vulnerability even when sanitize:true is set.
<p>Publish Date: 2019-03-17
<p>URL: <a href=https://github.com/markedjs/marked/commit/fc372d1c6293267722e33f2719d57cebd67b3da1>WS-2015-0049</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 2 Score Details (<b>5.0</b>)</summary>
<p>
Base Score Metrics not available</p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://www.npmjs.com/advisories/24/versions">https://www.npmjs.com/advisories/24/versions</a></p>
<p>Release Date: 2019-03-17</p>
<p>Fix Resolution: 0.3.3</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
|
non_process
|
ws medium detected in marked js ws medium severity vulnerability vulnerable library marked js a markdown parser built for speed library home page a href path to dependency file tmp ws scm coding standards files node modules marked www demo html path to vulnerable library coding standards files node modules marked www lib marked js coding standards node modules marked www lib marked js dependency hierarchy x marked js vulnerable library found in head commit a href vulnerability details versions and earlier of marked are affected by a cross site scripting vulnerability even when sanitize true is set publish date url a href cvss score details base score metrics not available suggested fix type upgrade version origin a href release date fix resolution step up your open source security game with whitesource
| 0
|
18,594
| 24,569,989,823
|
IssuesEvent
|
2022-10-13 07:53:10
|
GoogleCloudPlatform/fda-mystudies
|
https://api.github.com/repos/GoogleCloudPlatform/fda-mystudies
|
closed
|
[Android] App is crashing in the below scenario
|
Bug Blocker P0 Android Process: Fixed Process: Tested QA Process: Tested dev
|
**Steps:**
1. Install the android app and signup
2. Enroll into the study
3. Navigate to resources
4. Open any pdf resources and lock the phone once after pdf resouce is loaded
5. Now, go back to SB
6. Publish resources from the SB
7. Unlock the mobile device now
8. And navigate from Resources to Dashboard and again to Study activities screen
9. Refresh the screen and Verify
**AR:** App is crashing in the above scenario.
**ER:** App should not crash in the above scenario and participants should remain in the study activities screen only.
**Note:**
App is restarting in some of the devices and navigating to study list screen for the above scenario
|
3.0
|
[Android] App is crashing in the below scenario - **Steps:**
1. Install the android app and signup
2. Enroll into the study
3. Navigate to resources
4. Open any pdf resources and lock the phone once after pdf resouce is loaded
5. Now, go back to SB
6. Publish resources from the SB
7. Unlock the mobile device now
8. And navigate from Resources to Dashboard and again to Study activities screen
9. Refresh the screen and Verify
**AR:** App is crashing in the above scenario.
**ER:** App should not crash in the above scenario and participants should remain in the study activities screen only.
**Note:**
App is restarting in some of the devices and navigating to study list screen for the above scenario
|
process
|
app is crashing in the below scenario steps install the android app and signup enroll into the study navigate to resources open any pdf resources and lock the phone once after pdf resouce is loaded now go back to sb publish resources from the sb unlock the mobile device now and navigate from resources to dashboard and again to study activities screen refresh the screen and verify ar app is crashing in the above scenario er app should not crash in the above scenario and participants should remain in the study activities screen only note app is restarting in some of the devices and navigating to study list screen for the above scenario
| 1
|
14,186
| 17,069,994,467
|
IssuesEvent
|
2021-07-07 12:11:57
|
buddyboss/buddyboss-platform
|
https://api.github.com/repos/buddyboss/buddyboss-platform
|
opened
|
WordPress 8.0 Compatibility
|
feature: enhancement integration: compatible
|
Check Platform compatibility with WordPress v5.8
All features: https://wordpress.org/news/2021/07/the-month-in-wordpress-june-2021/
Theme.json : https://make.wordpress.org/test/2021/06/24/call-for-testing-thrive-with-theme-json/
IF we see issues specific to WP 8.0 then please note all issues here for someone to fix.
|
True
|
WordPress 8.0 Compatibility - Check Platform compatibility with WordPress v5.8
All features: https://wordpress.org/news/2021/07/the-month-in-wordpress-june-2021/
Theme.json : https://make.wordpress.org/test/2021/06/24/call-for-testing-thrive-with-theme-json/
IF we see issues specific to WP 8.0 then please note all issues here for someone to fix.
|
non_process
|
wordpress compatibility check platform compatibility with wordpress all features theme json if we see issues specific to wp then please note all issues here for someone to fix
| 0
|
179,861
| 14,724,351,781
|
IssuesEvent
|
2021-01-06 02:22:11
|
hapipal/hecks
|
https://api.github.com/repos/hapipal/hecks
|
opened
|
Add installation section to readme
|
documentation
|
Hecks' readme isn't consistent with the new readme format because it is missing an "installation" section. See toys as an example: https://github.com/hapipal/toys
|
1.0
|
Add installation section to readme - Hecks' readme isn't consistent with the new readme format because it is missing an "installation" section. See toys as an example: https://github.com/hapipal/toys
|
non_process
|
add installation section to readme hecks readme isn t consistent with the new readme format because it is missing an installation section see toys as an example
| 0
|
17,306
| 23,122,844,963
|
IssuesEvent
|
2022-07-28 00:18:23
|
mdsreq-fga-unb/2022.1-Meio-a-Meio
|
https://api.github.com/repos/mdsreq-fga-unb/2022.1-Meio-a-Meio
|
closed
|
Processo e procedimentos - Disciplinas
|
Processo de Desenvolvimento
|
**Descrição**
1. entrega de Definir a arquitetura: **Documento** -> que documento é esse?
2. Backlog sprint - não é entrega de Construção
|
1.0
|
Processo e procedimentos - Disciplinas - **Descrição**
1. entrega de Definir a arquitetura: **Documento** -> que documento é esse?
2. Backlog sprint - não é entrega de Construção
|
process
|
processo e procedimentos disciplinas descrição entrega de definir a arquitetura documento que documento é esse backlog sprint não é entrega de construção
| 1
|
2,483
| 5,256,871,368
|
IssuesEvent
|
2017-02-02 19:03:22
|
GoogleCloudPlatform/google-cloud-java
|
https://api.github.com/repos/GoogleCloudPlatform/google-cloud-java
|
opened
|
Convert from Maven to Gradle
|
release process
|
Reasons:
1. We can migrate the release process out of bash and into build scripts
2. We can migrate the snippet update process out of python and into build scripts
3. The rest of the team's Java repositories are in Gradle (`toolkit`, `gax-java`)
|
1.0
|
Convert from Maven to Gradle - Reasons:
1. We can migrate the release process out of bash and into build scripts
2. We can migrate the snippet update process out of python and into build scripts
3. The rest of the team's Java repositories are in Gradle (`toolkit`, `gax-java`)
|
process
|
convert from maven to gradle reasons we can migrate the release process out of bash and into build scripts we can migrate the snippet update process out of python and into build scripts the rest of the team s java repositories are in gradle toolkit gax java
| 1
|
102,286
| 16,558,787,646
|
IssuesEvent
|
2021-05-28 16:59:24
|
snowdensb/OrchardCore
|
https://api.github.com/repos/snowdensb/OrchardCore
|
opened
|
CVE-2018-20677 (Medium) detected in bootstrap-3.3.7.min.js
|
security vulnerability
|
## CVE-2018-20677 - Medium Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>bootstrap-3.3.7.min.js</b></p></summary>
<p>The most popular front-end framework for developing responsive, mobile first projects on the web.</p>
<p>Library home page: <a href="https://cdnjs.cloudflare.com/ajax/libs/twitter-bootstrap/3.3.7/js/bootstrap.min.js">https://cdnjs.cloudflare.com/ajax/libs/twitter-bootstrap/3.3.7/js/bootstrap.min.js</a></p>
<p>Path to dependency file: OrchardCore/src/OrchardCore.Modules/OrchardCore.AdminMenu/node_modules/fontawesome-iconpicker/index.html</p>
<p>Path to vulnerable library: OrchardCore/src/OrchardCore.Modules/OrchardCore.AdminMenu/node_modules/fontawesome-iconpicker/index.html</p>
<p>
Dependency Hierarchy:
- :x: **bootstrap-3.3.7.min.js** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/snowdensb/OrchardCore/commit/9ad2ef7c6d196ad393ed0ee0dd62fe3ec96b4c9e">9ad2ef7c6d196ad393ed0ee0dd62fe3ec96b4c9e</a></p>
<p>Found in base branch: <b>dev</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
In Bootstrap before 3.4.0, XSS is possible in the affix configuration target property.
<p>Publish Date: 2019-01-09
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2018-20677>CVE-2018-20677</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>6.1</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: Required
- Scope: Changed
- Impact Metrics:
- Confidentiality Impact: Low
- Integrity Impact: Low
- Availability Impact: None
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2018-20677">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2018-20677</a></p>
<p>Release Date: 2019-01-09</p>
<p>Fix Resolution: Bootstrap - v3.4.0;NorDroN.AngularTemplate - 0.1.6;Dynamic.NET.Express.ProjectTemplates - 0.8.0;dotnetng.template - 1.0.0.4;ZNxtApp.Core.Module.Theme - 1.0.9-Beta;JMeter - 5.0.0</p>
</p>
</details>
<p></p>
<!-- <REMEDIATE>{"isOpenPROnVulnerability":true,"isPackageBased":true,"isDefaultBranch":true,"packages":[{"packageType":"JavaScript","packageName":"twitter-bootstrap","packageVersion":"3.3.7","packageFilePaths":["/src/OrchardCore.Modules/OrchardCore.AdminMenu/node_modules/fontawesome-iconpicker/index.html"],"isTransitiveDependency":false,"dependencyTree":"twitter-bootstrap:3.3.7","isMinimumFixVersionAvailable":true,"minimumFixVersion":"Bootstrap - v3.4.0;NorDroN.AngularTemplate - 0.1.6;Dynamic.NET.Express.ProjectTemplates - 0.8.0;dotnetng.template - 1.0.0.4;ZNxtApp.Core.Module.Theme - 1.0.9-Beta;JMeter - 5.0.0"}],"baseBranches":["dev"],"vulnerabilityIdentifier":"CVE-2018-20677","vulnerabilityDetails":"In Bootstrap before 3.4.0, XSS is possible in the affix configuration target property.","vulnerabilityUrl":"https://vuln.whitesourcesoftware.com/vulnerability/CVE-2018-20677","cvss3Severity":"medium","cvss3Score":"6.1","cvss3Metrics":{"A":"None","AC":"Low","PR":"None","S":"Changed","C":"Low","UI":"Required","AV":"Network","I":"Low"},"extraData":{}}</REMEDIATE> -->
|
True
|
CVE-2018-20677 (Medium) detected in bootstrap-3.3.7.min.js - ## CVE-2018-20677 - Medium Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>bootstrap-3.3.7.min.js</b></p></summary>
<p>The most popular front-end framework for developing responsive, mobile first projects on the web.</p>
<p>Library home page: <a href="https://cdnjs.cloudflare.com/ajax/libs/twitter-bootstrap/3.3.7/js/bootstrap.min.js">https://cdnjs.cloudflare.com/ajax/libs/twitter-bootstrap/3.3.7/js/bootstrap.min.js</a></p>
<p>Path to dependency file: OrchardCore/src/OrchardCore.Modules/OrchardCore.AdminMenu/node_modules/fontawesome-iconpicker/index.html</p>
<p>Path to vulnerable library: OrchardCore/src/OrchardCore.Modules/OrchardCore.AdminMenu/node_modules/fontawesome-iconpicker/index.html</p>
<p>
Dependency Hierarchy:
- :x: **bootstrap-3.3.7.min.js** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/snowdensb/OrchardCore/commit/9ad2ef7c6d196ad393ed0ee0dd62fe3ec96b4c9e">9ad2ef7c6d196ad393ed0ee0dd62fe3ec96b4c9e</a></p>
<p>Found in base branch: <b>dev</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
In Bootstrap before 3.4.0, XSS is possible in the affix configuration target property.
<p>Publish Date: 2019-01-09
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2018-20677>CVE-2018-20677</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>6.1</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: Required
- Scope: Changed
- Impact Metrics:
- Confidentiality Impact: Low
- Integrity Impact: Low
- Availability Impact: None
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2018-20677">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2018-20677</a></p>
<p>Release Date: 2019-01-09</p>
<p>Fix Resolution: Bootstrap - v3.4.0;NorDroN.AngularTemplate - 0.1.6;Dynamic.NET.Express.ProjectTemplates - 0.8.0;dotnetng.template - 1.0.0.4;ZNxtApp.Core.Module.Theme - 1.0.9-Beta;JMeter - 5.0.0</p>
</p>
</details>
<p></p>
<!-- <REMEDIATE>{"isOpenPROnVulnerability":true,"isPackageBased":true,"isDefaultBranch":true,"packages":[{"packageType":"JavaScript","packageName":"twitter-bootstrap","packageVersion":"3.3.7","packageFilePaths":["/src/OrchardCore.Modules/OrchardCore.AdminMenu/node_modules/fontawesome-iconpicker/index.html"],"isTransitiveDependency":false,"dependencyTree":"twitter-bootstrap:3.3.7","isMinimumFixVersionAvailable":true,"minimumFixVersion":"Bootstrap - v3.4.0;NorDroN.AngularTemplate - 0.1.6;Dynamic.NET.Express.ProjectTemplates - 0.8.0;dotnetng.template - 1.0.0.4;ZNxtApp.Core.Module.Theme - 1.0.9-Beta;JMeter - 5.0.0"}],"baseBranches":["dev"],"vulnerabilityIdentifier":"CVE-2018-20677","vulnerabilityDetails":"In Bootstrap before 3.4.0, XSS is possible in the affix configuration target property.","vulnerabilityUrl":"https://vuln.whitesourcesoftware.com/vulnerability/CVE-2018-20677","cvss3Severity":"medium","cvss3Score":"6.1","cvss3Metrics":{"A":"None","AC":"Low","PR":"None","S":"Changed","C":"Low","UI":"Required","AV":"Network","I":"Low"},"extraData":{}}</REMEDIATE> -->
|
non_process
|
cve medium detected in bootstrap min js cve medium severity vulnerability vulnerable library bootstrap min js the most popular front end framework for developing responsive mobile first projects on the web library home page a href path to dependency file orchardcore src orchardcore modules orchardcore adminmenu node modules fontawesome iconpicker index html path to vulnerable library orchardcore src orchardcore modules orchardcore adminmenu node modules fontawesome iconpicker index html dependency hierarchy x bootstrap min js vulnerable library found in head commit a href found in base branch dev vulnerability details in bootstrap before xss is possible in the affix configuration target property publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction required scope changed impact metrics confidentiality impact low integrity impact low availability impact none for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution bootstrap nordron angulartemplate dynamic net express projecttemplates dotnetng template znxtapp core module theme beta jmeter isopenpronvulnerability true ispackagebased true isdefaultbranch true packages istransitivedependency false dependencytree twitter bootstrap isminimumfixversionavailable true minimumfixversion bootstrap nordron angulartemplate dynamic net express projecttemplates dotnetng template znxtapp core module theme beta jmeter basebranches vulnerabilityidentifier cve vulnerabilitydetails in bootstrap before xss is possible in the affix configuration target property vulnerabilityurl
| 0
|
5,986
| 8,805,374,266
|
IssuesEvent
|
2018-12-26 19:13:52
|
dita-ot/dita-ot
|
https://api.github.com/repos/dita-ot/dita-ot
|
closed
|
index paths incorrect when @copy-to topicrefs are in nested maps
|
bug preprocess priority/medium stale
|
A bookmap in a top-level folder contains maprefs to maps stored in child folders. These submaps contain topics `@copy-to` in some topicrefs. When the bookmap is built, the index link paths to the `@copy-to` topicrefs are incorrect. The `@copy-to` files are generated in the correct child folders location(s), but the index links indicate that the files are at the top level of the output directory.
|
1.0
|
index paths incorrect when @copy-to topicrefs are in nested maps - A bookmap in a top-level folder contains maprefs to maps stored in child folders. These submaps contain topics `@copy-to` in some topicrefs. When the bookmap is built, the index link paths to the `@copy-to` topicrefs are incorrect. The `@copy-to` files are generated in the correct child folders location(s), but the index links indicate that the files are at the top level of the output directory.
|
process
|
index paths incorrect when copy to topicrefs are in nested maps a bookmap in a top level folder contains maprefs to maps stored in child folders these submaps contain topics copy to in some topicrefs when the bookmap is built the index link paths to the copy to topicrefs are incorrect the copy to files are generated in the correct child folders location s but the index links indicate that the files are at the top level of the output directory
| 1
|
22,270
| 30,821,947,079
|
IssuesEvent
|
2023-08-01 16:59:47
|
department-of-veterans-affairs/va.gov-team
|
https://api.github.com/repos/department-of-veterans-affairs/va.gov-team
|
closed
|
INTAKE EPIC | Profile | Update Claim Status to Display Pending Claim for a Disability Rating
|
Epic authenticated-experience profile intake-process-ae
|
## Background
We know the disability rating is very important to veterans. When they have a claim pending in order to obtain/maintain that disability rating we may want to consider adding that to the nametag.
```[tasklist]
### Tasks
- [ ] https://github.com/department-of-veterans-affairs/va.gov-team/issues/54047
- [ ] https://github.com/department-of-veterans-affairs/va.gov-team/issues/60248
- [ ] https://github.com/department-of-veterans-affairs/va.gov-team/issues/60489
```
|
1.0
|
INTAKE EPIC | Profile | Update Claim Status to Display Pending Claim for a Disability Rating - ## Background
We know the disability rating is very important to veterans. When they have a claim pending in order to obtain/maintain that disability rating we may want to consider adding that to the nametag.
```[tasklist]
### Tasks
- [ ] https://github.com/department-of-veterans-affairs/va.gov-team/issues/54047
- [ ] https://github.com/department-of-veterans-affairs/va.gov-team/issues/60248
- [ ] https://github.com/department-of-veterans-affairs/va.gov-team/issues/60489
```
|
process
|
intake epic profile update claim status to display pending claim for a disability rating background we know the disability rating is very important to veterans when they have a claim pending in order to obtain maintain that disability rating we may want to consider adding that to the nametag tasks
| 1
|
56,152
| 6,963,906,819
|
IssuesEvent
|
2017-12-08 19:20:48
|
adamdriscoll/poshprotools
|
https://api.github.com/repos/adamdriscoll/poshprotools
|
closed
|
WinForms - Font Error
|
bug forms-designer
|
After some manually changes in form designer.ps1 in powershell-ise get my next error.
Loading in VS adding extra stuff and the fontsize is not changing and get error later in powershell
Two lines edited manually to set background to button of end of file are ok and no error from powershell or VS. ( $label1.Font was never changed manually only set earlier VS forms this project/files besides those lines are on begining of file).
In powershell:
`New-Object : Cannot convert argument "2", with value: "25", for "Font" to type "System.Drawing.FontStyle": "Cannot convert value "25" to type "System.Drawing.FontStyle" due to enumeration va
lues that are not valid. Specify one of the following enumeration values and try again. The possible enumeration values are "Regular;Bold;Italic;Underline;Strikeout"."
At C:\Users\test\source\repos\PowerShellProject8\PowerShellProject8\Form1.designer.ps1:38 char:17
+ ... el2.Font = (New-Object -TypeName System.Drawing.Font -ArgumentList @( ...
+ ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+ CategoryInfo : InvalidOperation: (:) [New-Object], MethodException
+ FullyQualifiedErrorId : ConstructorInvokedThrowException,Microsoft.PowerShell.Commands.NewObjectCommand`
In VS no errors:
In Forms Design - the font size 11 bold.
in debuggin option - the font size and thickness is default - no errors in console.
|
1.0
|
WinForms - Font Error - After some manually changes in form designer.ps1 in powershell-ise get my next error.
Loading in VS adding extra stuff and the fontsize is not changing and get error later in powershell
Two lines edited manually to set background to button of end of file are ok and no error from powershell or VS. ( $label1.Font was never changed manually only set earlier VS forms this project/files besides those lines are on begining of file).
In powershell:
`New-Object : Cannot convert argument "2", with value: "25", for "Font" to type "System.Drawing.FontStyle": "Cannot convert value "25" to type "System.Drawing.FontStyle" due to enumeration va
lues that are not valid. Specify one of the following enumeration values and try again. The possible enumeration values are "Regular;Bold;Italic;Underline;Strikeout"."
At C:\Users\test\source\repos\PowerShellProject8\PowerShellProject8\Form1.designer.ps1:38 char:17
+ ... el2.Font = (New-Object -TypeName System.Drawing.Font -ArgumentList @( ...
+ ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+ CategoryInfo : InvalidOperation: (:) [New-Object], MethodException
+ FullyQualifiedErrorId : ConstructorInvokedThrowException,Microsoft.PowerShell.Commands.NewObjectCommand`
In VS no errors:
In Forms Design - the font size 11 bold.
in debuggin option - the font size and thickness is default - no errors in console.
|
non_process
|
winforms font error after some manually changes in form designer in powershell ise get my next error loading in vs adding extra stuff and the fontsize is not changing and get error later in powershell two lines edited manually to set background to button of end of file are ok and no error from powershell or vs font was never changed manually only set earlier vs forms this project files besides those lines are on begining of file in powershell new object cannot convert argument with value for font to type system drawing fontstyle cannot convert value to type system drawing fontstyle due to enumeration va lues that are not valid specify one of the following enumeration values and try again the possible enumeration values are regular bold italic underline strikeout at c users test source repos designer char font new object typename system drawing font argumentlist categoryinfo invalidoperation methodexception fullyqualifiederrorid constructorinvokedthrowexception microsoft powershell commands newobjectcommand in vs no errors in forms design the font size bold in debuggin option the font size and thickness is default no errors in console
| 0
|
9,869
| 12,882,221,724
|
IssuesEvent
|
2020-07-12 15:54:03
|
kubeflow/kubeflow
|
https://api.github.com/repos/kubeflow/kubeflow
|
closed
|
Ready 1.0 Blog and Webinar decks
|
area/docs kind/feature kind/process lifecycle/stale priority/p0
|
/kind feature
**Why you need this feature:**
Following Kubeflow release process, we need a 1.0 announcement blog post and a 1.0 Webinar deck for the upcoming release.
**Describe the solution you'd like:**
Here are the relevant drafts that authors should review:
- [Blog](https://docs.google.com/document/d/11h3OohwrQtRKtDoH1_0e2tMM_rFh5PBBKJRJTgTfjqg/edit?ts=5e3211c0)
- [Webinar Deck](https://docs.google.com/presentation/d/1B0fwMypAyLBrHlyHIc6zdBk9TDGKZKcbc5XKpr8cVfM/edit#slide=id.g6e6cfc68dd_2_27)
If you are one of the authors and unable to access, please request access.
|
1.0
|
Ready 1.0 Blog and Webinar decks - /kind feature
**Why you need this feature:**
Following Kubeflow release process, we need a 1.0 announcement blog post and a 1.0 Webinar deck for the upcoming release.
**Describe the solution you'd like:**
Here are the relevant drafts that authors should review:
- [Blog](https://docs.google.com/document/d/11h3OohwrQtRKtDoH1_0e2tMM_rFh5PBBKJRJTgTfjqg/edit?ts=5e3211c0)
- [Webinar Deck](https://docs.google.com/presentation/d/1B0fwMypAyLBrHlyHIc6zdBk9TDGKZKcbc5XKpr8cVfM/edit#slide=id.g6e6cfc68dd_2_27)
If you are one of the authors and unable to access, please request access.
|
process
|
ready blog and webinar decks kind feature why you need this feature following kubeflow release process we need a announcement blog post and a webinar deck for the upcoming release describe the solution you d like here are the relevant drafts that authors should review if you are one of the authors and unable to access please request access
| 1
|
21,671
| 30,118,735,670
|
IssuesEvent
|
2023-06-30 13:36:49
|
NixOS/nix
|
https://api.github.com/repos/NixOS/nix
|
opened
|
Make sure we can fix the parser's bad list/number interactions
|
bug feature language process
|
**Is your feature request related to a problem? Please describe.**
We have to obscure syntaxes that can be argued to be bugs; certainly one of them.
However, changing the tokenizer/parser for them technically means breaking the language.
- #8605
- `[ 01.1 ] == [ 1 0.1 ]`
- #7695
- `[ 0xff ] == [ 0 xff ]` if `xff` is in scope
I think bug for bug compatibility is valuable, but also costly. In this case the cost/benefit of bug for bug compatibility seems off, and we may have a way out.
**Describe the solution you'd like**
Language versioning would be great, but going this route also means that we'd have to indefinitely support the bad syntax.
So regardless of the language versioning idea, we can run an experiment to verify that the community does not rely on the bad syntax, and then "break" the syntax to no ill effect.
Benefits:
- clean up the implementation; less maintenance overhead
- the default language version has better syntax
Method:
- reject the bad syntaxes in the lexer and/or parser. Ask users to open an issue in the error message.
- wait
- if nothing is reported, fix and improve the lexer and/or parser as appropriate depending on whether an issue was created
**Describe alternatives you've considered**
Wait for language versioning and maintain the bad syntax forever even if there's no benefit of doing so.
**Additional context**
**Priorities**
Add :+1: to [issues you find important](https://github.com/NixOS/nix/issues?q=is%3Aissue+is%3Aopen+sort%3Areactions-%2B1-desc).
|
1.0
|
Make sure we can fix the parser's bad list/number interactions - **Is your feature request related to a problem? Please describe.**
We have to obscure syntaxes that can be argued to be bugs; certainly one of them.
However, changing the tokenizer/parser for them technically means breaking the language.
- #8605
- `[ 01.1 ] == [ 1 0.1 ]`
- #7695
- `[ 0xff ] == [ 0 xff ]` if `xff` is in scope
I think bug for bug compatibility is valuable, but also costly. In this case the cost/benefit of bug for bug compatibility seems off, and we may have a way out.
**Describe the solution you'd like**
Language versioning would be great, but going this route also means that we'd have to indefinitely support the bad syntax.
So regardless of the language versioning idea, we can run an experiment to verify that the community does not rely on the bad syntax, and then "break" the syntax to no ill effect.
Benefits:
- clean up the implementation; less maintenance overhead
- the default language version has better syntax
Method:
- reject the bad syntaxes in the lexer and/or parser. Ask users to open an issue in the error message.
- wait
- if nothing is reported, fix and improve the lexer and/or parser as appropriate depending on whether an issue was created
**Describe alternatives you've considered**
Wait for language versioning and maintain the bad syntax forever even if there's no benefit of doing so.
**Additional context**
**Priorities**
Add :+1: to [issues you find important](https://github.com/NixOS/nix/issues?q=is%3Aissue+is%3Aopen+sort%3Areactions-%2B1-desc).
|
process
|
make sure we can fix the parser s bad list number interactions is your feature request related to a problem please describe we have to obscure syntaxes that can be argued to be bugs certainly one of them however changing the tokenizer parser for them technically means breaking the language if xff is in scope i think bug for bug compatibility is valuable but also costly in this case the cost benefit of bug for bug compatibility seems off and we may have a way out describe the solution you d like language versioning would be great but going this route also means that we d have to indefinitely support the bad syntax so regardless of the language versioning idea we can run an experiment to verify that the community does not rely on the bad syntax and then break the syntax to no ill effect benefits clean up the implementation less maintenance overhead the default language version has better syntax method reject the bad syntaxes in the lexer and or parser ask users to open an issue in the error message wait if nothing is reported fix and improve the lexer and or parser as appropriate depending on whether an issue was created describe alternatives you ve considered wait for language versioning and maintain the bad syntax forever even if there s no benefit of doing so additional context priorities add to
| 1
|
20,793
| 27,537,245,829
|
IssuesEvent
|
2023-03-07 05:05:33
|
MicrosoftDocs/azure-docs
|
https://api.github.com/repos/MicrosoftDocs/azure-docs
|
closed
|
Link to page for creating user-assigned managed id is incorrect
|
automation/svc triaged cxp doc-enhancement process-automation/subsvc Pri1
|
The first point under 'Prerequisites' has links to create a system-assigned or user-assigned managed identity. The page linked to for creating a user-assigned managed identity has no instructions on creating a user-assigned managed identity.
---
#### Document Details
⚠ *Do not edit this section. It is required for learn.microsoft.com ➟ GitHub issue linking.*
* ID: 3eedb810-487f-bc9c-89f5-d5fdbcc5d796
* Version Independent ID: 329e9ec7-d9ea-518b-625b-d39880365e80
* Content: [Migrate from a Run As account to Managed identities](https://learn.microsoft.com/en-us/azure/automation/migrate-run-as-accounts-managed-identity?tabs=run-as-account)
* Content Source: [articles/automation/migrate-run-as-accounts-managed-identity.md](https://github.com/MicrosoftDocs/azure-docs/blob/main/articles/automation/migrate-run-as-accounts-managed-identity.md)
* Service: **automation**
* Sub-service: **process-automation**
* GitHub Login: @SnehaSudhirG
* Microsoft Alias: **sudhirsneha**
|
1.0
|
Link to page for creating user-assigned managed id is incorrect - The first point under 'Prerequisites' has links to create a system-assigned or user-assigned managed identity. The page linked to for creating a user-assigned managed identity has no instructions on creating a user-assigned managed identity.
---
#### Document Details
⚠ *Do not edit this section. It is required for learn.microsoft.com ➟ GitHub issue linking.*
* ID: 3eedb810-487f-bc9c-89f5-d5fdbcc5d796
* Version Independent ID: 329e9ec7-d9ea-518b-625b-d39880365e80
* Content: [Migrate from a Run As account to Managed identities](https://learn.microsoft.com/en-us/azure/automation/migrate-run-as-accounts-managed-identity?tabs=run-as-account)
* Content Source: [articles/automation/migrate-run-as-accounts-managed-identity.md](https://github.com/MicrosoftDocs/azure-docs/blob/main/articles/automation/migrate-run-as-accounts-managed-identity.md)
* Service: **automation**
* Sub-service: **process-automation**
* GitHub Login: @SnehaSudhirG
* Microsoft Alias: **sudhirsneha**
|
process
|
link to page for creating user assigned managed id is incorrect the first point under prerequisites has links to create a system assigned or user assigned managed identity the page linked to for creating a user assigned managed identity has no instructions on creating a user assigned managed identity document details ⚠ do not edit this section it is required for learn microsoft com ➟ github issue linking id version independent id content content source service automation sub service process automation github login snehasudhirg microsoft alias sudhirsneha
| 1
|
452,010
| 32,047,823,472
|
IssuesEvent
|
2023-09-23 07:24:12
|
BoBoBaSs84/EF.Core.Database.Adapter
|
https://api.github.com/repos/BoBoBaSs84/EF.Core.Database.Adapter
|
closed
|
Refactoring: preferences model xml schema definition
|
documentation enhancement refactoring
|
- make it cleaner ..
- make use of `constants`
- define a `xml` schema
|
1.0
|
Refactoring: preferences model xml schema definition - - make it cleaner ..
- make use of `constants`
- define a `xml` schema
|
non_process
|
refactoring preferences model xml schema definition make it cleaner make use of constants define a xml schema
| 0
|
6,205
| 9,107,428,823
|
IssuesEvent
|
2019-02-21 04:21:48
|
rubberduck-vba/Rubberduck
|
https://api.github.com/repos/rubberduck-vba/Rubberduck
|
closed
|
Getting a parse error with no result in search results window
|
bug parse-tree-processing
|
Version 2.2.6672.28001
OS: Microsoft Windows NT 10.0.16299.0, x64
Host Product: Microsoft Office 2013 x86
Host Version: 15.0.5045.1000
Host Executable: MSACCESS.EXE
This is persisting after fixing the error and getting a successful compile.
[RubberduckLog.txt](https://github.com/rubberduck-vba/Rubberduck/files/2506927/RubberduckLog.txt)
|
1.0
|
Getting a parse error with no result in search results window - Version 2.2.6672.28001
OS: Microsoft Windows NT 10.0.16299.0, x64
Host Product: Microsoft Office 2013 x86
Host Version: 15.0.5045.1000
Host Executable: MSACCESS.EXE
This is persisting after fixing the error and getting a successful compile.
[RubberduckLog.txt](https://github.com/rubberduck-vba/Rubberduck/files/2506927/RubberduckLog.txt)
|
process
|
getting a parse error with no result in search results window version os microsoft windows nt host product microsoft office host version host executable msaccess exe this is persisting after fixing the error and getting a successful compile
| 1
|
467,649
| 13,452,193,620
|
IssuesEvent
|
2020-09-08 21:39:29
|
unfoldingWord/tc-create-app
|
https://api.github.com/repos/unfoldingWord/tc-create-app
|
opened
|
"Sorry, no matching records found" message upon opening a TN file
|
Priority/Critical bug
|
Two Russian users are reporting that opening a TN tsv file returns a: "Sorry, no matching records found"

Attempts to fix:
Refresh
Logout/login
Changing browser to Chrome
Deleting branch.
|
1.0
|
"Sorry, no matching records found" message upon opening a TN file - Two Russian users are reporting that opening a TN tsv file returns a: "Sorry, no matching records found"

Attempts to fix:
Refresh
Logout/login
Changing browser to Chrome
Deleting branch.
|
non_process
|
sorry no matching records found message upon opening a tn file two russian users are reporting that opening a tn tsv file returns a sorry no matching records found attempts to fix refresh logout login changing browser to chrome deleting branch
| 0
|
9,719
| 12,716,634,355
|
IssuesEvent
|
2020-06-24 02:33:12
|
kubeflow/pipelines
|
https://api.github.com/repos/kubeflow/pipelines
|
closed
|
[Release] Changelog Process after 1.0
|
area/release kind/process priority/p0 status/triaged
|
## Background
The current changelog tool: https://github.com/github-changelog-generator/github-changelog-generator always queries all issues and PRs, it soon ran out of github quota even if a personal token is provided. Also it's no longer actively maintained. We need to investigate a new tool that suits our need.
## Requirements
Ideally, I think we have these requirements:
* works stably
* PR authors/maintainers can edit some metadata in PR to categorize PRs in changelog, e.g. we'd want different areas, and different types (bug, feature, misc).
* it can handle cherry picks
|
1.0
|
[Release] Changelog Process after 1.0 - ## Background
The current changelog tool: https://github.com/github-changelog-generator/github-changelog-generator always queries all issues and PRs, it soon ran out of github quota even if a personal token is provided. Also it's no longer actively maintained. We need to investigate a new tool that suits our need.
## Requirements
Ideally, I think we have these requirements:
* works stably
* PR authors/maintainers can edit some metadata in PR to categorize PRs in changelog, e.g. we'd want different areas, and different types (bug, feature, misc).
* it can handle cherry picks
|
process
|
changelog process after background the current changelog tool always queries all issues and prs it soon ran out of github quota even if a personal token is provided also it s no longer actively maintained we need to investigate a new tool that suits our need requirements ideally i think we have these requirements works stably pr authors maintainers can edit some metadata in pr to categorize prs in changelog e g we d want different areas and different types bug feature misc it can handle cherry picks
| 1
|
20,570
| 27,230,007,142
|
IssuesEvent
|
2023-02-21 12:32:56
|
corona-warn-app/cwa-wishlist
|
https://api.github.com/repos/corona-warn-app/cwa-wishlist
|
closed
|
Hospital QR-Codes for enabling in-hospital positive tested users to warn their contacts via the Corona-Warn-App
|
feature request mirrored-to-jira Test/Share process
|
<details>
<summary>Prologue - Skip if not interested</summary>
The Twitter account [@Anaesthet1](https://twitter.com/Anaesthet1) with more than 20.000 followers has tweeted about his experience with the Corona-Warn-App:
https://twitter.com/Anaesthet1/status/1430504822101454850?s=20
The Tweet received over 800 likes and over 40 comments, some from users who are also sharing their experience.
In the Tweet he1 says that he/she: "As a physician in the emergency department and ICU who has constant contact with secured infected patients, I have had a red alert exactly ONE time because ONE time a positive tested patient entered this into the app." (Translated from German using DeepL)
The official Twitter account of the Cororna-Warn-App ([@coronawarnapp](https://twitter.com/coronawarnapp)) [reacted](https://twitter.com/coronawarnapp/status/1430936110843273216?s=20) on this Tweet and making clear that the Corona-Warn-App does not warn users who are in contact with already positive tested users but warns users who had contact with a later positive tested user in the last 14 days.
I reached out to [@Anaesthet1](https://twitter.com/Anaesthet1) and while talking to him/her it [became clear](https://twitter.com/Anaesthet1/status/1430554639397789706?s=20) that he/she also has contact to not yet positive tested people who are then tested positively once in the hospital.
I then [asked](https://twitter.com/EinTim2/status/1430555840663199752?s=20) whether people who are tested in hospitals (who maybe came in with an ambulance) really get a QR-Code to receive their test result via CWA, [@Anaesthet1](https://twitter.com/Anaesthet1) [answered](https://twitter.com/Anaesthet1/status/1430556360446562310?s=20) that this depends on the condition the person is in. But he/she also said the issuing of CWA test result QR codes and lack of knowledge about the option to warn contacts with a TAN is also a problem.
This got me and that's why I open this issue.
</details>
## Feature description
Hospitals should have an easy, safe way to hand out QR-codes if a user was tested positive. In many hospitals, the current workflow is that an employee takes the swab from a person arriving in the hospital and then waiting for 15 minutes (in case of a RAT) or sending the probe to the laboratory and receiving the result. Then an employee walks to the tested person and informs them about their test result. If the result is negativ, everything is fine, go ahead. But if the result is positive it would be a really great addition if hospital staff could easily create QR-Codes for the Corona-Warn-App. These QR-Codes could be handed over with the information of the positive test result and a little flyer / information text why the person should warn their contacts via the Corona-Warn-App.
## Problem and motivation
The problem is that in the most hospitals the test procedure is not connected to the Corona-Warn-App system at all, thus CWA users who were tested positive in a hospital often have only one option to warn their contacts: Call the hotline and get a TeleTAN.
This is really not very practical and probably nothing you (or I) would do when diagnosed with COVID in a hospital. People have other things in mind then calling some hotline to get a row of letters and numbers to enter it into an app.
But if it would be possible for the user to just quickly scan a QR code in this app directly after the employee told them their positive test result **and** informing them about the possibility to warn others via the Corona-Warn-App, then more users would do it and more users would get warned.
---
Internal Tracking ID: [EXPOSUREAPP-13530](https://jira-ibs.wbs.net.sap/browse/EXPOSUREAPP-13530)
|
1.0
|
Hospital QR-Codes for enabling in-hospital positive tested users to warn their contacts via the Corona-Warn-App - <details>
<summary>Prologue - Skip if not interested</summary>
The Twitter account [@Anaesthet1](https://twitter.com/Anaesthet1) with more than 20.000 followers has tweeted about his experience with the Corona-Warn-App:
https://twitter.com/Anaesthet1/status/1430504822101454850?s=20
The Tweet received over 800 likes and over 40 comments, some from users who are also sharing their experience.
In the Tweet he1 says that he/she: "As a physician in the emergency department and ICU who has constant contact with secured infected patients, I have had a red alert exactly ONE time because ONE time a positive tested patient entered this into the app." (Translated from German using DeepL)
The official Twitter account of the Cororna-Warn-App ([@coronawarnapp](https://twitter.com/coronawarnapp)) [reacted](https://twitter.com/coronawarnapp/status/1430936110843273216?s=20) on this Tweet and making clear that the Corona-Warn-App does not warn users who are in contact with already positive tested users but warns users who had contact with a later positive tested user in the last 14 days.
I reached out to [@Anaesthet1](https://twitter.com/Anaesthet1) and while talking to him/her it [became clear](https://twitter.com/Anaesthet1/status/1430554639397789706?s=20) that he/she also has contact to not yet positive tested people who are then tested positively once in the hospital.
I then [asked](https://twitter.com/EinTim2/status/1430555840663199752?s=20) whether people who are tested in hospitals (who maybe came in with an ambulance) really get a QR-Code to receive their test result via CWA, [@Anaesthet1](https://twitter.com/Anaesthet1) [answered](https://twitter.com/Anaesthet1/status/1430556360446562310?s=20) that this depends on the condition the person is in. But he/she also said the issuing of CWA test result QR codes and lack of knowledge about the option to warn contacts with a TAN is also a problem.
This got me and that's why I open this issue.
</details>
## Feature description
Hospitals should have an easy, safe way to hand out QR-codes if a user was tested positive. In many hospitals, the current workflow is that an employee takes the swab from a person arriving in the hospital and then waiting for 15 minutes (in case of a RAT) or sending the probe to the laboratory and receiving the result. Then an employee walks to the tested person and informs them about their test result. If the result is negativ, everything is fine, go ahead. But if the result is positive it would be a really great addition if hospital staff could easily create QR-Codes for the Corona-Warn-App. These QR-Codes could be handed over with the information of the positive test result and a little flyer / information text why the person should warn their contacts via the Corona-Warn-App.
## Problem and motivation
The problem is that in the most hospitals the test procedure is not connected to the Corona-Warn-App system at all, thus CWA users who were tested positive in a hospital often have only one option to warn their contacts: Call the hotline and get a TeleTAN.
This is really not very practical and probably nothing you (or I) would do when diagnosed with COVID in a hospital. People have other things in mind then calling some hotline to get a row of letters and numbers to enter it into an app.
But if it would be possible for the user to just quickly scan a QR code in this app directly after the employee told them their positive test result **and** informing them about the possibility to warn others via the Corona-Warn-App, then more users would do it and more users would get warned.
---
Internal Tracking ID: [EXPOSUREAPP-13530](https://jira-ibs.wbs.net.sap/browse/EXPOSUREAPP-13530)
|
process
|
hospital qr codes for enabling in hospital positive tested users to warn their contacts via the corona warn app prologue skip if not interested the twitter account with more than followers has tweeted about his experience with the corona warn app the tweet received over likes and over comments some from users who are also sharing their experience in the tweet says that he she as a physician in the emergency department and icu who has constant contact with secured infected patients i have had a red alert exactly one time because one time a positive tested patient entered this into the app translated from german using deepl the official twitter account of the cororna warn app on this tweet and making clear that the corona warn app does not warn users who are in contact with already positive tested users but warns users who had contact with a later positive tested user in the last days i reached out to and while talking to him her it that he she also has contact to not yet positive tested people who are then tested positively once in the hospital i then whether people who are tested in hospitals who maybe came in with an ambulance really get a qr code to receive their test result via cwa that this depends on the condition the person is in but he she also said the issuing of cwa test result qr codes and lack of knowledge about the option to warn contacts with a tan is also a problem this got me and that s why i open this issue feature description hospitals should have an easy safe way to hand out qr codes if a user was tested positive in many hospitals the current workflow is that an employee takes the swab from a person arriving in the hospital and then waiting for minutes in case of a rat or sending the probe to the laboratory and receiving the result then an employee walks to the tested person and informs them about their test result if the result is negativ everything is fine go ahead but if the result is positive it would be a really great addition if hospital staff could easily create qr codes for the corona warn app these qr codes could be handed over with the information of the positive test result and a little flyer information text why the person should warn their contacts via the corona warn app problem and motivation the problem is that in the most hospitals the test procedure is not connected to the corona warn app system at all thus cwa users who were tested positive in a hospital often have only one option to warn their contacts call the hotline and get a teletan this is really not very practical and probably nothing you or i would do when diagnosed with covid in a hospital people have other things in mind then calling some hotline to get a row of letters and numbers to enter it into an app but if it would be possible for the user to just quickly scan a qr code in this app directly after the employee told them their positive test result and informing them about the possibility to warn others via the corona warn app then more users would do it and more users would get warned internal tracking id
| 1
|
17,994
| 24,011,940,448
|
IssuesEvent
|
2022-09-14 19:40:48
|
hashgraph/hedera-json-rpc-relay
|
https://api.github.com/repos/hashgraph/hedera-json-rpc-relay
|
closed
|
Split Acceptance test actions jobs
|
enhancement P2 process
|
### Problem
Acceptance tests now include multiple suites.
This increases run time but also means intermittent failures require rerun of unaffected test cases
### Solution
Split Acceptance test actions into separate jobs
### Alternatives
_No response_
|
1.0
|
Split Acceptance test actions jobs - ### Problem
Acceptance tests now include multiple suites.
This increases run time but also means intermittent failures require rerun of unaffected test cases
### Solution
Split Acceptance test actions into separate jobs
### Alternatives
_No response_
|
process
|
split acceptance test actions jobs problem acceptance tests now include multiple suites this increases run time but also means intermittent failures require rerun of unaffected test cases solution split acceptance test actions into separate jobs alternatives no response
| 1
|
122,200
| 17,695,345,493
|
IssuesEvent
|
2021-08-24 14:44:07
|
CDCgov/prime-reportstream
|
https://api.github.com/repos/CDCgov/prime-reportstream
|
opened
|
Clean up old VNETs
|
DevOps Blocked security
|
## Problem statement
The Palo Alto VNETs are up and running. Now let's clean up the old VNETs.
## What you need to know
[Links to documents, workflows, or list of key people and contact info]
- [Document 1](https://...)
- [Document 2](https://...)
- ...
## Acceptance criteria
- [ ] ...
## To do
- [ ] ...
|
True
|
Clean up old VNETs - ## Problem statement
The Palo Alto VNETs are up and running. Now let's clean up the old VNETs.
## What you need to know
[Links to documents, workflows, or list of key people and contact info]
- [Document 1](https://...)
- [Document 2](https://...)
- ...
## Acceptance criteria
- [ ] ...
## To do
- [ ] ...
|
non_process
|
clean up old vnets problem statement the palo alto vnets are up and running now let s clean up the old vnets what you need to know acceptance criteria to do
| 0
|
347,430
| 10,430,074,495
|
IssuesEvent
|
2019-09-17 05:32:07
|
StrangeLoopGames/EcoIssues
|
https://api.github.com/repos/StrangeLoopGames/EcoIssues
|
closed
|
[master-preview] Host a world (and load loacal world) doesn't work
|
Fixed High Priority QA Regression
|
Steps:
1. Launch client (no server is working)
2. Press "Host a Wprld"
3. Put name and press start
Expected:
Local world is created and you are in the game
Actual:
World is created but you can't connect.
[log.txt](https://github.com/StrangeLoopGames/EcoIssues/files/3514406/log.txt)
|
1.0
|
[master-preview] Host a world (and load loacal world) doesn't work - Steps:
1. Launch client (no server is working)
2. Press "Host a Wprld"
3. Put name and press start
Expected:
Local world is created and you are in the game
Actual:
World is created but you can't connect.
[log.txt](https://github.com/StrangeLoopGames/EcoIssues/files/3514406/log.txt)
|
non_process
|
host a world and load loacal world doesn t work steps launch client no server is working press host a wprld put name and press start expected local world is created and you are in the game actual world is created but you can t connect
| 0
|
19,976
| 26,457,318,823
|
IssuesEvent
|
2023-01-16 15:09:46
|
pystatgen/sgkit
|
https://api.github.com/repos/pystatgen/sgkit
|
closed
|
Disable numba caching via environment variable
|
process + tools
|
Edit: related to #371
I've recently started experimenting with sgkit on a [SLURM cluster](https://jobqueue.dask.org/en/latest/generated/dask_jobqueue.SLURMCluster.html) which is working well with the exception of methods using `guvectorize` with `cache=True`. Calling these functions results in a segmentation fault on the worker. This only seems to be an issue with `guvectorize` (not the `jit` or `vectorize` decorators) and there is no segmentation fault if I set `cache=False`.
There are a couple of open issues that may be related although neither quite match what I'm seeing (need to dig some more):
- https://github.com/dask/distributed/issues/3450
- https://github.com/numba/numba/issues/4807
There is also an open issue for globally disabling numba caching which would provide a workaround although it might be stale:
- https://github.com/numba/numba/issues/4549
In the meantime, for the sake of debugging and workarounds, it'd be useful to be able to disable numba-caching in sgkit using an environment variable.
|
1.0
|
Disable numba caching via environment variable - Edit: related to #371
I've recently started experimenting with sgkit on a [SLURM cluster](https://jobqueue.dask.org/en/latest/generated/dask_jobqueue.SLURMCluster.html) which is working well with the exception of methods using `guvectorize` with `cache=True`. Calling these functions results in a segmentation fault on the worker. This only seems to be an issue with `guvectorize` (not the `jit` or `vectorize` decorators) and there is no segmentation fault if I set `cache=False`.
There are a couple of open issues that may be related although neither quite match what I'm seeing (need to dig some more):
- https://github.com/dask/distributed/issues/3450
- https://github.com/numba/numba/issues/4807
There is also an open issue for globally disabling numba caching which would provide a workaround although it might be stale:
- https://github.com/numba/numba/issues/4549
In the meantime, for the sake of debugging and workarounds, it'd be useful to be able to disable numba-caching in sgkit using an environment variable.
|
process
|
disable numba caching via environment variable edit related to i ve recently started experimenting with sgkit on a which is working well with the exception of methods using guvectorize with cache true calling these functions results in a segmentation fault on the worker this only seems to be an issue with guvectorize not the jit or vectorize decorators and there is no segmentation fault if i set cache false there are a couple of open issues that may be related although neither quite match what i m seeing need to dig some more there is also an open issue for globally disabling numba caching which would provide a workaround although it might be stale in the meantime for the sake of debugging and workarounds it d be useful to be able to disable numba caching in sgkit using an environment variable
| 1
|
624,014
| 19,684,692,635
|
IssuesEvent
|
2022-01-11 20:38:36
|
loveology/design
|
https://api.github.com/repos/loveology/design
|
closed
|
Dashboard comps/Wireframes for businesses (Movement, K-Love)
|
enhancement Medium Priority
|
The content will be the same, but we need a front facing portal for those businesses.
|
1.0
|
Dashboard comps/Wireframes for businesses (Movement, K-Love) - The content will be the same, but we need a front facing portal for those businesses.
|
non_process
|
dashboard comps wireframes for businesses movement k love the content will be the same but we need a front facing portal for those businesses
| 0
|
94,951
| 8,527,197,272
|
IssuesEvent
|
2018-11-02 18:41:10
|
rancher/rancher
|
https://api.github.com/repos/rancher/rancher
|
closed
|
Backport: Pipeline crash causing "boot loop" on rancher/rancher:latest when pushing to ECR
|
area/pipeline kind/bug status/resolved status/to-test version/2.0
|
Backport for https://github.com/rancher/rancher/issues/16187
|
1.0
|
Backport: Pipeline crash causing "boot loop" on rancher/rancher:latest when pushing to ECR - Backport for https://github.com/rancher/rancher/issues/16187
|
non_process
|
backport pipeline crash causing boot loop on rancher rancher latest when pushing to ecr backport for
| 0
|
189,466
| 6,797,861,978
|
IssuesEvent
|
2017-11-02 01:30:54
|
segmentio/evergreen
|
https://api.github.com/repos/segmentio/evergreen
|
closed
|
evergreen-tooltip outline
|
Priority: Medium Status: Proposal Type: New Package
|
`evergreen-tooltip` is a package exporting a Tooltip React component. Tooltips display floating content in relation to a target. Tooltip appear either at the top, bottom, left or right of their target. The preferred and default side is the bottom. Maybe Tooltips use smart positioning if there is not enough space on the preferred side.
Tooltips use a similar implementation to Popovers, maybe there will be a shared package that both of them implement.
When creating a popover, you must specify both:
* its content, by setting the content prop, and
* its target, as a single child element or a function
Tooltips implement `onMouseLeave` and `onMouseEnter`.
Tooltips should be dark by default and implement something like `colors.neutral['900']` with opacity.
|
1.0
|
evergreen-tooltip outline - `evergreen-tooltip` is a package exporting a Tooltip React component. Tooltips display floating content in relation to a target. Tooltip appear either at the top, bottom, left or right of their target. The preferred and default side is the bottom. Maybe Tooltips use smart positioning if there is not enough space on the preferred side.
Tooltips use a similar implementation to Popovers, maybe there will be a shared package that both of them implement.
When creating a popover, you must specify both:
* its content, by setting the content prop, and
* its target, as a single child element or a function
Tooltips implement `onMouseLeave` and `onMouseEnter`.
Tooltips should be dark by default and implement something like `colors.neutral['900']` with opacity.
|
non_process
|
evergreen tooltip outline evergreen tooltip is a package exporting a tooltip react component tooltips display floating content in relation to a target tooltip appear either at the top bottom left or right of their target the preferred and default side is the bottom maybe tooltips use smart positioning if there is not enough space on the preferred side tooltips use a similar implementation to popovers maybe there will be a shared package that both of them implement when creating a popover you must specify both its content by setting the content prop and its target as a single child element or a function tooltips implement onmouseleave and onmouseenter tooltips should be dark by default and implement something like colors neutral with opacity
| 0
|
451,166
| 13,025,801,666
|
IssuesEvent
|
2020-07-27 14:05:12
|
jenkins-x/jx
|
https://api.github.com/repos/jenkins-x/jx
|
closed
|
jx fails to import a project because of a 63 character limitation of metadata.labels.
|
area/activity kind/bug lifecycle/rotten priority/important-soon
|
### Summary
While importing a *Gitlab* project, `jx import` will fail to import the project with this error:
```
error: failed to start pipeline build: unable to apply Tekton CRDs: failed to apply Tekton CRDs: PipelineActivity.jenkins.io "jenkins-x-test-import-jx-import-4-master-1" is invalid: metadata.labels: Invalid value: "git-gitlab-<companyname>-com-jenkins-x-test-import-jx-import-4-git": must be no more than 63 characters
```
This looks like its related to this issue : https://github.com/jenkins-x/jx/issues/4370
Is there a way to truncate this label ? This prevents us from importing our projects.
### Steps to reproduce the behavior
Run `jx import` on a project that hits the 63 character limit. The project does not have to be especially long (yes, this is relative, but its going to depend on how long the base + group path is) , especially in cases where there are nested groups.
### Expected behavior
`jx import` doesn't fail to import projects because of their path length.
### Actual behavior
`jx import` fails because of the path length
### Jx version
The output of `jx version` is:
```
NAME VERSION
jx 2.0.1200
Kubernetes cluster v1.14.9-eks-c0eccc
kubectl v1.15.10
helm client Client: v2.14.3+g0e7f3b6
git 2.20.1 (Apple Git-117)
Operating System Mac OS X 10.14.5 build 18F203
verifying packages
```
### Jenkins type
<!--
Select which installation type are you using.
-->
- [X ] Serverless Jenkins X Pipelines (Tekton + Prow)
- [ ] Classic Jenkins
### Kubernetes cluster
<!--
TF deployed EKS cluster
-->
### Operating system / Environment
<!--
OSX
-->
|
1.0
|
jx fails to import a project because of a 63 character limitation of metadata.labels. - ### Summary
While importing a *Gitlab* project, `jx import` will fail to import the project with this error:
```
error: failed to start pipeline build: unable to apply Tekton CRDs: failed to apply Tekton CRDs: PipelineActivity.jenkins.io "jenkins-x-test-import-jx-import-4-master-1" is invalid: metadata.labels: Invalid value: "git-gitlab-<companyname>-com-jenkins-x-test-import-jx-import-4-git": must be no more than 63 characters
```
This looks like its related to this issue : https://github.com/jenkins-x/jx/issues/4370
Is there a way to truncate this label ? This prevents us from importing our projects.
### Steps to reproduce the behavior
Run `jx import` on a project that hits the 63 character limit. The project does not have to be especially long (yes, this is relative, but its going to depend on how long the base + group path is) , especially in cases where there are nested groups.
### Expected behavior
`jx import` doesn't fail to import projects because of their path length.
### Actual behavior
`jx import` fails because of the path length
### Jx version
The output of `jx version` is:
```
NAME VERSION
jx 2.0.1200
Kubernetes cluster v1.14.9-eks-c0eccc
kubectl v1.15.10
helm client Client: v2.14.3+g0e7f3b6
git 2.20.1 (Apple Git-117)
Operating System Mac OS X 10.14.5 build 18F203
verifying packages
```
### Jenkins type
<!--
Select which installation type are you using.
-->
- [X ] Serverless Jenkins X Pipelines (Tekton + Prow)
- [ ] Classic Jenkins
### Kubernetes cluster
<!--
TF deployed EKS cluster
-->
### Operating system / Environment
<!--
OSX
-->
|
non_process
|
jx fails to import a project because of a character limitation of metadata labels summary while importing a gitlab project jx import will fail to import the project with this error error failed to start pipeline build unable to apply tekton crds failed to apply tekton crds pipelineactivity jenkins io jenkins x test import jx import master is invalid metadata labels invalid value git gitlab com jenkins x test import jx import git must be no more than characters this looks like its related to this issue is there a way to truncate this label this prevents us from importing our projects steps to reproduce the behavior run jx import on a project that hits the character limit the project does not have to be especially long yes this is relative but its going to depend on how long the base group path is especially in cases where there are nested groups expected behavior jx import doesn t fail to import projects because of their path length actual behavior jx import fails because of the path length jx version the output of jx version is name version jx kubernetes cluster eks kubectl helm client client git apple git operating system mac os x build verifying packages jenkins type select which installation type are you using serverless jenkins x pipelines tekton prow classic jenkins kubernetes cluster tf deployed eks cluster operating system environment osx
| 0
|
165,402
| 6,275,945,389
|
IssuesEvent
|
2017-07-18 08:22:53
|
BinPar/PRM
|
https://api.github.com/repos/BinPar/PRM
|
opened
|
PRM UNI : LOS PROMOTORES SOLICITAN QUE EL COLOR DE LA FUENTE TENGA MÁS INTENSIDAD
|
Priority: Medium
|
Los promotores (y me incluyo) solicitan las siguientes mejoras en la visualización del PRM:
- dar más intensidad o color a las fuentes. Son excesivamente ténues y produce molestias en los ojos. Han probado ajustando el brillo y contraste de la pantalla, pero sin resultado positivo
- mostrar la LUPA también en el lateral derecho de la pantalla de CONTACTOS. Dada la dificultad de desplazamiento horizontal en las pantallas, es necesario que la lupa aparezca no solo en el lateral derecho, sino también en el izquierdo para agilizar el acceso al contacto deseado. Esto mismo se aplicaría a la pantalla de contactos de VD

En la imagen se ve que la barra de desplazamiento horizontal no aparece y la lupa, a la que se accede desplazando la barra, tampoco.
La opción para visualizarlo todo pasa por reducir el tamaño lo que dificulta enormemente la lectura.
@CristianBinpar @minigoBinpar
|
1.0
|
PRM UNI : LOS PROMOTORES SOLICITAN QUE EL COLOR DE LA FUENTE TENGA MÁS INTENSIDAD - Los promotores (y me incluyo) solicitan las siguientes mejoras en la visualización del PRM:
- dar más intensidad o color a las fuentes. Son excesivamente ténues y produce molestias en los ojos. Han probado ajustando el brillo y contraste de la pantalla, pero sin resultado positivo
- mostrar la LUPA también en el lateral derecho de la pantalla de CONTACTOS. Dada la dificultad de desplazamiento horizontal en las pantallas, es necesario que la lupa aparezca no solo en el lateral derecho, sino también en el izquierdo para agilizar el acceso al contacto deseado. Esto mismo se aplicaría a la pantalla de contactos de VD

En la imagen se ve que la barra de desplazamiento horizontal no aparece y la lupa, a la que se accede desplazando la barra, tampoco.
La opción para visualizarlo todo pasa por reducir el tamaño lo que dificulta enormemente la lectura.
@CristianBinpar @minigoBinpar
|
non_process
|
prm uni los promotores solicitan que el color de la fuente tenga más intensidad los promotores y me incluyo solicitan las siguientes mejoras en la visualización del prm dar más intensidad o color a las fuentes son excesivamente ténues y produce molestias en los ojos han probado ajustando el brillo y contraste de la pantalla pero sin resultado positivo mostrar la lupa también en el lateral derecho de la pantalla de contactos dada la dificultad de desplazamiento horizontal en las pantallas es necesario que la lupa aparezca no solo en el lateral derecho sino también en el izquierdo para agilizar el acceso al contacto deseado esto mismo se aplicaría a la pantalla de contactos de vd en la imagen se ve que la barra de desplazamiento horizontal no aparece y la lupa a la que se accede desplazando la barra tampoco la opción para visualizarlo todo pasa por reducir el tamaño lo que dificulta enormemente la lectura cristianbinpar minigobinpar
| 0
|
12,732
| 15,100,695,762
|
IssuesEvent
|
2021-02-08 06:03:39
|
dotnet/runtime
|
https://api.github.com/repos/dotnet/runtime
|
reopened
|
Test Failure: System.Diagnostics.Tests.ProcessTests/LongProcessNamesAreSupported
|
area-System.Diagnostics.Process test-run-core
|
Test **System.Diagnostics.Tests.ProcessTests/LongProcessNamesAreSupported** has failed.
Message :
```
Assert.Contains() Failure
Not found: (filter expression)
In value: Process[] [System.Diagnostics.Process (sh), System.Diagnostics.Process (sh), System.Diagnostics.Process (bash), System.Diagnostics.Process (dotnet)]
```
Stack Trace :
```
at System.Diagnostics.Tests.ProcessTests.LongProcessNamesAreSupported() in /__w/1/s/src/System.Diagnostics.Process/tests/ProcessTests.cs:line 1883
```
Details:
https://mc.dot.net/#/product/netcore/30/source/official~2Fdotnet~2Fcorefx~2Frefs~2Fheads~2Fmaster/type/test~2Ffunctional~2Fcli~2F/build/20190425.7/workItem/System.Diagnostics.Process.Tests/analysis/xunit/System.Diagnostics.Tests.ProcessTests~2FLongProcessNamesAreSupported
|
1.0
|
Test Failure: System.Diagnostics.Tests.ProcessTests/LongProcessNamesAreSupported - Test **System.Diagnostics.Tests.ProcessTests/LongProcessNamesAreSupported** has failed.
Message :
```
Assert.Contains() Failure
Not found: (filter expression)
In value: Process[] [System.Diagnostics.Process (sh), System.Diagnostics.Process (sh), System.Diagnostics.Process (bash), System.Diagnostics.Process (dotnet)]
```
Stack Trace :
```
at System.Diagnostics.Tests.ProcessTests.LongProcessNamesAreSupported() in /__w/1/s/src/System.Diagnostics.Process/tests/ProcessTests.cs:line 1883
```
Details:
https://mc.dot.net/#/product/netcore/30/source/official~2Fdotnet~2Fcorefx~2Frefs~2Fheads~2Fmaster/type/test~2Ffunctional~2Fcli~2F/build/20190425.7/workItem/System.Diagnostics.Process.Tests/analysis/xunit/System.Diagnostics.Tests.ProcessTests~2FLongProcessNamesAreSupported
|
process
|
test failure system diagnostics tests processtests longprocessnamesaresupported test system diagnostics tests processtests longprocessnamesaresupported has failed message assert contains failure not found filter expression in value process stack trace at system diagnostics tests processtests longprocessnamesaresupported in w s src system diagnostics process tests processtests cs line details
| 1
|
59,557
| 7,260,872,725
|
IssuesEvent
|
2018-02-18 15:07:40
|
taniman/profit-trailer
|
https://api.github.com/repos/taniman/profit-trailer
|
closed
|
Make PT run as a service! Include console in web gui too!
|
working as designed
|
I'm so sick of having to re-run the program every time my server restarts for updates. Please make this run as a service as it should have been in the first place, and output all console to a rotated log file and also into a page on the webgui. Group the feeder addon console output into the same place as well.
I've successfully ran pt as a service using: https://github.com/kohsuke/winsw/releases
Make it "part of the package" perhaps?
|
1.0
|
Make PT run as a service! Include console in web gui too! - I'm so sick of having to re-run the program every time my server restarts for updates. Please make this run as a service as it should have been in the first place, and output all console to a rotated log file and also into a page on the webgui. Group the feeder addon console output into the same place as well.
I've successfully ran pt as a service using: https://github.com/kohsuke/winsw/releases
Make it "part of the package" perhaps?
|
non_process
|
make pt run as a service include console in web gui too i m so sick of having to re run the program every time my server restarts for updates please make this run as a service as it should have been in the first place and output all console to a rotated log file and also into a page on the webgui group the feeder addon console output into the same place as well i ve successfully ran pt as a service using make it part of the package perhaps
| 0
|
160,641
| 6,101,277,639
|
IssuesEvent
|
2017-06-20 14:20:05
|
kuzzleio/kuzzle
|
https://api.github.com/repos/kuzzleio/kuzzle
|
closed
|
Login and Logout should not be GET methods
|
bug priority-blocking
|
Currently one of login's route and the logout route are GET methods in HTTP:
```
{verb: 'get', url: '/_login/:strategy', controller: 'auth', action: 'login'},
{verb: 'get', url: '/_logout', controller: 'auth', action: 'logout'},
```
Both should (only) be POST methods.
By the way the login routes where strategy is provided in the url don't work at the moment.
After discussion, only the POST route without strategy in the url should be kept, but the body structure should be adapted to something like:
```json
{
"local": {
"username": "MyUser",
"password": "MyPassword"
},
"anotherStrategy": {...}
}
|
1.0
|
Login and Logout should not be GET methods - Currently one of login's route and the logout route are GET methods in HTTP:
```
{verb: 'get', url: '/_login/:strategy', controller: 'auth', action: 'login'},
{verb: 'get', url: '/_logout', controller: 'auth', action: 'logout'},
```
Both should (only) be POST methods.
By the way the login routes where strategy is provided in the url don't work at the moment.
After discussion, only the POST route without strategy in the url should be kept, but the body structure should be adapted to something like:
```json
{
"local": {
"username": "MyUser",
"password": "MyPassword"
},
"anotherStrategy": {...}
}
|
non_process
|
login and logout should not be get methods currently one of login s route and the logout route are get methods in http verb get url login strategy controller auth action login verb get url logout controller auth action logout both should only be post methods by the way the login routes where strategy is provided in the url don t work at the moment after discussion only the post route without strategy in the url should be kept but the body structure should be adapted to something like json local username myuser password mypassword anotherstrategy
| 0
|
343,316
| 10,328,030,528
|
IssuesEvent
|
2019-09-02 08:34:18
|
pmem/issues
|
https://api.github.com/repos/pmem/issues
|
opened
|
Test: ex_libpmemobj/TEST20: SETUP (check/pmem/debug/memcheck) fails
|
Exposure: Low OS: Linux Priority: 3 medium Type: Bug
|
<!--
Before creating new issue, ensure that similar issue wasn't already created
* Search: https://github.com/pmem/issues/issues
Note that if you do not provide enough information to reproduce the issue, we may not be able to take action on your report.
Remember this is just a minimal template. You can extend it with data you think may be useful.
-->
# ISSUE: <!-- fill the title of issue -->
## Environment Information
- PMDK package version(s): 1.6-352-g7e4312dd5
- OS(es) version(s): Fedora30
- ndctl version(s): 65
- kernel version(s): 5.1.17-300.fc30.x86_64
## Please provide a reproduction of the bug:
```
$ ./RUNTESTS ex_libpmemobj -s TEST20 -m force-enable
```
## How often bug is revealed: (always, often, rare): always
## Actual behavior:
```
$ ./RUNTESTS ex_libpmemobj -s TEST20 -m force-enable
ex_libpmemobj/TEST20: SETUP (check/pmem/debug/memcheck)
ex_libpmemobj/TEST20 failed with Valgrind. See memcheck20.log. Last 20 lines below.
ex_libpmemobj/TEST20 memcheck20.log ==85992== by 0x4889B99: pmemobj_persist (obj.c:2711)
ex_libpmemobj/TEST20 memcheck20.log ==85992== by 0x1099C5: realloc_int (array.c:244)
ex_libpmemobj/TEST20 memcheck20.log ==85992== by 0x10A33D: do_realloc (array.c:447)
ex_libpmemobj/TEST20 memcheck20.log ==85992== by 0x10A6AE: main (array.c:523)
ex_libpmemobj/TEST20 memcheck20.log ==85992== Address 0x59c15c0 is in a rw- mapped file /mnt/mem/test_ex_libpmemobj20⠏⠍⠙⠅ɗPMDKӜ⥺/testfile segment
ex_libpmemobj/TEST20 memcheck20.log ==85992==
ex_libpmemobj/TEST20 memcheck20.log ==85992==
ex_libpmemobj/TEST20 memcheck20.log ==85992== HEAP SUMMARY:
ex_libpmemobj/TEST20 memcheck20.log ==85992== in use at exit: 552 bytes in 1 blocks
ex_libpmemobj/TEST20 memcheck20.log ==85992== total heap usage: 37,339 allocs, 37,303 frees, 60,411,462 bytes allocated
ex_libpmemobj/TEST20 memcheck20.log ==85992==
ex_libpmemobj/TEST20 memcheck20.log ==85992== LEAK SUMMARY:
ex_libpmemobj/TEST20 memcheck20.log ==85992== definitely lost: 0 bytes in 0 blocks
ex_libpmemobj/TEST20 memcheck20.log ==85992== indirectly lost: 0 bytes in 0 blocks
ex_libpmemobj/TEST20 memcheck20.log ==85992== possibly lost: 0 bytes in 0 blocks
ex_libpmemobj/TEST20 memcheck20.log ==85992== still reachable: 0 bytes in 0 blocks
ex_libpmemobj/TEST20 memcheck20.log ==85992== suppressed: 552 bytes in 1 blocks
ex_libpmemobj/TEST20 memcheck20.log ==85992==
ex_libpmemobj/TEST20 memcheck20.log ==85992== For lists of detected and suppressed errors, rerun with: -s
ex_libpmemobj/TEST20 memcheck20.log ==85992== ERROR SUMMARY: 1 errors from 1 contexts (suppressed: 0 from 0)
RUNTESTS: stopping: ex_libpmemobj/TEST20 failed, TEST=check FS=pmem BUILD=debug
```
## Expected behavior:
Test should pass.
## Details
Logs:
[memcheck20.log](https://github.com/pmem/issues/files/3565122/memcheck20.log)
[out20.log](https://github.com/pmem/issues/files/3565123/out20.log)
[pmem20.log](https://github.com/pmem/issues/files/3565124/pmem20.log)
[pmemobj20.log](https://github.com/pmem/issues/files/3565125/pmemobj20.log)
## Additional information about Priority and Help Requested:
Are you willing to submit a pull request with a proposed change? (Yes, No) <!-- check one if possible -->
Requested priority: (Showstopper, High, Medium, Low) <!-- check one if possible -->
|
1.0
|
Test: ex_libpmemobj/TEST20: SETUP (check/pmem/debug/memcheck) fails - <!--
Before creating new issue, ensure that similar issue wasn't already created
* Search: https://github.com/pmem/issues/issues
Note that if you do not provide enough information to reproduce the issue, we may not be able to take action on your report.
Remember this is just a minimal template. You can extend it with data you think may be useful.
-->
# ISSUE: <!-- fill the title of issue -->
## Environment Information
- PMDK package version(s): 1.6-352-g7e4312dd5
- OS(es) version(s): Fedora30
- ndctl version(s): 65
- kernel version(s): 5.1.17-300.fc30.x86_64
## Please provide a reproduction of the bug:
```
$ ./RUNTESTS ex_libpmemobj -s TEST20 -m force-enable
```
## How often bug is revealed: (always, often, rare): always
## Actual behavior:
```
$ ./RUNTESTS ex_libpmemobj -s TEST20 -m force-enable
ex_libpmemobj/TEST20: SETUP (check/pmem/debug/memcheck)
ex_libpmemobj/TEST20 failed with Valgrind. See memcheck20.log. Last 20 lines below.
ex_libpmemobj/TEST20 memcheck20.log ==85992== by 0x4889B99: pmemobj_persist (obj.c:2711)
ex_libpmemobj/TEST20 memcheck20.log ==85992== by 0x1099C5: realloc_int (array.c:244)
ex_libpmemobj/TEST20 memcheck20.log ==85992== by 0x10A33D: do_realloc (array.c:447)
ex_libpmemobj/TEST20 memcheck20.log ==85992== by 0x10A6AE: main (array.c:523)
ex_libpmemobj/TEST20 memcheck20.log ==85992== Address 0x59c15c0 is in a rw- mapped file /mnt/mem/test_ex_libpmemobj20⠏⠍⠙⠅ɗPMDKӜ⥺/testfile segment
ex_libpmemobj/TEST20 memcheck20.log ==85992==
ex_libpmemobj/TEST20 memcheck20.log ==85992==
ex_libpmemobj/TEST20 memcheck20.log ==85992== HEAP SUMMARY:
ex_libpmemobj/TEST20 memcheck20.log ==85992== in use at exit: 552 bytes in 1 blocks
ex_libpmemobj/TEST20 memcheck20.log ==85992== total heap usage: 37,339 allocs, 37,303 frees, 60,411,462 bytes allocated
ex_libpmemobj/TEST20 memcheck20.log ==85992==
ex_libpmemobj/TEST20 memcheck20.log ==85992== LEAK SUMMARY:
ex_libpmemobj/TEST20 memcheck20.log ==85992== definitely lost: 0 bytes in 0 blocks
ex_libpmemobj/TEST20 memcheck20.log ==85992== indirectly lost: 0 bytes in 0 blocks
ex_libpmemobj/TEST20 memcheck20.log ==85992== possibly lost: 0 bytes in 0 blocks
ex_libpmemobj/TEST20 memcheck20.log ==85992== still reachable: 0 bytes in 0 blocks
ex_libpmemobj/TEST20 memcheck20.log ==85992== suppressed: 552 bytes in 1 blocks
ex_libpmemobj/TEST20 memcheck20.log ==85992==
ex_libpmemobj/TEST20 memcheck20.log ==85992== For lists of detected and suppressed errors, rerun with: -s
ex_libpmemobj/TEST20 memcheck20.log ==85992== ERROR SUMMARY: 1 errors from 1 contexts (suppressed: 0 from 0)
RUNTESTS: stopping: ex_libpmemobj/TEST20 failed, TEST=check FS=pmem BUILD=debug
```
## Expected behavior:
Test should pass.
## Details
Logs:
[memcheck20.log](https://github.com/pmem/issues/files/3565122/memcheck20.log)
[out20.log](https://github.com/pmem/issues/files/3565123/out20.log)
[pmem20.log](https://github.com/pmem/issues/files/3565124/pmem20.log)
[pmemobj20.log](https://github.com/pmem/issues/files/3565125/pmemobj20.log)
## Additional information about Priority and Help Requested:
Are you willing to submit a pull request with a proposed change? (Yes, No) <!-- check one if possible -->
Requested priority: (Showstopper, High, Medium, Low) <!-- check one if possible -->
|
non_process
|
test ex libpmemobj setup check pmem debug memcheck fails before creating new issue ensure that similar issue wasn t already created search note that if you do not provide enough information to reproduce the issue we may not be able to take action on your report remember this is just a minimal template you can extend it with data you think may be useful issue environment information pmdk package version s os es version s ndctl version s kernel version s please provide a reproduction of the bug runtests ex libpmemobj s m force enable how often bug is revealed always often rare always actual behavior runtests ex libpmemobj s m force enable ex libpmemobj setup check pmem debug memcheck ex libpmemobj failed with valgrind see log last lines below ex libpmemobj log by pmemobj persist obj c ex libpmemobj log by realloc int array c ex libpmemobj log by do realloc array c ex libpmemobj log by main array c ex libpmemobj log address is in a rw mapped file mnt mem test ex ⠏⠍⠙⠅ɗpmdkӝ⥺ testfile segment ex libpmemobj log ex libpmemobj log ex libpmemobj log heap summary ex libpmemobj log in use at exit bytes in blocks ex libpmemobj log total heap usage allocs frees bytes allocated ex libpmemobj log ex libpmemobj log leak summary ex libpmemobj log definitely lost bytes in blocks ex libpmemobj log indirectly lost bytes in blocks ex libpmemobj log possibly lost bytes in blocks ex libpmemobj log still reachable bytes in blocks ex libpmemobj log suppressed bytes in blocks ex libpmemobj log ex libpmemobj log for lists of detected and suppressed errors rerun with s ex libpmemobj log error summary errors from contexts suppressed from runtests stopping ex libpmemobj failed test check fs pmem build debug expected behavior test should pass details logs additional information about priority and help requested are you willing to submit a pull request with a proposed change yes no requested priority showstopper high medium low
| 0
|
138,517
| 18,793,945,852
|
IssuesEvent
|
2021-11-08 19:54:36
|
Dima2022/hygieia-codequality-sonar-collector
|
https://api.github.com/repos/Dima2022/hygieia-codequality-sonar-collector
|
opened
|
CVE-2018-1000180 (High) detected in bcprov-jdk15on-1.55.jar
|
security vulnerability
|
## CVE-2018-1000180 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>bcprov-jdk15on-1.55.jar</b></p></summary>
<p>The Bouncy Castle Crypto package is a Java implementation of cryptographic algorithms. This jar contains JCE provider and lightweight API for the Bouncy Castle Cryptography APIs for JDK 1.5 to JDK 1.8.</p>
<p>Library home page: <a href="http://www.bouncycastle.org/java.html">http://www.bouncycastle.org/java.html</a></p>
<p>Path to dependency file: hygieia-codequality-sonar-collector/pom.xml</p>
<p>Path to vulnerable library: /home/wss-scanner/.m2/repository/org/bouncycastle/bcprov-jdk15on/1.55/bcprov-jdk15on-1.55.jar</p>
<p>
Dependency Hierarchy:
- core-3.1.11.jar (Root Library)
- spring-cloud-starter-config-1.3.1.RELEASE.jar
- spring-cloud-starter-1.2.2.RELEASE.jar
- spring-security-rsa-1.0.3.RELEASE.jar
- bcpkix-jdk15on-1.55.jar
- :x: **bcprov-jdk15on-1.55.jar** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/Dima2022/hygieia-codequality-sonar-collector/commit/d72cd85b78442d6e5c56f0a28d43e8922826f909">d72cd85b78442d6e5c56f0a28d43e8922826f909</a></p>
<p>Found in base branch: <b>master</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
Bouncy Castle BC 1.54 - 1.59, BC-FJA 1.0.0, BC-FJA 1.0.1 and earlier have a flaw in the Low-level interface to RSA key pair generator, specifically RSA Key Pairs generated in low-level API with added certainty may have less M-R tests than expected. This appears to be fixed in versions BC 1.60 beta 4 and later, BC-FJA 1.0.2 and later.
<p>Publish Date: 2018-06-05
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2018-1000180>CVE-2018-1000180</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>7.5</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: High
- Integrity Impact: None
- Availability Impact: None
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2018-1000180">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2018-1000180</a></p>
<p>Release Date: 2018-06-05</p>
<p>Fix Resolution: org.bouncycastle:bcprov-jdk15on:1.60,org.bouncycastle:bcprov-jdk14:1.60,org.bouncycastle:bcprov-ext-jdk14:1.60,org.bouncycastle:bcprov-ext-jdk15on:1.60</p>
</p>
</details>
<p></p>
<!-- <REMEDIATE>{"isOpenPROnVulnerability":true,"isPackageBased":true,"isDefaultBranch":true,"packages":[{"packageType":"Java","groupId":"org.bouncycastle","packageName":"bcprov-jdk15on","packageVersion":"1.55","packageFilePaths":["/pom.xml"],"isTransitiveDependency":true,"dependencyTree":"com.capitalone.dashboard:core:3.1.11;org.springframework.cloud:spring-cloud-starter-config:1.3.1.RELEASE;org.springframework.cloud:spring-cloud-starter:1.2.2.RELEASE;org.springframework.security:spring-security-rsa:1.0.3.RELEASE;org.bouncycastle:bcpkix-jdk15on:1.55;org.bouncycastle:bcprov-jdk15on:1.55","isMinimumFixVersionAvailable":true,"minimumFixVersion":"org.bouncycastle:bcprov-jdk15on:1.60,org.bouncycastle:bcprov-jdk14:1.60,org.bouncycastle:bcprov-ext-jdk14:1.60,org.bouncycastle:bcprov-ext-jdk15on:1.60"}],"baseBranches":["master"],"vulnerabilityIdentifier":"CVE-2018-1000180","vulnerabilityDetails":"Bouncy Castle BC 1.54 - 1.59, BC-FJA 1.0.0, BC-FJA 1.0.1 and earlier have a flaw in the Low-level interface to RSA key pair generator, specifically RSA Key Pairs generated in low-level API with added certainty may have less M-R tests than expected. This appears to be fixed in versions BC 1.60 beta 4 and later, BC-FJA 1.0.2 and later.","vulnerabilityUrl":"https://vuln.whitesourcesoftware.com/vulnerability/CVE-2018-1000180","cvss3Severity":"high","cvss3Score":"7.5","cvss3Metrics":{"A":"None","AC":"Low","PR":"None","S":"Unchanged","C":"High","UI":"None","AV":"Network","I":"None"},"extraData":{}}</REMEDIATE> -->
|
True
|
CVE-2018-1000180 (High) detected in bcprov-jdk15on-1.55.jar - ## CVE-2018-1000180 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>bcprov-jdk15on-1.55.jar</b></p></summary>
<p>The Bouncy Castle Crypto package is a Java implementation of cryptographic algorithms. This jar contains JCE provider and lightweight API for the Bouncy Castle Cryptography APIs for JDK 1.5 to JDK 1.8.</p>
<p>Library home page: <a href="http://www.bouncycastle.org/java.html">http://www.bouncycastle.org/java.html</a></p>
<p>Path to dependency file: hygieia-codequality-sonar-collector/pom.xml</p>
<p>Path to vulnerable library: /home/wss-scanner/.m2/repository/org/bouncycastle/bcprov-jdk15on/1.55/bcprov-jdk15on-1.55.jar</p>
<p>
Dependency Hierarchy:
- core-3.1.11.jar (Root Library)
- spring-cloud-starter-config-1.3.1.RELEASE.jar
- spring-cloud-starter-1.2.2.RELEASE.jar
- spring-security-rsa-1.0.3.RELEASE.jar
- bcpkix-jdk15on-1.55.jar
- :x: **bcprov-jdk15on-1.55.jar** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/Dima2022/hygieia-codequality-sonar-collector/commit/d72cd85b78442d6e5c56f0a28d43e8922826f909">d72cd85b78442d6e5c56f0a28d43e8922826f909</a></p>
<p>Found in base branch: <b>master</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
Bouncy Castle BC 1.54 - 1.59, BC-FJA 1.0.0, BC-FJA 1.0.1 and earlier have a flaw in the Low-level interface to RSA key pair generator, specifically RSA Key Pairs generated in low-level API with added certainty may have less M-R tests than expected. This appears to be fixed in versions BC 1.60 beta 4 and later, BC-FJA 1.0.2 and later.
<p>Publish Date: 2018-06-05
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2018-1000180>CVE-2018-1000180</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>7.5</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: High
- Integrity Impact: None
- Availability Impact: None
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2018-1000180">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2018-1000180</a></p>
<p>Release Date: 2018-06-05</p>
<p>Fix Resolution: org.bouncycastle:bcprov-jdk15on:1.60,org.bouncycastle:bcprov-jdk14:1.60,org.bouncycastle:bcprov-ext-jdk14:1.60,org.bouncycastle:bcprov-ext-jdk15on:1.60</p>
</p>
</details>
<p></p>
<!-- <REMEDIATE>{"isOpenPROnVulnerability":true,"isPackageBased":true,"isDefaultBranch":true,"packages":[{"packageType":"Java","groupId":"org.bouncycastle","packageName":"bcprov-jdk15on","packageVersion":"1.55","packageFilePaths":["/pom.xml"],"isTransitiveDependency":true,"dependencyTree":"com.capitalone.dashboard:core:3.1.11;org.springframework.cloud:spring-cloud-starter-config:1.3.1.RELEASE;org.springframework.cloud:spring-cloud-starter:1.2.2.RELEASE;org.springframework.security:spring-security-rsa:1.0.3.RELEASE;org.bouncycastle:bcpkix-jdk15on:1.55;org.bouncycastle:bcprov-jdk15on:1.55","isMinimumFixVersionAvailable":true,"minimumFixVersion":"org.bouncycastle:bcprov-jdk15on:1.60,org.bouncycastle:bcprov-jdk14:1.60,org.bouncycastle:bcprov-ext-jdk14:1.60,org.bouncycastle:bcprov-ext-jdk15on:1.60"}],"baseBranches":["master"],"vulnerabilityIdentifier":"CVE-2018-1000180","vulnerabilityDetails":"Bouncy Castle BC 1.54 - 1.59, BC-FJA 1.0.0, BC-FJA 1.0.1 and earlier have a flaw in the Low-level interface to RSA key pair generator, specifically RSA Key Pairs generated in low-level API with added certainty may have less M-R tests than expected. This appears to be fixed in versions BC 1.60 beta 4 and later, BC-FJA 1.0.2 and later.","vulnerabilityUrl":"https://vuln.whitesourcesoftware.com/vulnerability/CVE-2018-1000180","cvss3Severity":"high","cvss3Score":"7.5","cvss3Metrics":{"A":"None","AC":"Low","PR":"None","S":"Unchanged","C":"High","UI":"None","AV":"Network","I":"None"},"extraData":{}}</REMEDIATE> -->
|
non_process
|
cve high detected in bcprov jar cve high severity vulnerability vulnerable library bcprov jar the bouncy castle crypto package is a java implementation of cryptographic algorithms this jar contains jce provider and lightweight api for the bouncy castle cryptography apis for jdk to jdk library home page a href path to dependency file hygieia codequality sonar collector pom xml path to vulnerable library home wss scanner repository org bouncycastle bcprov bcprov jar dependency hierarchy core jar root library spring cloud starter config release jar spring cloud starter release jar spring security rsa release jar bcpkix jar x bcprov jar vulnerable library found in head commit a href found in base branch master vulnerability details bouncy castle bc bc fja bc fja and earlier have a flaw in the low level interface to rsa key pair generator specifically rsa key pairs generated in low level api with added certainty may have less m r tests than expected this appears to be fixed in versions bc beta and later bc fja and later publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction none scope unchanged impact metrics confidentiality impact high integrity impact none availability impact none for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution org bouncycastle bcprov org bouncycastle bcprov org bouncycastle bcprov ext org bouncycastle bcprov ext isopenpronvulnerability true ispackagebased true isdefaultbranch true packages istransitivedependency true dependencytree com capitalone dashboard core org springframework cloud spring cloud starter config release org springframework cloud spring cloud starter release org springframework security spring security rsa release org bouncycastle bcpkix org bouncycastle bcprov isminimumfixversionavailable true minimumfixversion org bouncycastle bcprov org bouncycastle bcprov org bouncycastle bcprov ext org bouncycastle bcprov ext basebranches vulnerabilityidentifier cve vulnerabilitydetails bouncy castle bc bc fja bc fja and earlier have a flaw in the low level interface to rsa key pair generator specifically rsa key pairs generated in low level api with added certainty may have less m r tests than expected this appears to be fixed in versions bc beta and later bc fja and later vulnerabilityurl
| 0
|
7,614
| 10,724,138,458
|
IssuesEvent
|
2019-10-27 23:41:54
|
input-output-hk/fm-ouroboros
|
https://api.github.com/repos/input-output-hk/fm-ouroboros
|
opened
|
Add the prefix `absorb_` to the names of the `absorb` axioms
|
language: isabelle topic: process calculus type: improvement
|
Our goal is to add the prefix `absorb_` to the names of the `absorb` axioms to make it clear that these axioms are about `absorb` and to make the naming consistent with the naming of the `lift` axioms and the core bisimilarities.
|
1.0
|
Add the prefix `absorb_` to the names of the `absorb` axioms - Our goal is to add the prefix `absorb_` to the names of the `absorb` axioms to make it clear that these axioms are about `absorb` and to make the naming consistent with the naming of the `lift` axioms and the core bisimilarities.
|
process
|
add the prefix absorb to the names of the absorb axioms our goal is to add the prefix absorb to the names of the absorb axioms to make it clear that these axioms are about absorb and to make the naming consistent with the naming of the lift axioms and the core bisimilarities
| 1
|
109
| 2,545,825,488
|
IssuesEvent
|
2015-01-29 19:38:45
|
dalehenrich/metacello-work
|
https://api.github.com/repos/dalehenrich/metacello-work
|
opened
|
Copying a filetree package with embedded multi-byte characters to an .mcz is trouble
|
in process
|
see the [mail from Pieter](https://groups.google.com/d/msg/metacello/f8b72lB831o/7KZ2smIV_8IJ) for details
|
1.0
|
Copying a filetree package with embedded multi-byte characters to an .mcz is trouble - see the [mail from Pieter](https://groups.google.com/d/msg/metacello/f8b72lB831o/7KZ2smIV_8IJ) for details
|
process
|
copying a filetree package with embedded multi byte characters to an mcz is trouble see the for details
| 1
|
8,691
| 11,835,065,436
|
IssuesEvent
|
2020-03-23 10:00:59
|
cypress-io/cypress
|
https://api.github.com/repos/cypress-io/cypress
|
closed
|
Legacy versions of mkdirp are no longer supported. Please update to mkdirp 1.x.
|
process: dependencies
|
I try to install cypress next command from documentation:
`yarn add cypress --dev`
I see error:
```
cypress > extract-zip > mkdirp@0.5.1: Legacy versions of mkdirp are no longer supported. Please update to mkdirp 1.x. (Note that the API surface has changed to use Promises in 1.x.)
```
But I don't have mkdirp in my dependency
Node: v10.19.0
Npm: v6.13.4
Yarn: 1.21.1
|
1.0
|
Legacy versions of mkdirp are no longer supported. Please update to mkdirp 1.x. - I try to install cypress next command from documentation:
`yarn add cypress --dev`
I see error:
```
cypress > extract-zip > mkdirp@0.5.1: Legacy versions of mkdirp are no longer supported. Please update to mkdirp 1.x. (Note that the API surface has changed to use Promises in 1.x.)
```
But I don't have mkdirp in my dependency
Node: v10.19.0
Npm: v6.13.4
Yarn: 1.21.1
|
process
|
legacy versions of mkdirp are no longer supported please update to mkdirp x i try to install cypress next command from documentation yarn add cypress dev i see error cypress extract zip mkdirp legacy versions of mkdirp are no longer supported please update to mkdirp x note that the api surface has changed to use promises in x but i don t have mkdirp in my dependency node npm yarn
| 1
|
160,185
| 25,120,138,685
|
IssuesEvent
|
2022-11-09 07:19:53
|
npocccties/chiloportal
|
https://api.github.com/repos/npocccties/chiloportal
|
closed
|
バッジ画像のAdobe Illustrator ファイルを共有する
|
design MUST
|
@ties-makimura さんに画像作成用のaiファイルをお渡しする
その際、知識バッジのフォーマットも用意する。
|
1.0
|
バッジ画像のAdobe Illustrator ファイルを共有する - @ties-makimura さんに画像作成用のaiファイルをお渡しする
その際、知識バッジのフォーマットも用意する。
|
non_process
|
バッジ画像のadobe illustrator ファイルを共有する ties makimura さんに画像作成用のaiファイルをお渡しする その際、知識バッジのフォーマットも用意する。
| 0
|
299,608
| 9,205,666,153
|
IssuesEvent
|
2019-03-08 11:15:49
|
qissue-bot/QGIS
|
https://api.github.com/repos/qissue-bot/QGIS
|
closed
|
Attribute Table - combine Start/Stop Editing buttons into one button, like Toggle editing in main window
|
Category: GUI Component: Easy fix? Component: Pull Request or Patch supplied Component: Resolution Priority: Low Project: QGIS Application Status: Closed Tracker: Feature request
|
---
Author Name: **Steven Mizuno** (Steven Mizuno)
Original Redmine Issue: 982, https://issues.qgis.org/issues/982
Original Assignee: nobody -
---
Attribute Table - combine Start/Stop Editing buttons into one button, like Toggle Editing for features in main window. Also use an icon similar to the pencil, but maybe should be different from
the feature editing to minimize confusion.
This would make for a more consistent user experience.
|
1.0
|
Attribute Table - combine Start/Stop Editing buttons into one button, like Toggle editing in main window - ---
Author Name: **Steven Mizuno** (Steven Mizuno)
Original Redmine Issue: 982, https://issues.qgis.org/issues/982
Original Assignee: nobody -
---
Attribute Table - combine Start/Stop Editing buttons into one button, like Toggle Editing for features in main window. Also use an icon similar to the pencil, but maybe should be different from
the feature editing to minimize confusion.
This would make for a more consistent user experience.
|
non_process
|
attribute table combine start stop editing buttons into one button like toggle editing in main window author name steven mizuno steven mizuno original redmine issue original assignee nobody attribute table combine start stop editing buttons into one button like toggle editing for features in main window also use an icon similar to the pencil but maybe should be different from the feature editing to minimize confusion this would make for a more consistent user experience
| 0
|
5,071
| 7,869,685,490
|
IssuesEvent
|
2018-06-24 16:47:09
|
nodejs/node
|
https://api.github.com/repos/nodejs/node
|
closed
|
Validating environment variables
|
feature request meta process semver-minor
|
Right now we have no validation of our environment variables even though a couple of these only accept a limited set of values.
I propose that we add a validation hook and when a environment variable is set a JS function is triggered to verify the value. I would also like to expose the hook to users so they are able to validate the environment variables used in e.g. different modes (running the code in production might accept less values than when run in other modes). Trying to add a hook twice should result in an error.
If this is something that others like as an idea, I would go ahead and open a PR to implement this.
I suggest a similar API to:
```js
process.addEnvVariableValidation('name', (entry) => {
if (entry === 'foobar') {
// do stuff
} else if (entry === 'baz') {
// do something else
} else {
throw new Error('Not accepted environment variable entry')
}
})
```
|
1.0
|
Validating environment variables - Right now we have no validation of our environment variables even though a couple of these only accept a limited set of values.
I propose that we add a validation hook and when a environment variable is set a JS function is triggered to verify the value. I would also like to expose the hook to users so they are able to validate the environment variables used in e.g. different modes (running the code in production might accept less values than when run in other modes). Trying to add a hook twice should result in an error.
If this is something that others like as an idea, I would go ahead and open a PR to implement this.
I suggest a similar API to:
```js
process.addEnvVariableValidation('name', (entry) => {
if (entry === 'foobar') {
// do stuff
} else if (entry === 'baz') {
// do something else
} else {
throw new Error('Not accepted environment variable entry')
}
})
```
|
process
|
validating environment variables right now we have no validation of our environment variables even though a couple of these only accept a limited set of values i propose that we add a validation hook and when a environment variable is set a js function is triggered to verify the value i would also like to expose the hook to users so they are able to validate the environment variables used in e g different modes running the code in production might accept less values than when run in other modes trying to add a hook twice should result in an error if this is something that others like as an idea i would go ahead and open a pr to implement this i suggest a similar api to js process addenvvariablevalidation name entry if entry foobar do stuff else if entry baz do something else else throw new error not accepted environment variable entry
| 1
|
73,591
| 7,345,402,146
|
IssuesEvent
|
2018-03-07 17:18:13
|
cockroachdb/cockroach
|
https://api.github.com/repos/cockroachdb/cockroach
|
closed
|
teamcity: failed tests on master: Jepsen/Jepsen
|
Robot test-failure
|
The following tests appear to have failed:
[#550400](https://teamcity.cockroachdb.com/viewLog.html?buildId=550400):
```
--- FAIL: Jepsen/Jepsen: JepsenBank: JepsenBank/majority-ring (32.815s)
--- FAIL: Jepsen/Jepsen: JepsenRegister: JepsenRegister/parts+start-kill-2 (14.937s)
```
Please assign, take a look and update the issue accordingly.
|
1.0
|
teamcity: failed tests on master: Jepsen/Jepsen - The following tests appear to have failed:
[#550400](https://teamcity.cockroachdb.com/viewLog.html?buildId=550400):
```
--- FAIL: Jepsen/Jepsen: JepsenBank: JepsenBank/majority-ring (32.815s)
--- FAIL: Jepsen/Jepsen: JepsenRegister: JepsenRegister/parts+start-kill-2 (14.937s)
```
Please assign, take a look and update the issue accordingly.
|
non_process
|
teamcity failed tests on master jepsen jepsen the following tests appear to have failed fail jepsen jepsen jepsenbank jepsenbank majority ring fail jepsen jepsen jepsenregister jepsenregister parts start kill please assign take a look and update the issue accordingly
| 0
|
319,229
| 27,356,857,243
|
IssuesEvent
|
2023-02-27 13:26:56
|
epam/badgerdoc
|
https://api.github.com/repos/epam/badgerdoc
|
closed
|
Validation can be finished only after saving draft
|
bug priority testing
|
**Setup**
An extraction job is creates, all annotation tasks within it are finished.
**Steps**
- Repeat all the steps from the “[Dialog window doesn’t close automatically after validation is finished](https://github.com/epam/badgerdoc/issues/225)” issue.
- Open some other page, then open the same validation task again. “Valid” is not selected.
- Select “Valid” again, click “Save draft” button. A “Saved successfully” message appears in the lower left corner.
- Go back to extraction job using breadcrumbs, status is “In progress”.
- Open the validation task again, now “Valid” is selected.
- Click the “Finish validation” button. A confirmation dialog window appears

Note: correct spelling is “Assign”, not “Asign”.
- Press “Confirm validation” button. User is redirected to dashboard.
- Open the extraction job again, now validation task is in the “Finished” state.
|
1.0
|
Validation can be finished only after saving draft - **Setup**
An extraction job is creates, all annotation tasks within it are finished.
**Steps**
- Repeat all the steps from the “[Dialog window doesn’t close automatically after validation is finished](https://github.com/epam/badgerdoc/issues/225)” issue.
- Open some other page, then open the same validation task again. “Valid” is not selected.
- Select “Valid” again, click “Save draft” button. A “Saved successfully” message appears in the lower left corner.
- Go back to extraction job using breadcrumbs, status is “In progress”.
- Open the validation task again, now “Valid” is selected.
- Click the “Finish validation” button. A confirmation dialog window appears

Note: correct spelling is “Assign”, not “Asign”.
- Press “Confirm validation” button. User is redirected to dashboard.
- Open the extraction job again, now validation task is in the “Finished” state.
|
non_process
|
validation can be finished only after saving draft setup an extraction job is creates all annotation tasks within it are finished steps repeat all the steps from the “ issue open some other page then open the same validation task again “valid” is not selected select “valid” again click “save draft” button a “saved successfully” message appears in the lower left corner go back to extraction job using breadcrumbs status is “in progress” open the validation task again now “valid” is selected click the “finish validation” button a confirmation dialog window appears note correct spelling is “assign” not “asign” press “confirm validation” button user is redirected to dashboard open the extraction job again now validation task is in the “finished” state
| 0
|
21,027
| 27,969,927,487
|
IssuesEvent
|
2023-03-25 00:19:05
|
darktable-org/darktable
|
https://api.github.com/repos/darktable-org/darktable
|
closed
|
Liquify should be allowed to change the image's resolution
|
feature: enhancement scope: image processing bug: pending no-issue-activity
|
Hey folks :wave: here's something I observed learning about the liquify tool and trying to apply it to my workflow.
Sometimes I have the use case that in order to get the crop I want, I need to liquify to extend my image e.g. to the right for let's say another 5% or so. At the moment when I use liquify at the border, the image's resolution stays the same and liquify only gets applied to the image's pixels.
In other tools, liquify allows the user to extend the image's dimensions at the border by automatically increasing the image's dimension and then inpainting liquify's output there. How can I achieve this with darktable?
It would be a great addition if liquify could extend the image's dimensions to help users get the crop they want, when they are a bit short at the borders.
Thoughts?
Note: another liquify border issue https://github.com/darktable-org/darktable/issues/8560 that is not directly related.
|
1.0
|
Liquify should be allowed to change the image's resolution - Hey folks :wave: here's something I observed learning about the liquify tool and trying to apply it to my workflow.
Sometimes I have the use case that in order to get the crop I want, I need to liquify to extend my image e.g. to the right for let's say another 5% or so. At the moment when I use liquify at the border, the image's resolution stays the same and liquify only gets applied to the image's pixels.
In other tools, liquify allows the user to extend the image's dimensions at the border by automatically increasing the image's dimension and then inpainting liquify's output there. How can I achieve this with darktable?
It would be a great addition if liquify could extend the image's dimensions to help users get the crop they want, when they are a bit short at the borders.
Thoughts?
Note: another liquify border issue https://github.com/darktable-org/darktable/issues/8560 that is not directly related.
|
process
|
liquify should be allowed to change the image s resolution hey folks wave here s something i observed learning about the liquify tool and trying to apply it to my workflow sometimes i have the use case that in order to get the crop i want i need to liquify to extend my image e g to the right for let s say another or so at the moment when i use liquify at the border the image s resolution stays the same and liquify only gets applied to the image s pixels in other tools liquify allows the user to extend the image s dimensions at the border by automatically increasing the image s dimension and then inpainting liquify s output there how can i achieve this with darktable it would be a great addition if liquify could extend the image s dimensions to help users get the crop they want when they are a bit short at the borders thoughts note another liquify border issue that is not directly related
| 1
|
30,566
| 11,839,655,566
|
IssuesEvent
|
2020-03-23 17:30:12
|
BrianMcDonaldWS/deck.gl
|
https://api.github.com/repos/BrianMcDonaldWS/deck.gl
|
opened
|
CVE-2019-11358 (Medium) detected in jquery-2.1.4.min.js, jquery-1.9.1.js
|
security vulnerability
|
## CVE-2019-11358 - Medium Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Libraries - <b>jquery-2.1.4.min.js</b>, <b>jquery-1.9.1.js</b></p></summary>
<p>
<details><summary><b>jquery-2.1.4.min.js</b></p></summary>
<p>JavaScript library for DOM operations</p>
<p>Library home page: <a href="https://cdnjs.cloudflare.com/ajax/libs/jquery/2.1.4/jquery.min.js">https://cdnjs.cloudflare.com/ajax/libs/jquery/2.1.4/jquery.min.js</a></p>
<p>Path to dependency file: /tmp/ws-scm/deck.gl/website/node_modules/js-base64/.attic/test-moment/index.html</p>
<p>Path to vulnerable library: /deck.gl/website/node_modules/js-base64/.attic/test-moment/index.html,/deck.gl/website-gatsby/node_modules/validate.js/index.html,/deck.gl/website-gatsby/node_modules/js-base64/.attic/test-moment/index.html</p>
<p>
Dependency Hierarchy:
- :x: **jquery-2.1.4.min.js** (Vulnerable Library)
</details>
<details><summary><b>jquery-1.9.1.js</b></p></summary>
<p>JavaScript library for DOM operations</p>
<p>Library home page: <a href="https://cdnjs.cloudflare.com/ajax/libs/jquery/1.9.1/jquery.js">https://cdnjs.cloudflare.com/ajax/libs/jquery/1.9.1/jquery.js</a></p>
<p>Path to dependency file: /tmp/ws-scm/deck.gl/showcases/wind/node_modules/tinycolor2/index.html</p>
<p>Path to vulnerable library: /deck.gl/showcases/wind/node_modules/tinycolor2/demo/jquery-1.9.1.js,/deck.gl/showcases/wind/node_modules/tinycolor2/test/../demo/jquery-1.9.1.js,/deck.gl/website-gatsby/node_modules/tinycolor2/demo/jquery-1.9.1.js,/deck.gl/website-gatsby/node_modules/tinycolor2/test/../demo/jquery-1.9.1.js</p>
<p>
Dependency Hierarchy:
- :x: **jquery-1.9.1.js** (Vulnerable Library)
</details>
<p>Found in HEAD commit: <a href="https://github.com/BrianMcDonaldWS/deck.gl/commit/67e433f207a0fc9c0fb2b8f7a2906f254c8c4b87">67e433f207a0fc9c0fb2b8f7a2906f254c8c4b87</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
jQuery before 3.4.0, as used in Drupal, Backdrop CMS, and other products, mishandles jQuery.extend(true, {}, ...) because of Object.prototype pollution. If an unsanitized source object contained an enumerable __proto__ property, it could extend the native Object.prototype.
<p>Publish Date: 2019-04-20
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2019-11358>CVE-2019-11358</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>6.1</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: Required
- Scope: Changed
- Impact Metrics:
- Confidentiality Impact: Low
- Integrity Impact: Low
- Availability Impact: None
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2019-11358">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2019-11358</a></p>
<p>Release Date: 2019-04-20</p>
<p>Fix Resolution: 3.4.0</p>
</p>
</details>
<p></p>
<!-- <REMEDIATE>{"isOpenPROnVulnerability":false,"isPackageBased":true,"isDefaultBranch":true,"packages":[{"packageType":"JavaScript","packageName":"jquery","packageVersion":"2.1.4","isTransitiveDependency":false,"dependencyTree":"jquery:2.1.4","isMinimumFixVersionAvailable":true,"minimumFixVersion":"3.4.0"},{"packageType":"JavaScript","packageName":"jquery","packageVersion":"1.9.1","isTransitiveDependency":false,"dependencyTree":"jquery:1.9.1","isMinimumFixVersionAvailable":true,"minimumFixVersion":"3.4.0"}],"vulnerabilityIdentifier":"CVE-2019-11358","vulnerabilityDetails":"jQuery before 3.4.0, as used in Drupal, Backdrop CMS, and other products, mishandles jQuery.extend(true, {}, ...) because of Object.prototype pollution. If an unsanitized source object contained an enumerable __proto__ property, it could extend the native Object.prototype.","vulnerabilityUrl":"https://vuln.whitesourcesoftware.com/vulnerability/CVE-2019-11358","cvss3Severity":"medium","cvss3Score":"6.1","cvss3Metrics":{"A":"None","AC":"Low","PR":"None","S":"Changed","C":"Low","UI":"Required","AV":"Network","I":"Low"},"extraData":{}}</REMEDIATE> -->
|
True
|
CVE-2019-11358 (Medium) detected in jquery-2.1.4.min.js, jquery-1.9.1.js - ## CVE-2019-11358 - Medium Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Libraries - <b>jquery-2.1.4.min.js</b>, <b>jquery-1.9.1.js</b></p></summary>
<p>
<details><summary><b>jquery-2.1.4.min.js</b></p></summary>
<p>JavaScript library for DOM operations</p>
<p>Library home page: <a href="https://cdnjs.cloudflare.com/ajax/libs/jquery/2.1.4/jquery.min.js">https://cdnjs.cloudflare.com/ajax/libs/jquery/2.1.4/jquery.min.js</a></p>
<p>Path to dependency file: /tmp/ws-scm/deck.gl/website/node_modules/js-base64/.attic/test-moment/index.html</p>
<p>Path to vulnerable library: /deck.gl/website/node_modules/js-base64/.attic/test-moment/index.html,/deck.gl/website-gatsby/node_modules/validate.js/index.html,/deck.gl/website-gatsby/node_modules/js-base64/.attic/test-moment/index.html</p>
<p>
Dependency Hierarchy:
- :x: **jquery-2.1.4.min.js** (Vulnerable Library)
</details>
<details><summary><b>jquery-1.9.1.js</b></p></summary>
<p>JavaScript library for DOM operations</p>
<p>Library home page: <a href="https://cdnjs.cloudflare.com/ajax/libs/jquery/1.9.1/jquery.js">https://cdnjs.cloudflare.com/ajax/libs/jquery/1.9.1/jquery.js</a></p>
<p>Path to dependency file: /tmp/ws-scm/deck.gl/showcases/wind/node_modules/tinycolor2/index.html</p>
<p>Path to vulnerable library: /deck.gl/showcases/wind/node_modules/tinycolor2/demo/jquery-1.9.1.js,/deck.gl/showcases/wind/node_modules/tinycolor2/test/../demo/jquery-1.9.1.js,/deck.gl/website-gatsby/node_modules/tinycolor2/demo/jquery-1.9.1.js,/deck.gl/website-gatsby/node_modules/tinycolor2/test/../demo/jquery-1.9.1.js</p>
<p>
Dependency Hierarchy:
- :x: **jquery-1.9.1.js** (Vulnerable Library)
</details>
<p>Found in HEAD commit: <a href="https://github.com/BrianMcDonaldWS/deck.gl/commit/67e433f207a0fc9c0fb2b8f7a2906f254c8c4b87">67e433f207a0fc9c0fb2b8f7a2906f254c8c4b87</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
jQuery before 3.4.0, as used in Drupal, Backdrop CMS, and other products, mishandles jQuery.extend(true, {}, ...) because of Object.prototype pollution. If an unsanitized source object contained an enumerable __proto__ property, it could extend the native Object.prototype.
<p>Publish Date: 2019-04-20
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2019-11358>CVE-2019-11358</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>6.1</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: Required
- Scope: Changed
- Impact Metrics:
- Confidentiality Impact: Low
- Integrity Impact: Low
- Availability Impact: None
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2019-11358">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2019-11358</a></p>
<p>Release Date: 2019-04-20</p>
<p>Fix Resolution: 3.4.0</p>
</p>
</details>
<p></p>
<!-- <REMEDIATE>{"isOpenPROnVulnerability":false,"isPackageBased":true,"isDefaultBranch":true,"packages":[{"packageType":"JavaScript","packageName":"jquery","packageVersion":"2.1.4","isTransitiveDependency":false,"dependencyTree":"jquery:2.1.4","isMinimumFixVersionAvailable":true,"minimumFixVersion":"3.4.0"},{"packageType":"JavaScript","packageName":"jquery","packageVersion":"1.9.1","isTransitiveDependency":false,"dependencyTree":"jquery:1.9.1","isMinimumFixVersionAvailable":true,"minimumFixVersion":"3.4.0"}],"vulnerabilityIdentifier":"CVE-2019-11358","vulnerabilityDetails":"jQuery before 3.4.0, as used in Drupal, Backdrop CMS, and other products, mishandles jQuery.extend(true, {}, ...) because of Object.prototype pollution. If an unsanitized source object contained an enumerable __proto__ property, it could extend the native Object.prototype.","vulnerabilityUrl":"https://vuln.whitesourcesoftware.com/vulnerability/CVE-2019-11358","cvss3Severity":"medium","cvss3Score":"6.1","cvss3Metrics":{"A":"None","AC":"Low","PR":"None","S":"Changed","C":"Low","UI":"Required","AV":"Network","I":"Low"},"extraData":{}}</REMEDIATE> -->
|
non_process
|
cve medium detected in jquery min js jquery js cve medium severity vulnerability vulnerable libraries jquery min js jquery js jquery min js javascript library for dom operations library home page a href path to dependency file tmp ws scm deck gl website node modules js attic test moment index html path to vulnerable library deck gl website node modules js attic test moment index html deck gl website gatsby node modules validate js index html deck gl website gatsby node modules js attic test moment index html dependency hierarchy x jquery min js vulnerable library jquery js javascript library for dom operations library home page a href path to dependency file tmp ws scm deck gl showcases wind node modules index html path to vulnerable library deck gl showcases wind node modules demo jquery js deck gl showcases wind node modules test demo jquery js deck gl website gatsby node modules demo jquery js deck gl website gatsby node modules test demo jquery js dependency hierarchy x jquery js vulnerable library found in head commit a href vulnerability details jquery before as used in drupal backdrop cms and other products mishandles jquery extend true because of object prototype pollution if an unsanitized source object contained an enumerable proto property it could extend the native object prototype publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction required scope changed impact metrics confidentiality impact low integrity impact low availability impact none for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution isopenpronvulnerability false ispackagebased true isdefaultbranch true packages vulnerabilityidentifier cve vulnerabilitydetails jquery before as used in drupal backdrop cms and other products mishandles jquery extend true because of object prototype pollution if an unsanitized source object contained an enumerable proto property it could extend the native object prototype vulnerabilityurl
| 0
|
138,926
| 18,796,865,359
|
IssuesEvent
|
2021-11-08 23:46:40
|
Dima2022/DiscountsApp
|
https://api.github.com/repos/Dima2022/DiscountsApp
|
opened
|
CVE-2018-8416 (Medium) detected in microsoft.netcore.app.2.1.0.nupkg
|
security vulnerability
|
## CVE-2018-8416 - Medium Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>microsoft.netcore.app.2.1.0.nupkg</b></p></summary>
<p>A set of .NET API's that are included in the default .NET Core application model.
caa7b7e2bad98e56a...</p>
<p>Library home page: <a href="https://api.nuget.org/packages/microsoft.netcore.app.2.1.0.nupkg">https://api.nuget.org/packages/microsoft.netcore.app.2.1.0.nupkg</a></p>
<p>Path to dependency file: DiscountsApp/SCNDISC.Server/SCNDISC.Server.Core/SCNDISC.Server.Core.csproj</p>
<p>Path to vulnerable library: /packages/microsoft.netcore.app/2.1.0/microsoft.netcore.app.2.1.0.nupkg</p>
<p>
Dependency Hierarchy:
- :x: **microsoft.netcore.app.2.1.0.nupkg** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/Dima2022/DiscountsApp/commit/dc3648254f7b327f09662a4563899eb0e9a6de96">dc3648254f7b327f09662a4563899eb0e9a6de96</a></p>
<p>Found in base branch: <b>master</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
A tampering vulnerability exists when .NET Core improperly handles specially crafted files, aka ".NET Core Tampering Vulnerability." This affects .NET Core 2.1.
<p>Publish Date: 2018-11-14
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2018-8416>CVE-2018-8416</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>6.5</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: Low
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: High
- Availability Impact: None
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://github.com/dotnet/Announcements/issues/95">https://github.com/dotnet/Announcements/issues/95</a></p>
<p>Release Date: 2018-11-14</p>
<p>Fix Resolution: 2.1.7</p>
</p>
</details>
<p></p>
***
:rescue_worker_helmet: Automatic Remediation is available for this issue
<!-- <REMEDIATE>{"isOpenPROnVulnerability":true,"isPackageBased":true,"isDefaultBranch":true,"packages":[{"packageType":"Nuget","packageName":"Microsoft.NETCore.App","packageVersion":"2.1.0","packageFilePaths":["/SCNDISC.Server/SCNDISC.Server.Core/SCNDISC.Server.Core.csproj"],"isTransitiveDependency":false,"dependencyTree":"Microsoft.NETCore.App:2.1.0","isMinimumFixVersionAvailable":true,"minimumFixVersion":"2.1.7"}],"baseBranches":["master"],"vulnerabilityIdentifier":"CVE-2018-8416","vulnerabilityDetails":"A tampering vulnerability exists when .NET Core improperly handles specially crafted files, aka \".NET Core Tampering Vulnerability.\" This affects .NET Core 2.1.","vulnerabilityUrl":"https://vuln.whitesourcesoftware.com/vulnerability/CVE-2018-8416","cvss3Severity":"medium","cvss3Score":"6.5","cvss3Metrics":{"A":"None","AC":"Low","PR":"Low","S":"Unchanged","C":"None","UI":"None","AV":"Network","I":"High"},"extraData":{}}</REMEDIATE> -->
|
True
|
CVE-2018-8416 (Medium) detected in microsoft.netcore.app.2.1.0.nupkg - ## CVE-2018-8416 - Medium Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>microsoft.netcore.app.2.1.0.nupkg</b></p></summary>
<p>A set of .NET API's that are included in the default .NET Core application model.
caa7b7e2bad98e56a...</p>
<p>Library home page: <a href="https://api.nuget.org/packages/microsoft.netcore.app.2.1.0.nupkg">https://api.nuget.org/packages/microsoft.netcore.app.2.1.0.nupkg</a></p>
<p>Path to dependency file: DiscountsApp/SCNDISC.Server/SCNDISC.Server.Core/SCNDISC.Server.Core.csproj</p>
<p>Path to vulnerable library: /packages/microsoft.netcore.app/2.1.0/microsoft.netcore.app.2.1.0.nupkg</p>
<p>
Dependency Hierarchy:
- :x: **microsoft.netcore.app.2.1.0.nupkg** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/Dima2022/DiscountsApp/commit/dc3648254f7b327f09662a4563899eb0e9a6de96">dc3648254f7b327f09662a4563899eb0e9a6de96</a></p>
<p>Found in base branch: <b>master</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
A tampering vulnerability exists when .NET Core improperly handles specially crafted files, aka ".NET Core Tampering Vulnerability." This affects .NET Core 2.1.
<p>Publish Date: 2018-11-14
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2018-8416>CVE-2018-8416</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>6.5</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: Low
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: High
- Availability Impact: None
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://github.com/dotnet/Announcements/issues/95">https://github.com/dotnet/Announcements/issues/95</a></p>
<p>Release Date: 2018-11-14</p>
<p>Fix Resolution: 2.1.7</p>
</p>
</details>
<p></p>
***
:rescue_worker_helmet: Automatic Remediation is available for this issue
<!-- <REMEDIATE>{"isOpenPROnVulnerability":true,"isPackageBased":true,"isDefaultBranch":true,"packages":[{"packageType":"Nuget","packageName":"Microsoft.NETCore.App","packageVersion":"2.1.0","packageFilePaths":["/SCNDISC.Server/SCNDISC.Server.Core/SCNDISC.Server.Core.csproj"],"isTransitiveDependency":false,"dependencyTree":"Microsoft.NETCore.App:2.1.0","isMinimumFixVersionAvailable":true,"minimumFixVersion":"2.1.7"}],"baseBranches":["master"],"vulnerabilityIdentifier":"CVE-2018-8416","vulnerabilityDetails":"A tampering vulnerability exists when .NET Core improperly handles specially crafted files, aka \".NET Core Tampering Vulnerability.\" This affects .NET Core 2.1.","vulnerabilityUrl":"https://vuln.whitesourcesoftware.com/vulnerability/CVE-2018-8416","cvss3Severity":"medium","cvss3Score":"6.5","cvss3Metrics":{"A":"None","AC":"Low","PR":"Low","S":"Unchanged","C":"None","UI":"None","AV":"Network","I":"High"},"extraData":{}}</REMEDIATE> -->
|
non_process
|
cve medium detected in microsoft netcore app nupkg cve medium severity vulnerability vulnerable library microsoft netcore app nupkg a set of net api s that are included in the default net core application model library home page a href path to dependency file discountsapp scndisc server scndisc server core scndisc server core csproj path to vulnerable library packages microsoft netcore app microsoft netcore app nupkg dependency hierarchy x microsoft netcore app nupkg vulnerable library found in head commit a href found in base branch master vulnerability details a tampering vulnerability exists when net core improperly handles specially crafted files aka net core tampering vulnerability this affects net core publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required low user interaction none scope unchanged impact metrics confidentiality impact none integrity impact high availability impact none for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution rescue worker helmet automatic remediation is available for this issue isopenpronvulnerability true ispackagebased true isdefaultbranch true packages istransitivedependency false dependencytree microsoft netcore app isminimumfixversionavailable true minimumfixversion basebranches vulnerabilityidentifier cve vulnerabilitydetails a tampering vulnerability exists when net core improperly handles specially crafted files aka net core tampering vulnerability this affects net core vulnerabilityurl
| 0
|
523,829
| 15,190,682,707
|
IssuesEvent
|
2021-02-15 18:24:06
|
zephyrproject-rtos/zephyr
|
https://api.github.com/repos/zephyrproject-rtos/zephyr
|
closed
|
mps2_an521: no input to shell from Windows qemu host
|
area: UART bug priority: low
|
**Describe the bug**
Trying to run a sample with shell functionality and have it work in an emulated ARM machine on a Windows host.
The sample runs and prints the shell prompt, but no input from keyboard is possible.
**To Reproduce**
Run in Windows command prompt (cmd):
```
west init
west update
set "GNUARMEMB_TOOLCHAIN_PATH=%ProgramFiles(x86)%\GNU Arm Embedded Toolchain\9 2020-q2-update"
set ZEPHYR_TOOLCHAIN_VARIANT=gnuarmemb
cd zephyr
west build -b mps2_an521 -t run samples\subsys\shell\shell_module -- -DQEMU="%ProgramFiles%\qemu\qemu-system-arm.exe"
```
**Expected behavior**
I expect to be able to enter shell commands and have them executed.
**Impact**
Prevents manual use of shell for testing/dev purposes
**Logs and console output**
```
C:\work\hydra\qemu\zephyr>west build -b mps2_an521 -t run samples\subsys\shell\shell_module -- -DQEMU="%ProgramFiles%\qemu\qemu-system-arm.exe"
-- west build: generating a build system
Including boilerplate (Zephyr base (cached)): C:/work/hydra/qemu/zephyr/cmake/app/boilerplate.cmake
-- Application: C:/work/hydra/qemu/zephyr/samples/subsys/shell/shell_module
-- Zephyr version: 2.4.99 (C:/work/hydra/qemu/zephyr)
-- Found west (found suitable version "0.8.0", minimum required is "0.7.1")
-- Board: mps2_an521
-- Cache files will be written to: C:\Users\walter\AppData\Local/.cache/zephyr
-- Found dtc: C:/ProgramData/chocolatey/bin/dtc.exe (found suitable version "1.4.7", minimum required is "1.4.6")
-- Found toolchain: gnuarmemb (C:/Program Files (x86)/GNU Arm Embedded Toolchain/9 2020-q2-update)
-- Found BOARD.dts: C:/work/hydra/qemu/zephyr/boards/arm/mps2_an521/mps2_an521.dts
-- Generated zephyr.dts: C:/work/hydra/qemu/zephyr/build/zephyr/zephyr.dts
-- Generated devicetree_unfixed.h: C:/work/hydra/qemu/zephyr/build/zephyr/include/generated/devicetree_unfixed.h
Parsing C:/work/hydra/qemu/zephyr/samples/subsys/shell/shell_module/Kconfig
Loaded configuration 'C:/work/hydra/qemu/zephyr/build/zephyr/.config'
No change to configuration in 'C:/work/hydra/qemu/zephyr/build/zephyr/.config'
No change to Kconfig header in 'C:/work/hydra/qemu/zephyr/build/zephyr/include/generated/autoconf.h'
-- Configuring done
-- Generating done
-- Build files have been written to: C:/work/hydra/qemu/zephyr/build
-- west build: running target run
[121/126] Linking C executable zephyr\zephyr_prebuilt.elf
Memory region Used Size Region Size %age Used
FLASH: 54520 B 224 MB 0.02%
SRAM: 11536 B 16 MB 0.07%
IDT_LIST: 200 B 2 KB 9.77%
[125/126] To exit from QEMU enter: 'CTRL+a, x'[QEMU] CPU: cortex-m33
C:\Program Files\qemu\qemu-system-arm.exe: warning: nic lan9118.0 has no peer
uart:~$
```
**Environment (please complete the following information):**
Windows 10
QEMU version 5.1.92 (v5.2.0-rc2-11843-gf571c4ffb5-dirty)
**Additional context**
What's the correct way of specifying the Qemu path? Setting the `QEMU_BIN_PATH` environment variable doesn't seem to work.
We are using the board `mps2_an521` because that seems to be the only emulated board with a Cortex-M33 processor. We would really like to emulate an nRF5340.
|
1.0
|
mps2_an521: no input to shell from Windows qemu host - **Describe the bug**
Trying to run a sample with shell functionality and have it work in an emulated ARM machine on a Windows host.
The sample runs and prints the shell prompt, but no input from keyboard is possible.
**To Reproduce**
Run in Windows command prompt (cmd):
```
west init
west update
set "GNUARMEMB_TOOLCHAIN_PATH=%ProgramFiles(x86)%\GNU Arm Embedded Toolchain\9 2020-q2-update"
set ZEPHYR_TOOLCHAIN_VARIANT=gnuarmemb
cd zephyr
west build -b mps2_an521 -t run samples\subsys\shell\shell_module -- -DQEMU="%ProgramFiles%\qemu\qemu-system-arm.exe"
```
**Expected behavior**
I expect to be able to enter shell commands and have them executed.
**Impact**
Prevents manual use of shell for testing/dev purposes
**Logs and console output**
```
C:\work\hydra\qemu\zephyr>west build -b mps2_an521 -t run samples\subsys\shell\shell_module -- -DQEMU="%ProgramFiles%\qemu\qemu-system-arm.exe"
-- west build: generating a build system
Including boilerplate (Zephyr base (cached)): C:/work/hydra/qemu/zephyr/cmake/app/boilerplate.cmake
-- Application: C:/work/hydra/qemu/zephyr/samples/subsys/shell/shell_module
-- Zephyr version: 2.4.99 (C:/work/hydra/qemu/zephyr)
-- Found west (found suitable version "0.8.0", minimum required is "0.7.1")
-- Board: mps2_an521
-- Cache files will be written to: C:\Users\walter\AppData\Local/.cache/zephyr
-- Found dtc: C:/ProgramData/chocolatey/bin/dtc.exe (found suitable version "1.4.7", minimum required is "1.4.6")
-- Found toolchain: gnuarmemb (C:/Program Files (x86)/GNU Arm Embedded Toolchain/9 2020-q2-update)
-- Found BOARD.dts: C:/work/hydra/qemu/zephyr/boards/arm/mps2_an521/mps2_an521.dts
-- Generated zephyr.dts: C:/work/hydra/qemu/zephyr/build/zephyr/zephyr.dts
-- Generated devicetree_unfixed.h: C:/work/hydra/qemu/zephyr/build/zephyr/include/generated/devicetree_unfixed.h
Parsing C:/work/hydra/qemu/zephyr/samples/subsys/shell/shell_module/Kconfig
Loaded configuration 'C:/work/hydra/qemu/zephyr/build/zephyr/.config'
No change to configuration in 'C:/work/hydra/qemu/zephyr/build/zephyr/.config'
No change to Kconfig header in 'C:/work/hydra/qemu/zephyr/build/zephyr/include/generated/autoconf.h'
-- Configuring done
-- Generating done
-- Build files have been written to: C:/work/hydra/qemu/zephyr/build
-- west build: running target run
[121/126] Linking C executable zephyr\zephyr_prebuilt.elf
Memory region Used Size Region Size %age Used
FLASH: 54520 B 224 MB 0.02%
SRAM: 11536 B 16 MB 0.07%
IDT_LIST: 200 B 2 KB 9.77%
[125/126] To exit from QEMU enter: 'CTRL+a, x'[QEMU] CPU: cortex-m33
C:\Program Files\qemu\qemu-system-arm.exe: warning: nic lan9118.0 has no peer
uart:~$
```
**Environment (please complete the following information):**
Windows 10
QEMU version 5.1.92 (v5.2.0-rc2-11843-gf571c4ffb5-dirty)
**Additional context**
What's the correct way of specifying the Qemu path? Setting the `QEMU_BIN_PATH` environment variable doesn't seem to work.
We are using the board `mps2_an521` because that seems to be the only emulated board with a Cortex-M33 processor. We would really like to emulate an nRF5340.
|
non_process
|
no input to shell from windows qemu host describe the bug trying to run a sample with shell functionality and have it work in an emulated arm machine on a windows host the sample runs and prints the shell prompt but no input from keyboard is possible to reproduce run in windows command prompt cmd west init west update set gnuarmemb toolchain path programfiles gnu arm embedded toolchain update set zephyr toolchain variant gnuarmemb cd zephyr west build b t run samples subsys shell shell module dqemu programfiles qemu qemu system arm exe expected behavior i expect to be able to enter shell commands and have them executed impact prevents manual use of shell for testing dev purposes logs and console output c work hydra qemu zephyr west build b t run samples subsys shell shell module dqemu programfiles qemu qemu system arm exe west build generating a build system including boilerplate zephyr base cached c work hydra qemu zephyr cmake app boilerplate cmake application c work hydra qemu zephyr samples subsys shell shell module zephyr version c work hydra qemu zephyr found west found suitable version minimum required is board cache files will be written to c users walter appdata local cache zephyr found dtc c programdata chocolatey bin dtc exe found suitable version minimum required is found toolchain gnuarmemb c program files gnu arm embedded toolchain update found board dts c work hydra qemu zephyr boards arm dts generated zephyr dts c work hydra qemu zephyr build zephyr zephyr dts generated devicetree unfixed h c work hydra qemu zephyr build zephyr include generated devicetree unfixed h parsing c work hydra qemu zephyr samples subsys shell shell module kconfig loaded configuration c work hydra qemu zephyr build zephyr config no change to configuration in c work hydra qemu zephyr build zephyr config no change to kconfig header in c work hydra qemu zephyr build zephyr include generated autoconf h configuring done generating done build files have been written to c work hydra qemu zephyr build west build running target run linking c executable zephyr zephyr prebuilt elf memory region used size region size age used flash b mb sram b mb idt list b kb to exit from qemu enter ctrl a x cpu cortex c program files qemu qemu system arm exe warning nic has no peer uart environment please complete the following information windows qemu version dirty additional context what s the correct way of specifying the qemu path setting the qemu bin path environment variable doesn t seem to work we are using the board because that seems to be the only emulated board with a cortex processor we would really like to emulate an
| 0
|
Subsets and Splits
No community queries yet
The top public SQL queries from the community will appear here once available.