| Unnamed: 0 (int64, 0 to 832k) | id (float64, 2.49B to 32.1B) | type (string, 1 value) | created_at (string, length 19) | repo (string, length 7 to 112) | repo_url (string, length 36 to 141) | action (string, 3 values) | title (string, length 1 to 744) | labels (string, length 4 to 574) | body (string, length 9 to 211k) | index (string, 10 values) | text_combine (string, length 96 to 211k) | label (string, 2 values) | text (string, length 96 to 188k) | binary_label (int64, 0 to 1) |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
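The rows below are individual samples from this table. As a minimal sketch of how such a preview might be inspected (assuming it has been exported to a CSV file named `issues.csv`, which is not part of the original material), one could load it with pandas and check how `label` relates to `binary_label`:

```python
import pandas as pd

# Hypothetical export of the preview shown above; the file name is an assumption.
df = pd.read_csv("issues.csv")

# `label` ("process" / "non_process") appears to mirror `binary_label` (1 / 0).
print(df[["label", "binary_label"]].value_counts())

# `text` is the cleaned, model-ready field; `title` and `body` hold the raw issue content.
print(df[["title", "label", "binary_label"]].head())
```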
| 445,202 | 31,169,599,026 | IssuesEvent | 2023-08-16 23:20:37 | grafana/plugin-tools | https://api.github.com/repos/grafana/plugin-tools | opened | Doc: Consider location of advanced topics | documentation enhancement |
### Which areas does this feature request relate to
- [ ] Create Plugin
- [ ] Sign Plugin
- [X] Documentation
### Problem
Do the "nested plugins" and "extend configurations" topics makes sense in Develop given that they relate to config rather than capability of a given plugin?
### Solution
Consider alternatives.
### Alternatives
_No response_
### Additional context
_No response_
### Are you interested in contributing the solution?
- [ ] Yes
- [ ] No
| 1.0 |
Doc: Consider location of advanced topics - ### Which areas does this feature request relate to
- [ ] Create Plugin
- [ ] Sign Plugin
- [X] Documentation
### Problem
Do the "nested plugins" and "extend configurations" topics makes sense in Develop given that they relate to config rather than capability of a given plugin?
### Solution
Consider alternatives.
### Alternatives
_No response_
### Additional context
_No response_
### Are you interested in contributing the solution?
- [ ] Yes
- [ ] No
| non_process |
doc consider location of advanced topics which areas does this feature request relate to create plugin sign plugin documentation problem do the nested plugins and extend configurations topics makes sense in develop given that they relate to config rather than capability of a given plugin solution consider alternatives alternatives no response additional context no response are you interested in contributing the solution yes no
| 0 |
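The sample above also shows how the derived columns relate: `text_combine` is the title and body joined with " - ", and `text` looks like a lowercased copy with URLs, digits, punctuation, and markdown stripped. A minimal sketch of that cleaning step, reconstructed from the visible columns rather than taken from the dataset's actual preprocessing code:

```python
import re

def clean_issue_text(title: str, body: str) -> str:
    # Matches the observed `text_combine` pattern: "<title> - <body>".
    combined = f"{title} - {body}".lower()
    combined = re.sub(r"http\S+", " ", combined)   # drop URLs
    combined = re.sub(r"[^a-z\s]", " ", combined)  # drop digits, punctuation, markdown markers
    return re.sub(r"\s+", " ", combined).strip()   # collapse whitespace

print(clean_issue_text(
    "Doc: Consider location of advanced topics",
    "### Which areas does this feature request relate to ...",
))
```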
| 204,664 | 23,270,003,553 | IssuesEvent | 2022-08-04 21:39:46 | sourcegraph/sourcegraph | https://api.github.com/repos/sourcegraph/sourcegraph | closed | Security: 3rd party dependency management | team/security vuln-scanning |
This is a placeholder for the upcoming work of managing our 3rd party dependencies. At least ensuring we are not running vulnerable versions. It's likely that we will need multiple tools to cover different technologies. For example, GitHub's dependabot is a good solution for our JS dependencies but it doesn't support Golang.
| True |
Security: 3rd party dependency management - This is a placeholder for the upcoming work of managing our 3rd party dependencies. At least ensuring we are not running vulnerable versions. It's likely that we will need multiple tools to cover different technologies. For example, GitHub's dependabot is a good solution for our JS dependencies but it doesn't support Golang.
| non_process |
security party dependency management this is a placeholder for the upcoming work of managing our party dependencies at least ensuring we are not running vulnerable versions it s likely that we will need multiple tools to cover different technologies for example github s dependabot is a good solution for our js dependencies but it doesn t support golang
| 0 |
| 77,384 | 15,529,022,103 | IssuesEvent | 2021-03-13 13:32:57 | jonathan-wiens/hwr-gatsby-kurs-19a-1 | https://api.github.com/repos/jonathan-wiens/hwr-gatsby-kurs-19a-1 | opened | CVE-2020-24025 (Medium) detected in node-sass-4.14.1.tgz | security vulnerability |
## CVE-2020-24025 - Medium Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>node-sass-4.14.1.tgz</b></p></summary>
<p>Wrapper around libsass</p>
<p>Library home page: <a href="https://registry.npmjs.org/node-sass/-/node-sass-4.14.1.tgz">https://registry.npmjs.org/node-sass/-/node-sass-4.14.1.tgz</a></p>
<p>Path to dependency file: hwr-gatsby-kurs-19a-1/package.json</p>
<p>Path to vulnerable library: hwr-gatsby-kurs-19a-1/node_modules/node-sass/package.json</p>
<p>
Dependency Hierarchy:
- :x: **node-sass-4.14.1.tgz** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/jonathan-wiens/hwr-gatsby-kurs-19a-1/commit/e81f7fed83f83e9be272a2d4c864980522232df7">e81f7fed83f83e9be272a2d4c864980522232df7</a></p>
<p>Found in base branch: <b>main</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
Certificate validation in node-sass 2.0.0 to 4.14.1 is disabled when requesting binaries even if the user is not specifying an alternative download path.
<p>Publish Date: 2021-01-11
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-24025>CVE-2020-24025</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>5.3</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: Low
- Availability Impact: None
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
| True |
CVE-2020-24025 (Medium) detected in node-sass-4.14.1.tgz - ## CVE-2020-24025 - Medium Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>node-sass-4.14.1.tgz</b></p></summary>
<p>Wrapper around libsass</p>
<p>Library home page: <a href="https://registry.npmjs.org/node-sass/-/node-sass-4.14.1.tgz">https://registry.npmjs.org/node-sass/-/node-sass-4.14.1.tgz</a></p>
<p>Path to dependency file: hwr-gatsby-kurs-19a-1/package.json</p>
<p>Path to vulnerable library: hwr-gatsby-kurs-19a-1/node_modules/node-sass/package.json</p>
<p>
Dependency Hierarchy:
- :x: **node-sass-4.14.1.tgz** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/jonathan-wiens/hwr-gatsby-kurs-19a-1/commit/e81f7fed83f83e9be272a2d4c864980522232df7">e81f7fed83f83e9be272a2d4c864980522232df7</a></p>
<p>Found in base branch: <b>main</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
Certificate validation in node-sass 2.0.0 to 4.14.1 is disabled when requesting binaries even if the user is not specifying an alternative download path.
<p>Publish Date: 2021-01-11
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-24025>CVE-2020-24025</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>5.3</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: Low
- Availability Impact: None
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
| non_process |
cve medium detected in node sass tgz cve medium severity vulnerability vulnerable library node sass tgz wrapper around libsass library home page a href path to dependency file hwr gatsby kurs package json path to vulnerable library hwr gatsby kurs node modules node sass package json dependency hierarchy x node sass tgz vulnerable library found in head commit a href found in base branch main vulnerability details certificate validation in node sass to is disabled when requesting binaries even if the user is not specifying an alternative download path publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction none scope unchanged impact metrics confidentiality impact none integrity impact low availability impact none for more information on scores click a href step up your open source security game with whitesource
| 0 |
| 422,267 | 12,269,136,688 | IssuesEvent | 2020-05-07 13:38:23 | luna/enso | https://api.github.com/repos/luna/enso | closed | Research into Reflection-Based Java Interop | Category: Backend Category: RTS Change: Non-Breaking Difficulty: Core Contributor Priority: High Type: Enhancement |
### Summary
In order to best understand how to provide our users (and ourselves) with a high-level Java FFI, some research needs to be done. This research needs to understand both the user impacts of various approaches, and the implementation and maintenance burden they would pose.
### Value
We will have some idea of how to move forward with providing a Java FFI.
### Specification
- [ ] Research how to do dynamic reflection within Graal, including reading the docs.
- [ ] Provide some code experiments with the reflection API.
- [ ] Spend some time reading GraalJS, and maybe talking to the Graal team.
- [ ] Determine if it would be possible (and how) to pass Enso functions to Java code (callbacks, and so on).
- [ ] Record the research findings in the design docs.
### Acceptance Criteria & Test Cases
- The above open questions have been answered.
- All of the findings have been recorded in the design documentation.
| 1.0 |
Research into Reflection-Based Java Interop - ### Summary
In order to best understand how to provide our users (and ourselves) with a high-level Java FFI, some research needs to be done. This research needs to understand both the user impacts of various approaches, and the implementation and maintenance burden they would pose.
### Value
We will have some idea of how to move forward with providing a Java FFI.
### Specification
- [ ] Research how to do dynamic reflection within Graal, including reading the docs.
- [ ] Provide some code experiments with the reflection API.
- [ ] Spend some time reading GraalJS, and maybe talking to the Graal team.
- [ ] Determine if it would be possible (and how) to pass Enso functions to Java code (callbacks, and so on).
- [ ] Record the research findings in the design docs.
### Acceptance Criteria & Test Cases
- The above open questions have been answered.
- All of the findings have been recorded in the design documentation.
| non_process |
research into reflection based java interop summary in order to best understand how to provide our users and ourselves with a high level java ffi some research needs to be done this research needs to understand both the user impacts of various approaches and the implementation and maintenance burden they would pose value we will have some idea of how to move forward with providing a java ffi specification research how to do dynamic reflection within graal including reading the docs provide some code experiments with the reflection api spend some time reading graaljs and maybe talking to the graal team determine if it would be possible and how to pass enso functions to java code callbacks and so on record the research findings in the design docs acceptance criteria test cases the above open questions have been answered all of the findings have been recorded in the design documentation
| 0 |
| 6,236 | 9,181,448,332 | IssuesEvent | 2019-03-05 10:15:09 | decidim/decidim | https://api.github.com/repos/decidim/decidim | closed | Order processes on the processes page | space: processes type: bug |
**Describe the bug**
Process cards do not follow any deductible criteria when it comes to showing on the processes page. However, if you are logged in, they are displayed by up date creation.
I think they should always be displayed according to the same criterion, creation date, for example.
https://meta.decidim.org/processes/bug-report/f/210/proposals/12883
**Extra data (please complete the following information):**
- Device: [e.g. iPhone6, Desktop]
- Device OS: [e.g. iOS8.1, Windows 10]
- Browser: [e.g. Chrome, Firefox, Safari]
- Decidim Version: [e.g. 0.10] 0.12.x
- Decidim installation: [e.g. MetaDecidim]
**Additional context**
Add any other context about the problem here.
| 1.0 |
Order processes on the processes page - **Describe the bug**
Process cards do not follow any deductible criteria when it comes to showing on the processes page. However, if you are logged in, they are displayed by up date creation.
I think they should always be displayed according to the same criterion, creation date, for example.
https://meta.decidim.org/processes/bug-report/f/210/proposals/12883
**Extra data (please complete the following information):**
- Device: [e.g. iPhone6, Desktop]
- Device OS: [e.g. iOS8.1, Windows 10]
- Browser: [e.g. Chrome, Firefox, Safari]
- Decidim Version: [e.g. 0.10] 0.12.x
- Decidim installation: [e.g. MetaDecidim]
**Additional context**
Add any other context about the problem here.
| process |
order processes on the processes page describe the bug process cards do not follow any deductible criteria when it comes to showing on the processes page however if you are logged in they are displayed by up date creation i think they should always be displayed according to the same criterion creation date for example extra data please complete the following information device device os browser decidim version x decidim installation additional context add any other context about the problem here
| 1 |
| 22,148 | 30,687,979,129 | IssuesEvent | 2023-07-26 13:32:21 | MicrosoftDocs/azure-docs | https://api.github.com/repos/MicrosoftDocs/azure-docs | closed | Limitation on runbook powershell capabilities. | automation/svc triaged cxp doc-enhancement process-automation/subsvc Pri1 |
https://learn.microsoft.com/en-us/powershell/module/microsoft.powershell.core/about/about_functions_advanced_methods?view=powershell-7.3
You can't use begin, process, end to control your script follow.
[Enter feedback here]
---
#### Document Details
⚠ *Do not edit this section. It is required for learn.microsoft.com ➟ GitHub issue linking.*
* ID: 8081200f-2bf4-db58-c957-c8ab7af5f90b
* Version Independent ID: b135cf1a-c391-03e5-41e7-e13571351e91
* Content: [Azure Automation runbook types](https://learn.microsoft.com/en-us/azure/automation/automation-runbook-types?tabs=lps51%2Cpy27&source=docs)
* Content Source: [articles/automation/automation-runbook-types.md](https://github.com/MicrosoftDocs/azure-docs/blob/main/articles/automation/automation-runbook-types.md)
* Service: **automation**
* Sub-service: **process-automation**
* GitHub Login: @SnehaSudhirG
* Microsoft Alias: **sudhirsneha**
| 1.0 |
Limitation on runbook powershell capabilities. - https://learn.microsoft.com/en-us/powershell/module/microsoft.powershell.core/about/about_functions_advanced_methods?view=powershell-7.3
You can't use begin, process, end to control your script follow.
[Enter feedback here]
---
#### Document Details
⚠ *Do not edit this section. It is required for learn.microsoft.com ➟ GitHub issue linking.*
* ID: 8081200f-2bf4-db58-c957-c8ab7af5f90b
* Version Independent ID: b135cf1a-c391-03e5-41e7-e13571351e91
* Content: [Azure Automation runbook types](https://learn.microsoft.com/en-us/azure/automation/automation-runbook-types?tabs=lps51%2Cpy27&source=docs)
* Content Source: [articles/automation/automation-runbook-types.md](https://github.com/MicrosoftDocs/azure-docs/blob/main/articles/automation/automation-runbook-types.md)
* Service: **automation**
* Sub-service: **process-automation**
* GitHub Login: @SnehaSudhirG
* Microsoft Alias: **sudhirsneha**
| process |
limitation on runbook powershell capabilities you can t use begin process end to control your script follow document details ⚠ do not edit this section it is required for learn microsoft com ➟ github issue linking id version independent id content content source service automation sub service process automation github login snehasudhirg microsoft alias sudhirsneha
| 1 |
| 22,325 | 4,788,904,545 | IssuesEvent | 2016-10-30 20:11:03 | swe574-group2/swe574 | https://api.github.com/repos/swe574-group2/swe574 | closed | Update meeting notes | documentation enhancement |
Add the latest meeting notes (23/10/2016 - Sunday)
Add the latest course notes (24/10/2016 - Monday)
| 1.0 |
Update meeting notes - Add the latest meeting notes (23/10/2016 - Sunday)
Add the latest course notes (24/10/2016 - Monday)
| non_process |
update meeting notes add the latest meeting notes sunday add the latest course notes monday
| 0 |
| 8,403 | 11,568,859,821 | IssuesEvent | 2020-02-20 16:33:29 | ESMValGroup/ESMValCore | https://api.github.com/repos/ESMValGroup/ESMValCore | reopened | Changing unit dimensionality in depth integration pre-processor | enhancement preprocessor |
Depth integration is a common methods used to evalaute ocean behaviour, notably in the "integrated primary production" marine biogeochemistry metric. Depth integration typically converts concentration in volume into concentration in area. This means that the units go from mol m-3 to mol m-2, and so the units need to be changed.
This function already exists in the [volume preprocessor](https://github.com/ESMValGroup/ESMValTool/blob/version2_development/esmvaltool/preprocessor/_volume_pp.py), but does not currently change the units. Basically, I want to include this command in the preprocessor:
```
result.units = Unit('m') * result.units
```
but this does not work for me.
@bouweandela suggested in issue #604: _"This would require a minor modification to the code so the unit is read from the cube instead of from the cmor table of the input data when extracting the metadata, but should be possible. Please make an issue if you would like this functionality."_
So here is the issue. Fingers crossed it is actually easy to resolve!
| 1.0 |
Changing unit dimensionality in depth integration pre-processor - Depth integration is a common methods used to evalaute ocean behaviour, notably in the "integrated primary production" marine biogeochemistry metric. Depth integration typically converts concentration in volume into concentration in area. This means that the units go from mol m-3 to mol m-2, and so the units need to be changed.
This function already exists in the [volume preprocessor](https://github.com/ESMValGroup/ESMValTool/blob/version2_development/esmvaltool/preprocessor/_volume_pp.py), but does not currently change the units. Basically, I want to include this command in the preprocessor:
```
result.units = Unit('m') * result.units
```
but this does not work for me.
@bouweandela suggested in issue #604: _"This would require a minor modification to the code so the unit is read from the cube instead of from the cmor table of the input data when extracting the metadata, but should be possible. Please make an issue if you would like this functionality."_
So here is the issue. Fingers crossed it is actually easy to resolve!
| process |
changing unit dimensionality in depth integration pre processor depth integration is a common methods used to evalaute ocean behaviour notably in the integrated primary production marine biogeochemistry metric depth integration typically converts concentration in volume into concentration in area this means that the units go from mol m to mol m and so the units need to be changed this function already exists in the but does not currently change the units basically i want to include this command in the preprocessor result units unit m result units but this does not work for me bouweandela suggested in issue this would require a minor modification to the code so the unit is read from the cube instead of from the cmor table of the input data when extracting the metadata but should be possible please make an issue if you would like this functionality so here is the issue fingers crossed it is actually easy to resolve
| 1 |
| 4,870 | 7,752,751,045 | IssuesEvent | 2018-05-30 21:19:39 | GoogleCloudPlatform/google-cloud-python | https://api.github.com/repos/GoogleCloudPlatform/google-cloud-python | closed | BigTable: row_data _validateChunk performance | api: bigtable performance type: process |
Looking at the performance graph for reading a large number (1000) of rows, each row having 10 cells in one column family, the validation of chunks takes a large percentage of time (>25%), as measured by cProfile. Areas for tuning are identified below.
**Validation of timestamp_micros ordering**
While a row is being read, a validation is made to make sure that cells for one qualifier are in timestamp_micros order (most recent first):
```
def _validate_chunk_row_in_progress(self, chunk):
...
previous = self._previous_cell
_raise_if(self._same_as_previous(chunk) and
chunk.timestamp_micros <= previous.timestamp_micros)
def _same_as_previous(self, chunk):
previous = self._previous_cell
return (chunk.row_key == previous.row_key and
chunk.family_name == previous.family_name and
chunk.qualifier == previous.qualifier and
chunk.labels == previous.labels)
```
This validation may be extraneous because:
1. A RowFilter may have been applied - CellsColumnLimitFilter(1) - which will limit the result to the cell with the most recent timestamp.
2. There may only be one cell value for the qualifier.
3. The BigTable server may ALWAYS be providing the cells in timestamp order.
Two possible tunings:
1. Remove the validation if the condition is unlikely to occur in the ReadRowResponse.
2. If the validation is kept, it could be performed only when qualifier cell lists with length > 1 are detected AFTER the chunks for the qualifier are consumed.
| 1.0 |
BigTable: row_data _validateChunk performance - Looking at the performance graph for reading a large number (1000) of rows, each row having 10 cells in one column family, the validation of chunks takes a large percentage of time (>25%), as measured by cProfile. Areas for tuning are identified below.
**Validation of timestamp_micros ordering**
While a row is being read, a validation is made to make sure that cells for one qualifier are in timestamp_micros order (most recent first):
```
def _validate_chunk_row_in_progress(self, chunk):
...
previous = self._previous_cell
_raise_if(self._same_as_previous(chunk) and
chunk.timestamp_micros <= previous.timestamp_micros)
def _same_as_previous(self, chunk):
previous = self._previous_cell
return (chunk.row_key == previous.row_key and
chunk.family_name == previous.family_name and
chunk.qualifier == previous.qualifier and
chunk.labels == previous.labels)
```
This validation may be extraneous because:
1. A RowFilter may have been applied - CellsColumnLimitFilter(1) - which will limit the result to the cell with the most recent timestamp.
2. There may only be one cell value for the qualifier.
3. The BigTable server may ALWAYS be providing the cells in timestamp order.
Two possible tunings:
1. Remove the validation if the condition is unlikely to occur in the ReadRowResponse.
2. If the validation is kept, it could be performed only when qualifier cell lists with length > 1 are detected AFTER the chunks for the qualifier are consumed.
| process |
bigtable row data validatechunk performance looking at the performance graph for reading a large number of rows each row having cells in one column family the validation of chunks takes a large percentage of time as measured by cprofile areas for tuning are identified below validation of timestamp micros ordering while a row is being read a validation is made to make sure that cells for one qualifier are in timestamp micros order most recent first def validate chunk row in progress self chunk previous self previous cell raise if self same as previous chunk and chunk timestamp micros previous timestamp micros def same as previous self chunk previous self previous cell return chunk row key previous row key and chunk family name previous family name and chunk qualifier previous qualifier and chunk labels previous labels this validation may be extraneous because a rowfilter may have been applied cellscolumnlimitfilter which will limit the result to the cell with the most recent timestamp there may only be one cell value for the qualifier the bigtable server may always be providing the cells in timestamp order two possible tunings remove the validation if the condition is unlikely to occur in the readrowresponse if the validation is kept it could be performed only when qualifier cell lists with length are detected after the chunks for the qualifier are consumed
| 1 |
| 14,798 | 18,074,581,026 | IssuesEvent | 2021-09-21 08:26:25 | googleapis/google-cloud-dotnet | https://api.github.com/repos/googleapis/google-cloud-dotnet | closed | Make multi-API release groups more concrete | type: process |
Both batch release and other release tools could be more powerful if we made the release groups for multiple APIs (e.g. Diagnostics, Spanner, OSLogin) clearer in the API catalog. This shouldn't be particularly hard to do.
| 1.0 |
Make multi-API release groups more concrete - Both batch release and other release tools could be more powerful if we made the release groups for multiple APIs (e.g. Diagnostics, Spanner, OSLogin) clearer in the API catalog. This shouldn't be particularly hard to do.
| process |
make multi api release groups more concrete both batch release and other release tools could be more powerful if we made the release groups for multiple apis e g diagnostics spanner oslogin clearer in the api catalog this shouldn t be particularly hard to do
| 1 |
| 1,866 | 4,697,347,556 | IssuesEvent | 2016-10-12 09:03:17 | metabase/metabase | https://api.github.com/repos/metabase/metabase | opened | If multiple parameters target a single SQL field filter variable only one is applied | Bug Correctness Query Processor |
If you have two filter widgets on a dashboard and both of them are wired to the same SQL field filter variable and both are active, only one of them is actually applied to the query:
In this screenshot both widgets are wired to both cards, which are equivalent (GUI "Count of rows" of "Orders" and SQL `SELECT count(*) AS "count" FROM "PUBLIC"."ORDERS" WHERE {{created}}`) and should be returning the same results:
<img width="883" alt="screenshot 2016-10-12 01 58 13" src="https://cloud.githubusercontent.com/assets/18193/19303962/98cfa92e-901f-11e6-9ebb-ff259908246a.png">
Related to #3538
| 1.0 |
If multiple parameters target a single SQL field filter variable only one is applied - If you have two filter widgets on a dashboard and both of them are wired to the same SQL field filter variable and both are active, only one of them is actually applied to the query:
In this screenshot both widgets are wired to both cards, which are equivalent (GUI "Count of rows" of "Orders" and SQL `SELECT count(*) AS "count" FROM "PUBLIC"."ORDERS" WHERE {{created}}`) and should be returning the same results:
<img width="883" alt="screenshot 2016-10-12 01 58 13" src="https://cloud.githubusercontent.com/assets/18193/19303962/98cfa92e-901f-11e6-9ebb-ff259908246a.png">
Related to #3538
| process |
if multiple parameters target a single sql field filter variable only one is applied if you have two filter widgets on a dashboard and both of them are wired to the same sql field filter variable and both are active only one of them is actually applied to the query in this screenshot both widgets are wired to both cards which are equivalent gui count of rows of orders and sql select count as count from public orders where created and should be returning the same results img width alt screenshot src related to
| 1 |
| 86,657 | 17,033,882,719 | IssuesEvent | 2021-07-05 02:32:11 | nothings/stb | https://api.github.com/repos/nothings/stb | closed | stb_vorbis.c: Dead assignment, initialization, Assigned value is garbage or undefined | 1 stb_vorbis code quality |
Hi , scan-build reported on lines
Dead store | Dead initialization | stb_vorbis.c | start_decoder | 3429 |
-- | -- | -- | -- | -- |
Dead store | Dead assignment | stb_vorbis.c | stb_vorbis_stream_length_in_samples | 4497 |
Dead store | Dead assignment | stb_vorbis.c | stb_vorbis_get_samples_short | 4830 |
Dead store | Dead assignment | stb_vorbis.c | stb_vorbis_get_samples_short_interleaved | 4810 |
Logic error | Assigned value is garbage or undefined | stb_vorbis.c | get_seek_page_info | 4171 |
Logic error | Result of operation is garbage or undefined | stb_vorbis.c | get_seek_page_info | 4164 |
| 1.0 |
stb_vorbis.c: Dead assignment, initialization, Assigned value is garbage or undefined - Hi , scan-build reported on lines
Dead store | Dead initialization | stb_vorbis.c | start_decoder | 3429 |
-- | -- | -- | -- | -- |
Dead store | Dead assignment | stb_vorbis.c | stb_vorbis_stream_length_in_samples | 4497 |
Dead store | Dead assignment | stb_vorbis.c | stb_vorbis_get_samples_short | 4830 |
Dead store | Dead assignment | stb_vorbis.c | stb_vorbis_get_samples_short_interleaved | 4810 |
Logic error | Assigned value is garbage or undefined | stb_vorbis.c | get_seek_page_info | 4171 |
Logic error | Result of operation is garbage or undefined | stb_vorbis.c | get_seek_page_info | 4164 |
| non_process |
stb vorbis c dead assignment initialization assigned value is garbage or undefined hi scan build reported on lines dead store dead initialization stb vorbis c start decoder dead store dead assignment stb vorbis c stb vorbis stream length in samples dead store dead assignment stb vorbis c stb vorbis get samples short dead store dead assignment stb vorbis c stb vorbis get samples short interleaved logic error assigned value is garbage or undefined stb vorbis c get seek page info logic error result of operation is garbage or undefined stb vorbis c get seek page info
| 0 |
| 643 | 3,103,770,304 | IssuesEvent | 2015-08-31 12:22:56 | processing/processing | https://api.github.com/repos/processing/processing | closed | Message in console when the code is commented | preprocessor |
Processing 3.0b5 send message of error when the line is under comment like that
`````
/*
import japplemenubar.*;
*/
`````
but not when the comment is like that
``````
// import japplemenubar.*;
```````
I don't know if it's normal or not ?
| 1.0 |
Message in console when the code is commented - Processing 3.0b5 send message of error when the line is under comment like that
`````
/*
import japplemenubar.*;
*/
`````
but not when the comment is like that
``````
// import japplemenubar.*;
```````
I don't know if it's normal or not ?
| process |
message in console when the code is commented processing send message of error when the line is under comment like that import japplemenubar but not when the comment is like that import japplemenubar i don t know if it s normal or not
| 1 |
| 154,049 | 12,190,573,313 | IssuesEvent | 2020-04-29 09:33:18 | pytorch/pytorch | https://api.github.com/repos/pytorch/pytorch | opened | DISABLED test_default_quantized_lstm (quantization.test_quantize.TestPostTrainingDynamic) | topic: flaky-tests |
This test is failing intermittently on multiple builds with increasing frequency recently and no clear cause.
```
Apr 29 09:14:17 ======================================================================
Apr 29 09:14:17 ERROR [0.030s]: test_default_quantized_lstm (quantization.test_quantize.TestPostTrainingDynamic)
Apr 29 09:14:17 ----------------------------------------------------------------------
Apr 29 09:14:17 Traceback (most recent call last):
Apr 29 09:14:17 File "/var/lib/jenkins/workspace/test/quantization/test_quantize.py", line 862, in test_default_quantized_lstm
Apr 29 09:14:17 y, (h, c) = cell_dq(x, (h, c))
Apr 29 09:14:17 File "/opt/conda/lib/python3.6/site-packages/torch/nn/modules/module.py", line 562, in __call__
Apr 29 09:14:17 result = self.forward(*input, **kwargs)
Apr 29 09:14:17 File "/opt/conda/lib/python3.6/site-packages/torch/nn/quantized/dynamic/modules/rnn.py", line 351, in forward
Apr 29 09:14:17 return self.forward_tensor(input, hx)
Apr 29 09:14:17 File "/opt/conda/lib/python3.6/site-packages/torch/nn/quantized/dynamic/modules/rnn.py", line 312, in forward_tensor
Apr 29 09:14:17 input, hx, batch_sizes, max_batch_size, sorted_indices)
Apr 29 09:14:17 File "/opt/conda/lib/python3.6/site-packages/torch/nn/quantized/dynamic/modules/rnn.py", line 293, in forward_impl
Apr 29 09:14:17 self.batch_first, dtype=self.dtype, use_dynamic=True)
Apr 29 09:14:17 RuntimeError: In ChooseQuantizationParams, min should be less than or equal to max
```
| 1.0 |
DISABLED test_default_quantized_lstm (quantization.test_quantize.TestPostTrainingDynamic) - This test is failing intermittently on multiple builds with increasing frequency recently and no clear cause.
```
Apr 29 09:14:17 ======================================================================
Apr 29 09:14:17 ERROR [0.030s]: test_default_quantized_lstm (quantization.test_quantize.TestPostTrainingDynamic)
Apr 29 09:14:17 ----------------------------------------------------------------------
Apr 29 09:14:17 Traceback (most recent call last):
Apr 29 09:14:17 File "/var/lib/jenkins/workspace/test/quantization/test_quantize.py", line 862, in test_default_quantized_lstm
Apr 29 09:14:17 y, (h, c) = cell_dq(x, (h, c))
Apr 29 09:14:17 File "/opt/conda/lib/python3.6/site-packages/torch/nn/modules/module.py", line 562, in __call__
Apr 29 09:14:17 result = self.forward(*input, **kwargs)
Apr 29 09:14:17 File "/opt/conda/lib/python3.6/site-packages/torch/nn/quantized/dynamic/modules/rnn.py", line 351, in forward
Apr 29 09:14:17 return self.forward_tensor(input, hx)
Apr 29 09:14:17 File "/opt/conda/lib/python3.6/site-packages/torch/nn/quantized/dynamic/modules/rnn.py", line 312, in forward_tensor
Apr 29 09:14:17 input, hx, batch_sizes, max_batch_size, sorted_indices)
Apr 29 09:14:17 File "/opt/conda/lib/python3.6/site-packages/torch/nn/quantized/dynamic/modules/rnn.py", line 293, in forward_impl
Apr 29 09:14:17 self.batch_first, dtype=self.dtype, use_dynamic=True)
Apr 29 09:14:17 RuntimeError: In ChooseQuantizationParams, min should be less than or equal to max
```
| non_process |
disabled test default quantized lstm quantization test quantize testposttrainingdynamic this test is failing intermittently on multiple builds with increasing frequency recently and no clear cause apr apr error test default quantized lstm quantization test quantize testposttrainingdynamic apr apr traceback most recent call last apr file var lib jenkins workspace test quantization test quantize py line in test default quantized lstm apr y h c cell dq x h c apr file opt conda lib site packages torch nn modules module py line in call apr result self forward input kwargs apr file opt conda lib site packages torch nn quantized dynamic modules rnn py line in forward apr return self forward tensor input hx apr file opt conda lib site packages torch nn quantized dynamic modules rnn py line in forward tensor apr input hx batch sizes max batch size sorted indices apr file opt conda lib site packages torch nn quantized dynamic modules rnn py line in forward impl apr self batch first dtype self dtype use dynamic true apr runtimeerror in choosequantizationparams min should be less than or equal to max
| 0 |
| 5,287 | 8,073,210,831 | IssuesEvent | 2018-08-06 18:29:39 | GoogleCloudPlatform/google-cloud-python | https://api.github.com/repos/GoogleCloudPlatform/google-cloud-python | closed | Storage: 'test_notification_*' systests flake with 503 | api: storage flaky testing type: process |
See: https://circleci.com/gh/GoogleCloudPlatform/google-cloud-python/7563
```python
____________ TestStorageNotificationCRUD.test_notification_explicit ____________
self = <tests.system.TestStorageNotificationCRUD testMethod=test_notification_explicit>
def test_notification_explicit(self):
new_bucket_name = 'notification-explicit' + unique_resource_id('-')
bucket = retry_429(Config.CLIENT.create_bucket)(new_bucket_name)
self.case_buckets_to_delete.append(new_bucket_name)
notification = bucket.notification(
self.TOPIC_NAME,
custom_attributes=self.CUSTOM_ATTRIBUTES,
event_types=self.event_types(),
blob_name_prefix=self.BLOB_NAME_PREFIX,
payload_format=self.payload_format(),
)
> retry_429(notification.create)()
tests/system.py:962:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
../.nox/sys-2-7/lib/python2.7/site-packages/test_utils/retry.py:95: in wrapped_function
return to_wrap(*args, **kwargs)
google/cloud/storage/notification.py:247: in create
data=properties,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self = <google.cloud.storage._http.Connection object at 0x7f162b7db490>
method = 'POST'
path = '/b/notification-explicit-7563-1533310638/notificationConfigs'
query_params = {}
data = '{"topic": "//pubsub.googleapis.com/projects/precise-truck-742/topics/notification-7563-1533310586", "object_name_pref...V1", "custom_attributes": {"attr2": "value2", "attr1": "value1"}, "event_types": ["OBJECT_FINALIZE", "OBJECT_DELETE"]}'
content_type = 'application/json', headers = None, api_base_url = None
api_version = None, expect_json = True, _target_object = None
...
if not 200 <= response.status_code < 300:
> raise exceptions.from_http_response(response)
E ServiceUnavailable: 503 POST https://www.googleapis.com/storage/v1/b/notification-explicit-7563-1533310638/notificationConfigs: Backend Error
../.nox/sys-2-7/lib/python2.7/site-packages/google/cloud/_http.py:293: ServiceUnavailable
```
and:
```python
____________ TestStorageNotificationCRUD.test_notification_minimal _____________
self = <tests.system.TestStorageNotificationCRUD testMethod=test_notification_minimal>
def test_notification_minimal(self):
new_bucket_name = 'notification-minimal' + unique_resource_id('-')
bucket = retry_429(Config.CLIENT.create_bucket)(new_bucket_name)
self.case_buckets_to_delete.append(new_bucket_name)
self.assertEqual(list(bucket.list_notifications()), [])
notification = bucket.notification(self.TOPIC_NAME)
> retry_429(notification.create)()
tests/system.py:941:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
../.nox/sys-2-7/lib/python2.7/site-packages/test_utils/retry.py:95: in wrapped_function
return to_wrap(*args, **kwargs)
google/cloud/storage/notification.py:247: in create
data=properties,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self = <google.cloud.storage._http.Connection object at 0x7f162b7db490>
method = 'POST'
path = '/b/notification-minimal-7563-1533310644/notificationConfigs'
query_params = {}
data = '{"topic": "//pubsub.googleapis.com/projects/precise-truck-742/topics/notification-7563-1533310586", "payload_format": "NONE"}'
content_type = 'application/json', headers = None, api_base_url = None
api_version = None, expect_json = True, _target_object = None
...
if not 200 <= response.status_code < 300:
> raise exceptions.from_http_response(response)
E ServiceUnavailable: 503 POST https://www.googleapis.com/storage/v1/b/notification-minimal-7563-1533310644/notificationConfigs: Backend Error
```
| 1.0 |
Storage: 'test_notification_*' systests flake with 503 - See: https://circleci.com/gh/GoogleCloudPlatform/google-cloud-python/7563
```python
____________ TestStorageNotificationCRUD.test_notification_explicit ____________
self = <tests.system.TestStorageNotificationCRUD testMethod=test_notification_explicit>
def test_notification_explicit(self):
new_bucket_name = 'notification-explicit' + unique_resource_id('-')
bucket = retry_429(Config.CLIENT.create_bucket)(new_bucket_name)
self.case_buckets_to_delete.append(new_bucket_name)
notification = bucket.notification(
self.TOPIC_NAME,
custom_attributes=self.CUSTOM_ATTRIBUTES,
event_types=self.event_types(),
blob_name_prefix=self.BLOB_NAME_PREFIX,
payload_format=self.payload_format(),
)
> retry_429(notification.create)()
tests/system.py:962:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
../.nox/sys-2-7/lib/python2.7/site-packages/test_utils/retry.py:95: in wrapped_function
return to_wrap(*args, **kwargs)
google/cloud/storage/notification.py:247: in create
data=properties,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self = <google.cloud.storage._http.Connection object at 0x7f162b7db490>
method = 'POST'
path = '/b/notification-explicit-7563-1533310638/notificationConfigs'
query_params = {}
data = '{"topic": "//pubsub.googleapis.com/projects/precise-truck-742/topics/notification-7563-1533310586", "object_name_pref...V1", "custom_attributes": {"attr2": "value2", "attr1": "value1"}, "event_types": ["OBJECT_FINALIZE", "OBJECT_DELETE"]}'
content_type = 'application/json', headers = None, api_base_url = None
api_version = None, expect_json = True, _target_object = None
...
if not 200 <= response.status_code < 300:
> raise exceptions.from_http_response(response)
E ServiceUnavailable: 503 POST https://www.googleapis.com/storage/v1/b/notification-explicit-7563-1533310638/notificationConfigs: Backend Error
../.nox/sys-2-7/lib/python2.7/site-packages/google/cloud/_http.py:293: ServiceUnavailable
```
and:
```python
____________ TestStorageNotificationCRUD.test_notification_minimal _____________
self = <tests.system.TestStorageNotificationCRUD testMethod=test_notification_minimal>
def test_notification_minimal(self):
new_bucket_name = 'notification-minimal' + unique_resource_id('-')
bucket = retry_429(Config.CLIENT.create_bucket)(new_bucket_name)
self.case_buckets_to_delete.append(new_bucket_name)
self.assertEqual(list(bucket.list_notifications()), [])
notification = bucket.notification(self.TOPIC_NAME)
> retry_429(notification.create)()
tests/system.py:941:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
../.nox/sys-2-7/lib/python2.7/site-packages/test_utils/retry.py:95: in wrapped_function
return to_wrap(*args, **kwargs)
google/cloud/storage/notification.py:247: in create
data=properties,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self = <google.cloud.storage._http.Connection object at 0x7f162b7db490>
method = 'POST'
path = '/b/notification-minimal-7563-1533310644/notificationConfigs'
query_params = {}
data = '{"topic": "//pubsub.googleapis.com/projects/precise-truck-742/topics/notification-7563-1533310586", "payload_format": "NONE"}'
content_type = 'application/json', headers = None, api_base_url = None
api_version = None, expect_json = True, _target_object = None
...
if not 200 <= response.status_code < 300:
> raise exceptions.from_http_response(response)
E ServiceUnavailable: 503 POST https://www.googleapis.com/storage/v1/b/notification-minimal-7563-1533310644/notificationConfigs: Backend Error
```
| process |
storage test notification systests flake with see python teststoragenotificationcrud test notification explicit self def test notification explicit self new bucket name notification explicit unique resource id bucket retry config client create bucket new bucket name self case buckets to delete append new bucket name notification bucket notification self topic name custom attributes self custom attributes event types self event types blob name prefix self blob name prefix payload format self payload format retry notification create tests system py nox sys lib site packages test utils retry py in wrapped function return to wrap args kwargs google cloud storage notification py in create data properties self method post path b notification explicit notificationconfigs query params data topic pubsub googleapis com projects precise truck topics notification object name pref custom attributes event types content type application json headers none api base url none api version none expect json true target object none if not response status code raise exceptions from http response response e serviceunavailable post backend error nox sys lib site packages google cloud http py serviceunavailable and python teststoragenotificationcrud test notification minimal self def test notification minimal self new bucket name notification minimal unique resource id bucket retry config client create bucket new bucket name self case buckets to delete append new bucket name self assertequal list bucket list notifications notification bucket notification self topic name retry notification create tests system py nox sys lib site packages test utils retry py in wrapped function return to wrap args kwargs google cloud storage notification py in create data properties self method post path b notification minimal notificationconfigs query params data topic pubsub googleapis com projects precise truck topics notification payload format none content type application json headers none api base url none api version none expect json true target object none if not response status code raise exceptions from http response response e serviceunavailable post backend error
| 1 |
| 16,084 | 20,253,843,128 | IssuesEvent | 2022-02-14 20:46:15 | metabase/metabase | https://api.github.com/repos/metabase/metabase | closed | Oracle fails queries on joins with tables with long display names | Type:Bug Priority:P2 Querying/Processor Database/Oracle .Regression |
**Describe the bug**
On Oracle, when making a join to a table with a long display name (which combined with the FK field display name is >30 bytes), then the query fails since 0.38.0
**Possible workaround**: Change table and column names to something shorter in Admin > Data Model.
Also recommended to disable or underscore "Friendly Table and Field Names" in Admin > Settings > General.
**To Reproduce**
1. Admin > Data Model > (Oracle with Sample) > Products > change name to `Products-veeerylong-veeerylong-veeerylong`
2. Custom question > (Oracle with Sample) > Reviews - join the Products table

3. Oracle 12.2+ fails with `ORA-00904: "Products-veeerylong-veeerylong-veeerylong"."ID": invalid identifier` because it's incorrectly using display name instead of alias in the join-clause.
Oracle pre-12.2 fails with `ORA-00972: identifier is too long`, since it doesn't support table/column/alias with more than 30 bytes, which the fix in 0.38.0 likely tried to solve, but caused breaking long table joins on all versions of Oracle.
<details><summary>Full stacktrace</summary>
```
2021-05-09 14:50:54,124 ERROR middleware.catch-exceptions :: Error processing query: null
{:database_id 48,
:started_at #t "2021-05-09T14:50:53.045846+02:00[Europe/Copenhagen]",
:via
[{:status :failed,
:class java.sql.SQLSyntaxErrorException,
:error "ORA-00904: \"Products-veeerylong-veeerylong-veeerylong\".\"ID\": invalid identifier\n",
:stacktrace
["oracle.jdbc.driver.T4CTTIoer11.processError(T4CTTIoer11.java:509)"
"oracle.jdbc.driver.T4CTTIoer11.processError(T4CTTIoer11.java:461)"
"oracle.jdbc.driver.T4C8Oall.processError(T4C8Oall.java:1104)"
"oracle.jdbc.driver.T4CTTIfun.receive(T4CTTIfun.java:553)"
"oracle.jdbc.driver.T4CTTIfun.doRPC(T4CTTIfun.java:269)"
"oracle.jdbc.driver.T4C8Oall.doOALL(T4C8Oall.java:655)"
"oracle.jdbc.driver.T4CPreparedStatement.doOall8(T4CPreparedStatement.java:270)"
"oracle.jdbc.driver.T4CPreparedStatement.doOall8(T4CPreparedStatement.java:91)"
"oracle.jdbc.driver.T4CPreparedStatement.executeForDescribe(T4CPreparedStatement.java:807)"
"oracle.jdbc.driver.OracleStatement.executeMaybeDescribe(OracleStatement.java:983)"
"oracle.jdbc.driver.OracleStatement.doExecuteWithTimeout(OracleStatement.java:1168)"
"oracle.jdbc.driver.OraclePreparedStatement.executeInternal(OraclePreparedStatement.java:3666)"
"oracle.jdbc.driver.T4CPreparedStatement.executeInternal(T4CPreparedStatement.java:1426)"
"oracle.jdbc.driver.OraclePreparedStatement.executeQuery(OraclePreparedStatement.java:3713)"
"oracle.jdbc.driver.OraclePreparedStatementWrapper.executeQuery(OraclePreparedStatementWrapper.java:1167)"
"com.mchange.v2.c3p0.impl.NewProxyPreparedStatement.executeQuery(NewProxyPreparedStatement.java:431)"
"--> driver.sql_jdbc.execute$fn__77454.invokeStatic(execute.clj:264)"
"driver.sql_jdbc.execute$fn__77454.invoke(execute.clj:262)"
"driver.sql_jdbc.execute$execute_reducible_query.invokeStatic(execute.clj:389)"
"driver.sql_jdbc.execute$execute_reducible_query.invoke(execute.clj:374)"
"driver.sql_jdbc.execute$execute_reducible_query.invokeStatic(execute.clj:383)"
"driver.sql_jdbc.execute$execute_reducible_query.invoke(execute.clj:374)"
"driver.sql_jdbc$fn__78967.invokeStatic(sql_jdbc.clj:54)"
"driver.sql_jdbc$fn__78967.invoke(sql_jdbc.clj:52)"
"driver.oracle$eval840$fn__841.invoke(oracle.clj:294)"
"query_processor.context$executef.invokeStatic(context.clj:59)"
"query_processor.context$executef.invoke(context.clj:48)"
"query_processor.context.default$default_runf.invokeStatic(default.clj:68)"
"query_processor.context.default$default_runf.invoke(default.clj:66)"
"query_processor.context$runf.invokeStatic(context.clj:45)"
"query_processor.context$runf.invoke(context.clj:39)"
"query_processor.reducible$pivot.invokeStatic(reducible.clj:34)"
"query_processor.reducible$pivot.invoke(reducible.clj:31)"
"query_processor.middleware.mbql_to_native$mbql__GT_native$fn__46313.invoke(mbql_to_native.clj:25)"
"query_processor.middleware.check_features$check_features$fn__45589.invoke(check_features.clj:41)"
"query_processor.middleware.limit$limit$fn__46299.invoke(limit.clj:37)"
"query_processor.middleware.cache$maybe_return_cached_results$fn__45240.invoke(cache.clj:211)"
"query_processor.middleware.optimize_datetime_filters$optimize_datetime_filters$fn__46478.invoke(optimize_datetime_filters.clj:133)"
"query_processor.middleware.auto_parse_filter_values$auto_parse_filter_values$fn__44386.invoke(auto_parse_filter_values.clj:43)"
"query_processor.middleware.wrap_value_literals$wrap_value_literals$fn__40767.invoke(wrap_value_literals.clj:147)"
"query_processor.middleware.annotate$add_column_info$fn__40630.invoke(annotate.clj:582)"
"query_processor.middleware.permissions$check_query_permissions$fn__45464.invoke(permissions.clj:69)"
"query_processor.middleware.pre_alias_aggregations$pre_alias_aggregations$fn__47001.invoke(pre_alias_aggregations.clj:40)"
"query_processor.middleware.cumulative_aggregations$handle_cumulative_aggregations$fn__45662.invoke(cumulative_aggregations.clj:60)"
"query_processor.middleware.resolve_joined_fields$resolve_joined_fields$fn__47314.invoke(resolve_joined_fields.clj:94)"
"query_processor.middleware.resolve_joins$resolve_joins$fn__47619.invoke(resolve_joins.clj:178)"
"query_processor.middleware.add_implicit_joins$add_implicit_joins$fn__43976.invoke(add_implicit_joins.clj:181)"
"query_processor.middleware.large_int_id$convert_id_to_string$fn__46274.invoke(large_int_id.clj:44)"
"query_processor.middleware.format_rows$format_rows$fn__46254.invoke(format_rows.clj:74)"
"query_processor.middleware.desugar$desugar$fn__45728.invoke(desugar.clj:21)"
"query_processor.middleware.binning$update_binning_strategy$fn__44746.invoke(binning.clj:228)"
"query_processor.middleware.resolve_fields$resolve_fields$fn__45264.invoke(resolve_fields.clj:24)"
"query_processor.middleware.add_dimension_projections$add_remapping$fn__43606.invoke(add_dimension_projections.clj:316)"
"query_processor.middleware.add_implicit_clauses$add_implicit_clauses$fn__43837.invoke(add_implicit_clauses.clj:146)"
"query_processor.middleware.upgrade_field_literals$upgrade_field_literals$fn__48049.invoke(upgrade_field_literals.clj:45)"
"query_processor.middleware.add_source_metadata$add_source_metadata_for_source_queries$fn__44133.invoke(add_source_metadata.clj:124)"
"query_processor.middleware.reconcile_breakout_and_order_by_bucketing$reconcile_breakout_and_order_by_bucketing$fn__47198.invoke(reconcile_breakout_and_order_by_bucketing.clj:97)"
"query_processor.middleware.auto_bucket_datetimes$auto_bucket_datetimes$fn__44333.invoke(auto_bucket_datetimes.clj:139)"
"query_processor.middleware.resolve_source_table$resolve_source_tables$fn__45311.invoke(resolve_source_table.clj:45)"
"query_processor.middleware.parameters$substitute_parameters$fn__46983.invoke(parameters.clj:111)"
"query_processor.middleware.resolve_referenced$resolve_referenced_card_resources$fn__45363.invoke(resolve_referenced.clj:79)"
"query_processor.middleware.expand_macros$expand_macros$fn__45984.invoke(expand_macros.clj:155)"
"query_processor.middleware.add_timezone_info$add_timezone_info$fn__44142.invoke(add_timezone_info.clj:15)"
"query_processor.middleware.splice_params_in_response$splice_params_in_response$fn__47985.invoke(splice_params_in_response.clj:32)"
"query_processor.middleware.resolve_database_and_driver$resolve_database_and_driver$fn__47209$fn__47213.invoke(resolve_database_and_driver.clj:31)"
"driver$do_with_driver.invokeStatic(driver.clj:60)"
"driver$do_with_driver.invoke(driver.clj:56)"
"query_processor.middleware.resolve_database_and_driver$resolve_database_and_driver$fn__47209.invoke(resolve_database_and_driver.clj:25)"
"query_processor.middleware.fetch_source_query$resolve_card_id_source_tables$fn__46202.invoke(fetch_source_query.clj:264)"
"query_processor.middleware.store$initialize_store$fn__47994$fn__47995.invoke(store.clj:11)"
"query_processor.store$do_with_store.invokeStatic(store.clj:44)"
"query_processor.store$do_with_store.invoke(store.clj:38)"
"query_processor.middleware.store$initialize_store$fn__47994.invoke(store.clj:10)"
"query_processor.middleware.validate$validate_query$fn__48056.invoke(validate.clj:10)"
"query_processor.middleware.normalize_query$normalize$fn__46326.invoke(normalize_query.clj:22)"
"query_processor.middleware.add_rows_truncated$add_rows_truncated$fn__43994.invoke(add_rows_truncated.clj:35)"
"query_processor.middleware.results_metadata$record_and_return_metadata_BANG_$fn__47970.invoke(results_metadata.clj:147)"
"query_processor.middleware.constraints$add_default_userland_constraints$fn__45605.invoke(constraints.clj:42)"
"query_processor.middleware.process_userland_query$process_userland_query$fn__47072.invoke(process_userland_query.clj:135)"
"query_processor.middleware.catch_exceptions$catch_exceptions$fn__45548.invoke(catch_exceptions.clj:173)"
"query_processor.reducible$async_qp$qp_STAR___37408$thunk__37409.invoke(reducible.clj:103)"
"query_processor.reducible$async_qp$qp_STAR___37408.invoke(reducible.clj:109)"
"query_processor.reducible$sync_qp$qp_STAR___37417$fn__37420.invoke(reducible.clj:135)"
"query_processor.reducible$sync_qp$qp_STAR___37417.invoke(reducible.clj:134)"
"query_processor$process_userland_query.invokeStatic(query_processor.clj:237)"
"query_processor$process_userland_query.doInvoke(query_processor.clj:233)"
"query_processor$fn__48102$process_query_and_save_execution_BANG___48111$fn__48114.invoke(query_processor.clj:249)"
"query_processor$fn__48102$process_query_and_save_execution_BANG___48111.invoke(query_processor.clj:241)"
"query_processor$fn__48146$process_query_and_save_with_max_results_constraints_BANG___48155$fn__48158.invoke(query_processor.clj:261)"
"query_processor$fn__48146$process_query_and_save_with_max_results_constraints_BANG___48155.invoke(query_processor.clj:254)"
"api.dataset$fn__54256$fn__54259.invoke(dataset.clj:55)"
"query_processor.streaming$streaming_response_STAR_$fn__54237$fn__54238.invoke(streaming.clj:72)"
"query_processor.streaming$streaming_response_STAR_$fn__54237.invoke(streaming.clj:71)"
"async.streaming_response$do_f_STAR_.invokeStatic(streaming_response.clj:65)"
"async.streaming_response$do_f_STAR_.invoke(streaming_response.clj:63)"
"async.streaming_response$do_f_async$fn__16055.invoke(streaming_response.clj:84)"],
:state "42000"}],
:json_query
{:type "query",
:query
{:source-table 6347,
:joins
[{:fields "all",
:source-table 6366,
:condition ["=" ["field-id" 36480] ["joined-field" "Products-veeerylong-veeerylong-veeerylong" ["field-id" 36609]]],
:alias "Products-veeerylong-veeerylong-veeerylong"}]},
:database 48,
:parameters [],
:middleware {:js-int-to-string? true, :add-default-userland-constraints? true}},
:native
{:query
"SELECT * FROM (SELECT \"SAMPLE_DATASET\".\"REVIEWS\".\"ID\" AS \"ID\", \"SAMPLE_DATASET\".\"REVIEWS\".\"BODY\" AS \"BODY\", \"SAMPLE_DATASET\".\"REVIEWS\".\"CREATED_AT\" AS \"CREATED_AT\", \"SAMPLE_DATASET\".\"REVIEWS\".\"PRODUCT_ID\" AS \"PRODUCT_ID\", \"SAMPLE_DATASET\".\"REVIEWS\".\"RATING\" AS \"RATING\", \"SAMPLE_DATASET\".\"REVIEWS\".\"REVIEWER\" AS \"REVIEWER\", \"Products-veeerylong-veeerylong-veeerylong\".\"ID\" AS \"identifier_nrxrxrruxuu\", \"Products-veeerylong-veeerylong-veeerylong\".\"CATEGORY\" AS \"identifier_nrwyyrqqtxy\", \"Products-veeerylong-veeerylong-veeerylong\".\"CREATED_AT\" AS \"identifier_rxwvtuwxqr\", \"Products-veeerylong-veeerylong-veeerylong\".\"EAN\" AS \"identifier_nrqtzqxwszr\", \"Products-veeerylong-veeerylong-veeerylong\".\"PRICE\" AS \"identifier_rzrwqtuuxq\", \"Products-veeerylong-veeerylong-veeerylong\".\"RATING\" AS \"identifier_ruzqtrzwux\", \"Products-veeerylong-veeerylong-veeerylong\".\"TITLE\" AS \"identifier_nrxxxtvqxtw\", \"Products-veeerylong-veeerylong-veeerylong\".\"VENDOR\" AS \"identifier_nyvwqvvtur\" FROM \"SAMPLE_DATASET\".\"REVIEWS\" LEFT JOIN \"SAMPLE_DATASET\".\"PRODUCTS\" \"identifier_nyurvuxrux\" ON \"SAMPLE_DATASET\".\"REVIEWS\".\"PRODUCT_ID\" = \"Products-veeerylong-veeerylong-veeerylong\".\"ID\") WHERE rownum <= 2000",
:params nil},
:status :failed,
:class oracle.jdbc.OracleDatabaseException,
:stacktrace
["oracle.jdbc.driver.T4CTTIoer11.processError(T4CTTIoer11.java:513)"
"oracle.jdbc.driver.T4CTTIoer11.processError(T4CTTIoer11.java:461)"
"oracle.jdbc.driver.T4C8Oall.processError(T4C8Oall.java:1104)"
"oracle.jdbc.driver.T4CTTIfun.receive(T4CTTIfun.java:553)"
"oracle.jdbc.driver.T4CTTIfun.doRPC(T4CTTIfun.java:269)"
"oracle.jdbc.driver.T4C8Oall.doOALL(T4C8Oall.java:655)"
"oracle.jdbc.driver.T4CPreparedStatement.doOall8(T4CPreparedStatement.java:270)"
"oracle.jdbc.driver.T4CPreparedStatement.doOall8(T4CPreparedStatement.java:91)"
"oracle.jdbc.driver.T4CPreparedStatement.executeForDescribe(T4CPreparedStatement.java:807)"
"oracle.jdbc.driver.OracleStatement.executeMaybeDescribe(OracleStatement.java:983)"
"oracle.jdbc.driver.OracleStatement.doExecuteWithTimeout(OracleStatement.java:1168)"
"oracle.jdbc.driver.OraclePreparedStatement.executeInternal(OraclePreparedStatement.java:3666)"
"oracle.jdbc.driver.T4CPreparedStatement.executeInternal(T4CPreparedStatement.java:1426)"
"oracle.jdbc.driver.OraclePreparedStatement.executeQuery(OraclePreparedStatement.java:3713)"
"oracle.jdbc.driver.OraclePreparedStatementWrapper.executeQuery(OraclePreparedStatementWrapper.java:1167)"
"com.mchange.v2.c3p0.impl.NewProxyPreparedStatement.executeQuery(NewProxyPreparedStatement.java:431)"
"--> driver.sql_jdbc.execute$fn__77454.invokeStatic(execute.clj:264)"
"driver.sql_jdbc.execute$fn__77454.invoke(execute.clj:262)"
"driver.sql_jdbc.execute$execute_reducible_query.invokeStatic(execute.clj:389)"
"driver.sql_jdbc.execute$execute_reducible_query.invoke(execute.clj:374)"
"driver.sql_jdbc.execute$execute_reducible_query.invokeStatic(execute.clj:383)"
"driver.sql_jdbc.execute$execute_reducible_query.invoke(execute.clj:374)"
"driver.sql_jdbc$fn__78967.invokeStatic(sql_jdbc.clj:54)"
"driver.sql_jdbc$fn__78967.invoke(sql_jdbc.clj:52)"
"driver.oracle$eval840$fn__841.invoke(oracle.clj:294)"
"query_processor.context$executef.invokeStatic(context.clj:59)"
"query_processor.context$executef.invoke(context.clj:48)"
"query_processor.context.default$default_runf.invokeStatic(default.clj:68)"
"query_processor.context.default$default_runf.invoke(default.clj:66)"
"query_processor.context$runf.invokeStatic(context.clj:45)"
"query_processor.context$runf.invoke(context.clj:39)"
"query_processor.reducible$pivot.invokeStatic(reducible.clj:34)"
"query_processor.reducible$pivot.invoke(reducible.clj:31)"
"query_processor.middleware.mbql_to_native$mbql__GT_native$fn__46313.invoke(mbql_to_native.clj:25)"
"query_processor.middleware.check_features$check_features$fn__45589.invoke(check_features.clj:41)"
"query_processor.middleware.limit$limit$fn__46299.invoke(limit.clj:37)"
"query_processor.middleware.cache$maybe_return_cached_results$fn__45240.invoke(cache.clj:211)"
"query_processor.middleware.optimize_datetime_filters$optimize_datetime_filters$fn__46478.invoke(optimize_datetime_filters.clj:133)"
"query_processor.middleware.auto_parse_filter_values$auto_parse_filter_values$fn__44386.invoke(auto_parse_filter_values.clj:43)"
"query_processor.middleware.wrap_value_literals$wrap_value_literals$fn__40767.invoke(wrap_value_literals.clj:147)"
"query_processor.middleware.annotate$add_column_info$fn__40630.invoke(annotate.clj:582)"
"query_processor.middleware.permissions$check_query_permissions$fn__45464.invoke(permissions.clj:69)"
"query_processor.middleware.pre_alias_aggregations$pre_alias_aggregations$fn__47001.invoke(pre_alias_aggregations.clj:40)"
"query_processor.middleware.cumulative_aggregations$handle_cumulative_aggregations$fn__45662.invoke(cumulative_aggregations.clj:60)"
"query_processor.middleware.resolve_joined_fields$resolve_joined_fields$fn__47314.invoke(resolve_joined_fields.clj:94)"
"query_processor.middleware.resolve_joins$resolve_joins$fn__47619.invoke(resolve_joins.clj:178)"
"query_processor.middleware.add_implicit_joins$add_implicit_joins$fn__43976.invoke(add_implicit_joins.clj:181)"
"query_processor.middleware.large_int_id$convert_id_to_string$fn__46274.invoke(large_int_id.clj:44)"
"query_processor.middleware.format_rows$format_rows$fn__46254.invoke(format_rows.clj:74)"
"query_processor.middleware.desugar$desugar$fn__45728.invoke(desugar.clj:21)"
"query_processor.middleware.binning$update_binning_strategy$fn__44746.invoke(binning.clj:228)"
"query_processor.middleware.resolve_fields$resolve_fields$fn__45264.invoke(resolve_fields.clj:24)"
"query_processor.middleware.add_dimension_projections$add_remapping$fn__43606.invoke(add_dimension_projections.clj:316)"
"query_processor.middleware.add_implicit_clauses$add_implicit_clauses$fn__43837.invoke(add_implicit_clauses.clj:146)"
"query_processor.middleware.upgrade_field_literals$upgrade_field_literals$fn__48049.invoke(upgrade_field_literals.clj:45)"
"query_processor.middleware.add_source_metadata$add_source_metadata_for_source_queries$fn__44133.invoke(add_source_metadata.clj:124)"
"query_processor.middleware.reconcile_breakout_and_order_by_bucketing$reconcile_breakout_and_order_by_bucketing$fn__47198.invoke(reconcile_breakout_and_order_by_bucketing.clj:97)"
"query_processor.middleware.auto_bucket_datetimes$auto_bucket_datetimes$fn__44333.invoke(auto_bucket_datetimes.clj:139)"
"query_processor.middleware.resolve_source_table$resolve_source_tables$fn__45311.invoke(resolve_source_table.clj:45)"
"query_processor.middleware.parameters$substitute_parameters$fn__46983.invoke(parameters.clj:111)"
"query_processor.middleware.resolve_referenced$resolve_referenced_card_resources$fn__45363.invoke(resolve_referenced.clj:79)"
"query_processor.middleware.expand_macros$expand_macros$fn__45984.invoke(expand_macros.clj:155)"
"query_processor.middleware.add_timezone_info$add_timezone_info$fn__44142.invoke(add_timezone_info.clj:15)"
"query_processor.middleware.splice_params_in_response$splice_params_in_response$fn__47985.invoke(splice_params_in_response.clj:32)"
"query_processor.middleware.resolve_database_and_driver$resolve_database_and_driver$fn__47209$fn__47213.invoke(resolve_database_and_driver.clj:31)"
"driver$do_with_driver.invokeStatic(driver.clj:60)"
"driver$do_with_driver.invoke(driver.clj:56)"
"query_processor.middleware.resolve_database_and_driver$resolve_database_and_driver$fn__47209.invoke(resolve_database_and_driver.clj:25)"
"query_processor.middleware.fetch_source_query$resolve_card_id_source_tables$fn__46202.invoke(fetch_source_query.clj:264)"
"query_processor.middleware.store$initialize_store$fn__47994$fn__47995.invoke(store.clj:11)"
"query_processor.store$do_with_store.invokeStatic(store.clj:44)"
"query_processor.store$do_with_store.invoke(store.clj:38)"
"query_processor.middleware.store$initialize_store$fn__47994.invoke(store.clj:10)"
"query_processor.middleware.validate$validate_query$fn__48056.invoke(validate.clj:10)"
"query_processor.middleware.normalize_query$normalize$fn__46326.invoke(normalize_query.clj:22)"
"query_processor.middleware.add_rows_truncated$add_rows_truncated$fn__43994.invoke(add_rows_truncated.clj:35)"
"query_processor.middleware.results_metadata$record_and_return_metadata_BANG_$fn__47970.invoke(results_metadata.clj:147)"
"query_processor.middleware.constraints$add_default_userland_constraints$fn__45605.invoke(constraints.clj:42)"
"query_processor.middleware.process_userland_query$process_userland_query$fn__47072.invoke(process_userland_query.clj:135)"
"query_processor.middleware.catch_exceptions$catch_exceptions$fn__45548.invoke(catch_exceptions.clj:173)"
"query_processor.reducible$async_qp$qp_STAR___37408$thunk__37409.invoke(reducible.clj:103)"
"query_processor.reducible$async_qp$qp_STAR___37408.invoke(reducible.clj:109)"
"query_processor.reducible$sync_qp$qp_STAR___37417$fn__37420.invoke(reducible.clj:135)"
"query_processor.reducible$sync_qp$qp_STAR___37417.invoke(reducible.clj:134)"
"query_processor$process_userland_query.invokeStatic(query_processor.clj:237)"
"query_processor$process_userland_query.doInvoke(query_processor.clj:233)"
"query_processor$fn__48102$process_query_and_save_execution_BANG___48111$fn__48114.invoke(query_processor.clj:249)"
"query_processor$fn__48102$process_query_and_save_execution_BANG___48111.invoke(query_processor.clj:241)"
"query_processor$fn__48146$process_query_and_save_with_max_results_constraints_BANG___48155$fn__48158.invoke(query_processor.clj:261)"
"query_processor$fn__48146$process_query_and_save_with_max_results_constraints_BANG___48155.invoke(query_processor.clj:254)"
"api.dataset$fn__54256$fn__54259.invoke(dataset.clj:55)"
"query_processor.streaming$streaming_response_STAR_$fn__54237$fn__54238.invoke(streaming.clj:72)"
"query_processor.streaming$streaming_response_STAR_$fn__54237.invoke(streaming.clj:71)"
"async.streaming_response$do_f_STAR_.invokeStatic(streaming_response.clj:65)"
"async.streaming_response$do_f_STAR_.invoke(streaming_response.clj:63)"
"async.streaming_response$do_f_async$fn__16055.invoke(streaming_response.clj:84)"],
:context :ad-hoc,
:error "ORA-00904: \"Products-veeerylong-veeerylong-veeerylong\".\"ID\": invalid identifier\n",
:row_count 0,
:running_time 0,
:preprocessed
{:type :query,
:query
{:source-table 6347,
:fields
[[:field-id 36482]
[:field-id 36485]
[:datetime-field [:field-id 36483] :default]
[:field-id 36480]
[:field-id 36481]
[:field-id 36484]
[:joined-field "Products-veeerylong-veeerylong-veeerylong" [:field-id 36609]]
[:joined-field "Products-veeerylong-veeerylong-veeerylong" [:field-id 36615]]
[:datetime-field [:joined-field "Products-veeerylong-veeerylong-veeerylong" [:field-id 36610]] :default]
[:joined-field "Products-veeerylong-veeerylong-veeerylong" [:field-id 36613]]
[:joined-field "Products-veeerylong-veeerylong-veeerylong" [:field-id 36614]]
[:joined-field "Products-veeerylong-veeerylong-veeerylong" [:field-id 36608]]
[:joined-field "Products-veeerylong-veeerylong-veeerylong" [:field-id 36611]]
[:joined-field "Products-veeerylong-veeerylong-veeerylong" [:field-id 36612]]],
:joins
[{:strategy :left-join,
:source-table 6366,
:condition [:= [:field-id 36480] [:joined-field "Products-veeerylong-veeerylong-veeerylong" [:field-id 36609]]],
:alias "Products-veeerylong-veeerylong-veeerylong"}],
:limit 2000},
:database 48,
:middleware {:js-int-to-string? true, :add-default-userland-constraints? true},
:info
{:executed-by 1,
:context :ad-hoc,
:nested? false,
:query-hash [17, -8, 20, -94, 86, -57, 88, -49, 120, -86, -35, 109, 32, 3, -6, 41, -4, 88, -75, -88, -122, 104, 15, -82, -35, 26, 117, 49, 39, -21, -76, -14]},
:constraints {:max-results 10000, :max-results-bare-rows 2000}},
:data {:rows [], :cols []}}
2021-05-09 14:50:54,194 DEBUG middleware.log :: POST /api/dataset 202 [ASYNC: completed] 1.2 s (19 DB calls) App DB connections: 0/7 Jetty threads: 3/50 (5 idle, 0 queued) (70 total active threads) Queries in flight: 1 (0 queued)
```
</details>
**Information about your Metabase Installation:**
Tested 0.37.8 thru 0.39.1 - works on 0.37.8 (for Oracle 12.2+, would always fail pre-12.2), regression since 0.38.0
Oracle finally released "official" Docker images a couple of days ago: https://hub.docker.com/r/gvenzl/oracle-xe
These are an alternative to the ageing 11g image that most people have used until now, thanks to: https://hub.docker.com/r/wnameless/oracle-xe-11g-r2
:arrow_down: Please click the :+1: reaction instead of leaving a `+1` or `update?` comment
|
1.0
|
Oracle fails queries on joins with tables with long display names - **Describe the bug**
On Oracle, when making a join to a table with a long display name (which, combined with the FK field display name, is >30 bytes), the query fails — a regression since 0.38.0
**Possible workaround**: Change table and column names to something shorter in Admin > Data Model.
Also recommended to disable or underscore "Friendly Table and Field Names" in Admin > Settings > General.
**To Reproduce**
1. Admin > Data Model > (Oracle with Sample) > Products > change name to `Products-veeerylong-veeerylong-veeerylong`
2. Custom question > (Oracle with Sample) > Reviews - join the Products table

3. Oracle 12.2+ fails with `ORA-00904: "Products-veeerylong-veeerylong-veeerylong"."ID": invalid identifier` because the generated SQL incorrectly uses the display name instead of the alias in the join clause.
Oracle pre-12.2 fails with `ORA-00972: identifier is too long`, since it doesn't support table/column/alias identifiers longer than 30 bytes. The fix in 0.38.0 likely tried to address that limit, but it ended up breaking joins against tables with long names on all versions of Oracle.
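For illustration, here is a minimal Python sketch (not Metabase's actual implementation; `short_alias` and the literal table names are hypothetical) of the aliasing approach visible in the generated SQL in the log below: identifiers longer than Oracle's 30-byte limit get a short deterministic alias, and the join only works if that same alias is referenced consistently in the SELECT list and in the ON clause.

```python
import hashlib

ORACLE_MAX_IDENTIFIER_BYTES = 30  # pre-12.2 limit; 12.2+ raises it to 128


def short_alias(name: str, prefix: str = "identifier_") -> str:
    # Hypothetical helper, not Metabase's code: build a deterministic alias
    # that always fits within Oracle's 30-byte identifier limit.
    if len(name.encode("utf-8")) <= ORACLE_MAX_IDENTIFIER_BYTES:
        return name
    digest = hashlib.sha1(name.encode("utf-8")).hexdigest()
    return (prefix + digest)[:ORACLE_MAX_IDENTIFIER_BYTES]


display_name = "Products-veeerylong-veeerylong-veeerylong"
alias = short_alias(display_name)

# The regression: the joined table is given a short alias, but the SELECT
# list and the ON clause keep referencing the long display name, so Oracle
# raises ORA-00904 "invalid identifier".
broken_on = f'ON "REVIEWS"."PRODUCT_ID" = "{display_name}"."ID"'
correct_on = f'ON "REVIEWS"."PRODUCT_ID" = "{alias}"."ID"'

print(f'LEFT JOIN "SAMPLE_DATASET"."PRODUCTS" "{alias}" {correct_on}')
```

Note how this matches the stack trace: the generated SQL does declare an `identifier_...` alias on the joined `PRODUCTS` table, but then references the long display name in the ON clause — exactly the mismatch shown by `broken_on` above.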
<details><summary>Full stacktrace</summary>
```
2021-05-09 14:50:54,124 ERROR middleware.catch-exceptions :: Error processing query: null
{:database_id 48,
:started_at #t "2021-05-09T14:50:53.045846+02:00[Europe/Copenhagen]",
:via
[{:status :failed,
:class java.sql.SQLSyntaxErrorException,
:error "ORA-00904: \"Products-veeerylong-veeerylong-veeerylong\".\"ID\": invalid identifier\n",
:stacktrace
["oracle.jdbc.driver.T4CTTIoer11.processError(T4CTTIoer11.java:509)"
"oracle.jdbc.driver.T4CTTIoer11.processError(T4CTTIoer11.java:461)"
"oracle.jdbc.driver.T4C8Oall.processError(T4C8Oall.java:1104)"
"oracle.jdbc.driver.T4CTTIfun.receive(T4CTTIfun.java:553)"
"oracle.jdbc.driver.T4CTTIfun.doRPC(T4CTTIfun.java:269)"
"oracle.jdbc.driver.T4C8Oall.doOALL(T4C8Oall.java:655)"
"oracle.jdbc.driver.T4CPreparedStatement.doOall8(T4CPreparedStatement.java:270)"
"oracle.jdbc.driver.T4CPreparedStatement.doOall8(T4CPreparedStatement.java:91)"
"oracle.jdbc.driver.T4CPreparedStatement.executeForDescribe(T4CPreparedStatement.java:807)"
"oracle.jdbc.driver.OracleStatement.executeMaybeDescribe(OracleStatement.java:983)"
"oracle.jdbc.driver.OracleStatement.doExecuteWithTimeout(OracleStatement.java:1168)"
"oracle.jdbc.driver.OraclePreparedStatement.executeInternal(OraclePreparedStatement.java:3666)"
"oracle.jdbc.driver.T4CPreparedStatement.executeInternal(T4CPreparedStatement.java:1426)"
"oracle.jdbc.driver.OraclePreparedStatement.executeQuery(OraclePreparedStatement.java:3713)"
"oracle.jdbc.driver.OraclePreparedStatementWrapper.executeQuery(OraclePreparedStatementWrapper.java:1167)"
"com.mchange.v2.c3p0.impl.NewProxyPreparedStatement.executeQuery(NewProxyPreparedStatement.java:431)"
"--> driver.sql_jdbc.execute$fn__77454.invokeStatic(execute.clj:264)"
"driver.sql_jdbc.execute$fn__77454.invoke(execute.clj:262)"
"driver.sql_jdbc.execute$execute_reducible_query.invokeStatic(execute.clj:389)"
"driver.sql_jdbc.execute$execute_reducible_query.invoke(execute.clj:374)"
"driver.sql_jdbc.execute$execute_reducible_query.invokeStatic(execute.clj:383)"
"driver.sql_jdbc.execute$execute_reducible_query.invoke(execute.clj:374)"
"driver.sql_jdbc$fn__78967.invokeStatic(sql_jdbc.clj:54)"
"driver.sql_jdbc$fn__78967.invoke(sql_jdbc.clj:52)"
"driver.oracle$eval840$fn__841.invoke(oracle.clj:294)"
"query_processor.context$executef.invokeStatic(context.clj:59)"
"query_processor.context$executef.invoke(context.clj:48)"
"query_processor.context.default$default_runf.invokeStatic(default.clj:68)"
"query_processor.context.default$default_runf.invoke(default.clj:66)"
"query_processor.context$runf.invokeStatic(context.clj:45)"
"query_processor.context$runf.invoke(context.clj:39)"
"query_processor.reducible$pivot.invokeStatic(reducible.clj:34)"
"query_processor.reducible$pivot.invoke(reducible.clj:31)"
"query_processor.middleware.mbql_to_native$mbql__GT_native$fn__46313.invoke(mbql_to_native.clj:25)"
"query_processor.middleware.check_features$check_features$fn__45589.invoke(check_features.clj:41)"
"query_processor.middleware.limit$limit$fn__46299.invoke(limit.clj:37)"
"query_processor.middleware.cache$maybe_return_cached_results$fn__45240.invoke(cache.clj:211)"
"query_processor.middleware.optimize_datetime_filters$optimize_datetime_filters$fn__46478.invoke(optimize_datetime_filters.clj:133)"
"query_processor.middleware.auto_parse_filter_values$auto_parse_filter_values$fn__44386.invoke(auto_parse_filter_values.clj:43)"
"query_processor.middleware.wrap_value_literals$wrap_value_literals$fn__40767.invoke(wrap_value_literals.clj:147)"
"query_processor.middleware.annotate$add_column_info$fn__40630.invoke(annotate.clj:582)"
"query_processor.middleware.permissions$check_query_permissions$fn__45464.invoke(permissions.clj:69)"
"query_processor.middleware.pre_alias_aggregations$pre_alias_aggregations$fn__47001.invoke(pre_alias_aggregations.clj:40)"
"query_processor.middleware.cumulative_aggregations$handle_cumulative_aggregations$fn__45662.invoke(cumulative_aggregations.clj:60)"
"query_processor.middleware.resolve_joined_fields$resolve_joined_fields$fn__47314.invoke(resolve_joined_fields.clj:94)"
"query_processor.middleware.resolve_joins$resolve_joins$fn__47619.invoke(resolve_joins.clj:178)"
"query_processor.middleware.add_implicit_joins$add_implicit_joins$fn__43976.invoke(add_implicit_joins.clj:181)"
"query_processor.middleware.large_int_id$convert_id_to_string$fn__46274.invoke(large_int_id.clj:44)"
"query_processor.middleware.format_rows$format_rows$fn__46254.invoke(format_rows.clj:74)"
"query_processor.middleware.desugar$desugar$fn__45728.invoke(desugar.clj:21)"
"query_processor.middleware.binning$update_binning_strategy$fn__44746.invoke(binning.clj:228)"
"query_processor.middleware.resolve_fields$resolve_fields$fn__45264.invoke(resolve_fields.clj:24)"
"query_processor.middleware.add_dimension_projections$add_remapping$fn__43606.invoke(add_dimension_projections.clj:316)"
"query_processor.middleware.add_implicit_clauses$add_implicit_clauses$fn__43837.invoke(add_implicit_clauses.clj:146)"
"query_processor.middleware.upgrade_field_literals$upgrade_field_literals$fn__48049.invoke(upgrade_field_literals.clj:45)"
"query_processor.middleware.add_source_metadata$add_source_metadata_for_source_queries$fn__44133.invoke(add_source_metadata.clj:124)"
"query_processor.middleware.reconcile_breakout_and_order_by_bucketing$reconcile_breakout_and_order_by_bucketing$fn__47198.invoke(reconcile_breakout_and_order_by_bucketing.clj:97)"
"query_processor.middleware.auto_bucket_datetimes$auto_bucket_datetimes$fn__44333.invoke(auto_bucket_datetimes.clj:139)"
"query_processor.middleware.resolve_source_table$resolve_source_tables$fn__45311.invoke(resolve_source_table.clj:45)"
"query_processor.middleware.parameters$substitute_parameters$fn__46983.invoke(parameters.clj:111)"
"query_processor.middleware.resolve_referenced$resolve_referenced_card_resources$fn__45363.invoke(resolve_referenced.clj:79)"
"query_processor.middleware.expand_macros$expand_macros$fn__45984.invoke(expand_macros.clj:155)"
"query_processor.middleware.add_timezone_info$add_timezone_info$fn__44142.invoke(add_timezone_info.clj:15)"
"query_processor.middleware.splice_params_in_response$splice_params_in_response$fn__47985.invoke(splice_params_in_response.clj:32)"
"query_processor.middleware.resolve_database_and_driver$resolve_database_and_driver$fn__47209$fn__47213.invoke(resolve_database_and_driver.clj:31)"
"driver$do_with_driver.invokeStatic(driver.clj:60)"
"driver$do_with_driver.invoke(driver.clj:56)"
"query_processor.middleware.resolve_database_and_driver$resolve_database_and_driver$fn__47209.invoke(resolve_database_and_driver.clj:25)"
"query_processor.middleware.fetch_source_query$resolve_card_id_source_tables$fn__46202.invoke(fetch_source_query.clj:264)"
"query_processor.middleware.store$initialize_store$fn__47994$fn__47995.invoke(store.clj:11)"
"query_processor.store$do_with_store.invokeStatic(store.clj:44)"
"query_processor.store$do_with_store.invoke(store.clj:38)"
"query_processor.middleware.store$initialize_store$fn__47994.invoke(store.clj:10)"
"query_processor.middleware.validate$validate_query$fn__48056.invoke(validate.clj:10)"
"query_processor.middleware.normalize_query$normalize$fn__46326.invoke(normalize_query.clj:22)"
"query_processor.middleware.add_rows_truncated$add_rows_truncated$fn__43994.invoke(add_rows_truncated.clj:35)"
"query_processor.middleware.results_metadata$record_and_return_metadata_BANG_$fn__47970.invoke(results_metadata.clj:147)"
"query_processor.middleware.constraints$add_default_userland_constraints$fn__45605.invoke(constraints.clj:42)"
"query_processor.middleware.process_userland_query$process_userland_query$fn__47072.invoke(process_userland_query.clj:135)"
"query_processor.middleware.catch_exceptions$catch_exceptions$fn__45548.invoke(catch_exceptions.clj:173)"
"query_processor.reducible$async_qp$qp_STAR___37408$thunk__37409.invoke(reducible.clj:103)"
"query_processor.reducible$async_qp$qp_STAR___37408.invoke(reducible.clj:109)"
"query_processor.reducible$sync_qp$qp_STAR___37417$fn__37420.invoke(reducible.clj:135)"
"query_processor.reducible$sync_qp$qp_STAR___37417.invoke(reducible.clj:134)"
"query_processor$process_userland_query.invokeStatic(query_processor.clj:237)"
"query_processor$process_userland_query.doInvoke(query_processor.clj:233)"
"query_processor$fn__48102$process_query_and_save_execution_BANG___48111$fn__48114.invoke(query_processor.clj:249)"
"query_processor$fn__48102$process_query_and_save_execution_BANG___48111.invoke(query_processor.clj:241)"
"query_processor$fn__48146$process_query_and_save_with_max_results_constraints_BANG___48155$fn__48158.invoke(query_processor.clj:261)"
"query_processor$fn__48146$process_query_and_save_with_max_results_constraints_BANG___48155.invoke(query_processor.clj:254)"
"api.dataset$fn__54256$fn__54259.invoke(dataset.clj:55)"
"query_processor.streaming$streaming_response_STAR_$fn__54237$fn__54238.invoke(streaming.clj:72)"
"query_processor.streaming$streaming_response_STAR_$fn__54237.invoke(streaming.clj:71)"
"async.streaming_response$do_f_STAR_.invokeStatic(streaming_response.clj:65)"
"async.streaming_response$do_f_STAR_.invoke(streaming_response.clj:63)"
"async.streaming_response$do_f_async$fn__16055.invoke(streaming_response.clj:84)"],
:state "42000"}],
:json_query
{:type "query",
:query
{:source-table 6347,
:joins
[{:fields "all",
:source-table 6366,
:condition ["=" ["field-id" 36480] ["joined-field" "Products-veeerylong-veeerylong-veeerylong" ["field-id" 36609]]],
:alias "Products-veeerylong-veeerylong-veeerylong"}]},
:database 48,
:parameters [],
:middleware {:js-int-to-string? true, :add-default-userland-constraints? true}},
:native
{:query
"SELECT * FROM (SELECT \"SAMPLE_DATASET\".\"REVIEWS\".\"ID\" AS \"ID\", \"SAMPLE_DATASET\".\"REVIEWS\".\"BODY\" AS \"BODY\", \"SAMPLE_DATASET\".\"REVIEWS\".\"CREATED_AT\" AS \"CREATED_AT\", \"SAMPLE_DATASET\".\"REVIEWS\".\"PRODUCT_ID\" AS \"PRODUCT_ID\", \"SAMPLE_DATASET\".\"REVIEWS\".\"RATING\" AS \"RATING\", \"SAMPLE_DATASET\".\"REVIEWS\".\"REVIEWER\" AS \"REVIEWER\", \"Products-veeerylong-veeerylong-veeerylong\".\"ID\" AS \"identifier_nrxrxrruxuu\", \"Products-veeerylong-veeerylong-veeerylong\".\"CATEGORY\" AS \"identifier_nrwyyrqqtxy\", \"Products-veeerylong-veeerylong-veeerylong\".\"CREATED_AT\" AS \"identifier_rxwvtuwxqr\", \"Products-veeerylong-veeerylong-veeerylong\".\"EAN\" AS \"identifier_nrqtzqxwszr\", \"Products-veeerylong-veeerylong-veeerylong\".\"PRICE\" AS \"identifier_rzrwqtuuxq\", \"Products-veeerylong-veeerylong-veeerylong\".\"RATING\" AS \"identifier_ruzqtrzwux\", \"Products-veeerylong-veeerylong-veeerylong\".\"TITLE\" AS \"identifier_nrxxxtvqxtw\", \"Products-veeerylong-veeerylong-veeerylong\".\"VENDOR\" AS \"identifier_nyvwqvvtur\" FROM \"SAMPLE_DATASET\".\"REVIEWS\" LEFT JOIN \"SAMPLE_DATASET\".\"PRODUCTS\" \"identifier_nyurvuxrux\" ON \"SAMPLE_DATASET\".\"REVIEWS\".\"PRODUCT_ID\" = \"Products-veeerylong-veeerylong-veeerylong\".\"ID\") WHERE rownum <= 2000",
:params nil},
:status :failed,
:class oracle.jdbc.OracleDatabaseException,
:stacktrace
["oracle.jdbc.driver.T4CTTIoer11.processError(T4CTTIoer11.java:513)"
"oracle.jdbc.driver.T4CTTIoer11.processError(T4CTTIoer11.java:461)"
"oracle.jdbc.driver.T4C8Oall.processError(T4C8Oall.java:1104)"
"oracle.jdbc.driver.T4CTTIfun.receive(T4CTTIfun.java:553)"
"oracle.jdbc.driver.T4CTTIfun.doRPC(T4CTTIfun.java:269)"
"oracle.jdbc.driver.T4C8Oall.doOALL(T4C8Oall.java:655)"
"oracle.jdbc.driver.T4CPreparedStatement.doOall8(T4CPreparedStatement.java:270)"
"oracle.jdbc.driver.T4CPreparedStatement.doOall8(T4CPreparedStatement.java:91)"
"oracle.jdbc.driver.T4CPreparedStatement.executeForDescribe(T4CPreparedStatement.java:807)"
"oracle.jdbc.driver.OracleStatement.executeMaybeDescribe(OracleStatement.java:983)"
"oracle.jdbc.driver.OracleStatement.doExecuteWithTimeout(OracleStatement.java:1168)"
"oracle.jdbc.driver.OraclePreparedStatement.executeInternal(OraclePreparedStatement.java:3666)"
"oracle.jdbc.driver.T4CPreparedStatement.executeInternal(T4CPreparedStatement.java:1426)"
"oracle.jdbc.driver.OraclePreparedStatement.executeQuery(OraclePreparedStatement.java:3713)"
"oracle.jdbc.driver.OraclePreparedStatementWrapper.executeQuery(OraclePreparedStatementWrapper.java:1167)"
"com.mchange.v2.c3p0.impl.NewProxyPreparedStatement.executeQuery(NewProxyPreparedStatement.java:431)"
"--> driver.sql_jdbc.execute$fn__77454.invokeStatic(execute.clj:264)"
"driver.sql_jdbc.execute$fn__77454.invoke(execute.clj:262)"
"driver.sql_jdbc.execute$execute_reducible_query.invokeStatic(execute.clj:389)"
"driver.sql_jdbc.execute$execute_reducible_query.invoke(execute.clj:374)"
"driver.sql_jdbc.execute$execute_reducible_query.invokeStatic(execute.clj:383)"
"driver.sql_jdbc.execute$execute_reducible_query.invoke(execute.clj:374)"
"driver.sql_jdbc$fn__78967.invokeStatic(sql_jdbc.clj:54)"
"driver.sql_jdbc$fn__78967.invoke(sql_jdbc.clj:52)"
"driver.oracle$eval840$fn__841.invoke(oracle.clj:294)"
"query_processor.context$executef.invokeStatic(context.clj:59)"
"query_processor.context$executef.invoke(context.clj:48)"
"query_processor.context.default$default_runf.invokeStatic(default.clj:68)"
"query_processor.context.default$default_runf.invoke(default.clj:66)"
"query_processor.context$runf.invokeStatic(context.clj:45)"
"query_processor.context$runf.invoke(context.clj:39)"
"query_processor.reducible$pivot.invokeStatic(reducible.clj:34)"
"query_processor.reducible$pivot.invoke(reducible.clj:31)"
"query_processor.middleware.mbql_to_native$mbql__GT_native$fn__46313.invoke(mbql_to_native.clj:25)"
"query_processor.middleware.check_features$check_features$fn__45589.invoke(check_features.clj:41)"
"query_processor.middleware.limit$limit$fn__46299.invoke(limit.clj:37)"
"query_processor.middleware.cache$maybe_return_cached_results$fn__45240.invoke(cache.clj:211)"
"query_processor.middleware.optimize_datetime_filters$optimize_datetime_filters$fn__46478.invoke(optimize_datetime_filters.clj:133)"
"query_processor.middleware.auto_parse_filter_values$auto_parse_filter_values$fn__44386.invoke(auto_parse_filter_values.clj:43)"
"query_processor.middleware.wrap_value_literals$wrap_value_literals$fn__40767.invoke(wrap_value_literals.clj:147)"
"query_processor.middleware.annotate$add_column_info$fn__40630.invoke(annotate.clj:582)"
"query_processor.middleware.permissions$check_query_permissions$fn__45464.invoke(permissions.clj:69)"
"query_processor.middleware.pre_alias_aggregations$pre_alias_aggregations$fn__47001.invoke(pre_alias_aggregations.clj:40)"
"query_processor.middleware.cumulative_aggregations$handle_cumulative_aggregations$fn__45662.invoke(cumulative_aggregations.clj:60)"
"query_processor.middleware.resolve_joined_fields$resolve_joined_fields$fn__47314.invoke(resolve_joined_fields.clj:94)"
"query_processor.middleware.resolve_joins$resolve_joins$fn__47619.invoke(resolve_joins.clj:178)"
"query_processor.middleware.add_implicit_joins$add_implicit_joins$fn__43976.invoke(add_implicit_joins.clj:181)"
"query_processor.middleware.large_int_id$convert_id_to_string$fn__46274.invoke(large_int_id.clj:44)"
"query_processor.middleware.format_rows$format_rows$fn__46254.invoke(format_rows.clj:74)"
"query_processor.middleware.desugar$desugar$fn__45728.invoke(desugar.clj:21)"
"query_processor.middleware.binning$update_binning_strategy$fn__44746.invoke(binning.clj:228)"
"query_processor.middleware.resolve_fields$resolve_fields$fn__45264.invoke(resolve_fields.clj:24)"
"query_processor.middleware.add_dimension_projections$add_remapping$fn__43606.invoke(add_dimension_projections.clj:316)"
"query_processor.middleware.add_implicit_clauses$add_implicit_clauses$fn__43837.invoke(add_implicit_clauses.clj:146)"
"query_processor.middleware.upgrade_field_literals$upgrade_field_literals$fn__48049.invoke(upgrade_field_literals.clj:45)"
"query_processor.middleware.add_source_metadata$add_source_metadata_for_source_queries$fn__44133.invoke(add_source_metadata.clj:124)"
"query_processor.middleware.reconcile_breakout_and_order_by_bucketing$reconcile_breakout_and_order_by_bucketing$fn__47198.invoke(reconcile_breakout_and_order_by_bucketing.clj:97)"
"query_processor.middleware.auto_bucket_datetimes$auto_bucket_datetimes$fn__44333.invoke(auto_bucket_datetimes.clj:139)"
"query_processor.middleware.resolve_source_table$resolve_source_tables$fn__45311.invoke(resolve_source_table.clj:45)"
"query_processor.middleware.parameters$substitute_parameters$fn__46983.invoke(parameters.clj:111)"
"query_processor.middleware.resolve_referenced$resolve_referenced_card_resources$fn__45363.invoke(resolve_referenced.clj:79)"
"query_processor.middleware.expand_macros$expand_macros$fn__45984.invoke(expand_macros.clj:155)"
"query_processor.middleware.add_timezone_info$add_timezone_info$fn__44142.invoke(add_timezone_info.clj:15)"
"query_processor.middleware.splice_params_in_response$splice_params_in_response$fn__47985.invoke(splice_params_in_response.clj:32)"
"query_processor.middleware.resolve_database_and_driver$resolve_database_and_driver$fn__47209$fn__47213.invoke(resolve_database_and_driver.clj:31)"
"driver$do_with_driver.invokeStatic(driver.clj:60)"
"driver$do_with_driver.invoke(driver.clj:56)"
"query_processor.middleware.resolve_database_and_driver$resolve_database_and_driver$fn__47209.invoke(resolve_database_and_driver.clj:25)"
"query_processor.middleware.fetch_source_query$resolve_card_id_source_tables$fn__46202.invoke(fetch_source_query.clj:264)"
"query_processor.middleware.store$initialize_store$fn__47994$fn__47995.invoke(store.clj:11)"
"query_processor.store$do_with_store.invokeStatic(store.clj:44)"
"query_processor.store$do_with_store.invoke(store.clj:38)"
"query_processor.middleware.store$initialize_store$fn__47994.invoke(store.clj:10)"
"query_processor.middleware.validate$validate_query$fn__48056.invoke(validate.clj:10)"
"query_processor.middleware.normalize_query$normalize$fn__46326.invoke(normalize_query.clj:22)"
"query_processor.middleware.add_rows_truncated$add_rows_truncated$fn__43994.invoke(add_rows_truncated.clj:35)"
"query_processor.middleware.results_metadata$record_and_return_metadata_BANG_$fn__47970.invoke(results_metadata.clj:147)"
"query_processor.middleware.constraints$add_default_userland_constraints$fn__45605.invoke(constraints.clj:42)"
"query_processor.middleware.process_userland_query$process_userland_query$fn__47072.invoke(process_userland_query.clj:135)"
"query_processor.middleware.catch_exceptions$catch_exceptions$fn__45548.invoke(catch_exceptions.clj:173)"
"query_processor.reducible$async_qp$qp_STAR___37408$thunk__37409.invoke(reducible.clj:103)"
"query_processor.reducible$async_qp$qp_STAR___37408.invoke(reducible.clj:109)"
"query_processor.reducible$sync_qp$qp_STAR___37417$fn__37420.invoke(reducible.clj:135)"
"query_processor.reducible$sync_qp$qp_STAR___37417.invoke(reducible.clj:134)"
"query_processor$process_userland_query.invokeStatic(query_processor.clj:237)"
"query_processor$process_userland_query.doInvoke(query_processor.clj:233)"
"query_processor$fn__48102$process_query_and_save_execution_BANG___48111$fn__48114.invoke(query_processor.clj:249)"
"query_processor$fn__48102$process_query_and_save_execution_BANG___48111.invoke(query_processor.clj:241)"
"query_processor$fn__48146$process_query_and_save_with_max_results_constraints_BANG___48155$fn__48158.invoke(query_processor.clj:261)"
"query_processor$fn__48146$process_query_and_save_with_max_results_constraints_BANG___48155.invoke(query_processor.clj:254)"
"api.dataset$fn__54256$fn__54259.invoke(dataset.clj:55)"
"query_processor.streaming$streaming_response_STAR_$fn__54237$fn__54238.invoke(streaming.clj:72)"
"query_processor.streaming$streaming_response_STAR_$fn__54237.invoke(streaming.clj:71)"
"async.streaming_response$do_f_STAR_.invokeStatic(streaming_response.clj:65)"
"async.streaming_response$do_f_STAR_.invoke(streaming_response.clj:63)"
"async.streaming_response$do_f_async$fn__16055.invoke(streaming_response.clj:84)"],
:context :ad-hoc,
:error "ORA-00904: \"Products-veeerylong-veeerylong-veeerylong\".\"ID\": invalid identifier\n",
:row_count 0,
:running_time 0,
:preprocessed
{:type :query,
:query
{:source-table 6347,
:fields
[[:field-id 36482]
[:field-id 36485]
[:datetime-field [:field-id 36483] :default]
[:field-id 36480]
[:field-id 36481]
[:field-id 36484]
[:joined-field "Products-veeerylong-veeerylong-veeerylong" [:field-id 36609]]
[:joined-field "Products-veeerylong-veeerylong-veeerylong" [:field-id 36615]]
[:datetime-field [:joined-field "Products-veeerylong-veeerylong-veeerylong" [:field-id 36610]] :default]
[:joined-field "Products-veeerylong-veeerylong-veeerylong" [:field-id 36613]]
[:joined-field "Products-veeerylong-veeerylong-veeerylong" [:field-id 36614]]
[:joined-field "Products-veeerylong-veeerylong-veeerylong" [:field-id 36608]]
[:joined-field "Products-veeerylong-veeerylong-veeerylong" [:field-id 36611]]
[:joined-field "Products-veeerylong-veeerylong-veeerylong" [:field-id 36612]]],
:joins
[{:strategy :left-join,
:source-table 6366,
:condition [:= [:field-id 36480] [:joined-field "Products-veeerylong-veeerylong-veeerylong" [:field-id 36609]]],
:alias "Products-veeerylong-veeerylong-veeerylong"}],
:limit 2000},
:database 48,
:middleware {:js-int-to-string? true, :add-default-userland-constraints? true},
:info
{:executed-by 1,
:context :ad-hoc,
:nested? false,
:query-hash [17, -8, 20, -94, 86, -57, 88, -49, 120, -86, -35, 109, 32, 3, -6, 41, -4, 88, -75, -88, -122, 104, 15, -82, -35, 26, 117, 49, 39, -21, -76, -14]},
:constraints {:max-results 10000, :max-results-bare-rows 2000}},
:data {:rows [], :cols []}}
2021-05-09 14:50:54,194 DEBUG middleware.log :: POST /api/dataset 202 [ASYNC: completed] 1.2 s (19 DB calls) App DB connections: 0/7 Jetty threads: 3/50 (5 idle, 0 queued) (70 total active threads) Queries in flight: 1 (0 queued)
```
</details>
**Information about your Metabase Installation:**
Tested 0.37.8 thru 0.39.1 - works on 0.37.8 (for Oracle 12.2+, would always fail pre-12.2), regression since 0.38.0
Oracle finally released "official" Docker images a couple of days ago: https://hub.docker.com/r/gvenzl/oracle-xe
These are an alternative to the ageing 11g image that most people have used until now, thanks to: https://hub.docker.com/r/wnameless/oracle-xe-11g-r2
:arrow_down: Please click the :+1: reaction instead of leaving a `+1` or `update?` comment
|
process
|
oracle fails queries on joins with tables with long display names describe the bug on oracle when making a join to a table with a long display name which combined with the fk field display name is bytes then the query fails since possible workaround change table and column names to something shorter in admin data model also recommended to disable or underscore friendly table and field names in admin settings general to reproduce admin data model oracle with sample products change name to products veeerylong veeerylong veeerylong custom question oracle with sample reviews join the products table oracle fails with ora products veeerylong veeerylong veeerylong id invalid identifier because it s incorrectly using display name instead of alias in the join clause oracle pre fails with ora identifier is too long since it doesn t support table column alias with more than bytes which the fix in likely tried to solve but caused breaking long table joins on all versions of oracle full stacktrace error middleware catch exceptions error processing query null database id started at t via status failed class java sql sqlsyntaxerrorexception error ora products veeerylong veeerylong veeerylong id invalid identifier n stacktrace oracle jdbc driver processerror java oracle jdbc driver processerror java oracle jdbc driver processerror java oracle jdbc driver receive java oracle jdbc driver dorpc java oracle jdbc driver dooall java oracle jdbc driver java oracle jdbc driver java oracle jdbc driver executefordescribe java oracle jdbc driver oraclestatement executemaybedescribe oraclestatement java oracle jdbc driver oraclestatement doexecutewithtimeout oraclestatement java oracle jdbc driver oraclepreparedstatement executeinternal oraclepreparedstatement java oracle jdbc driver executeinternal java oracle jdbc driver oraclepreparedstatement executequery oraclepreparedstatement java oracle jdbc driver oraclepreparedstatementwrapper executequery oraclepreparedstatementwrapper java com mchange impl newproxypreparedstatement executequery newproxypreparedstatement java driver sql jdbc execute fn invokestatic execute clj driver sql jdbc execute fn invoke execute clj driver sql jdbc execute execute reducible query invokestatic execute clj driver sql jdbc execute execute reducible query invoke execute clj driver sql jdbc execute execute reducible query invokestatic execute clj driver sql jdbc execute execute reducible query invoke execute clj driver sql jdbc fn invokestatic sql jdbc clj driver sql jdbc fn invoke sql jdbc clj driver oracle fn invoke oracle clj query processor context executef invokestatic context clj query processor context executef invoke context clj query processor context default default runf invokestatic default clj query processor context default default runf invoke default clj query processor context runf invokestatic context clj query processor context runf invoke context clj query processor reducible pivot invokestatic reducible clj query processor reducible pivot invoke reducible clj query processor middleware mbql to native mbql gt native fn invoke mbql to native clj query processor middleware check features check features fn invoke check features clj query processor middleware limit limit fn invoke limit clj query processor middleware cache maybe return cached results fn invoke cache clj query processor middleware optimize datetime filters optimize datetime filters fn invoke optimize datetime filters clj query processor middleware auto parse filter values auto parse filter values fn invoke 
auto parse filter values clj query processor middleware wrap value literals wrap value literals fn invoke wrap value literals clj query processor middleware annotate add column info fn invoke annotate clj query processor middleware permissions check query permissions fn invoke permissions clj query processor middleware pre alias aggregations pre alias aggregations fn invoke pre alias aggregations clj query processor middleware cumulative aggregations handle cumulative aggregations fn invoke cumulative aggregations clj query processor middleware resolve joined fields resolve joined fields fn invoke resolve joined fields clj query processor middleware resolve joins resolve joins fn invoke resolve joins clj query processor middleware add implicit joins add implicit joins fn invoke add implicit joins clj query processor middleware large int id convert id to string fn invoke large int id clj query processor middleware format rows format rows fn invoke format rows clj query processor middleware desugar desugar fn invoke desugar clj query processor middleware binning update binning strategy fn invoke binning clj query processor middleware resolve fields resolve fields fn invoke resolve fields clj query processor middleware add dimension projections add remapping fn invoke add dimension projections clj query processor middleware add implicit clauses add implicit clauses fn invoke add implicit clauses clj query processor middleware upgrade field literals upgrade field literals fn invoke upgrade field literals clj query processor middleware add source metadata add source metadata for source queries fn invoke add source metadata clj query processor middleware reconcile breakout and order by bucketing reconcile breakout and order by bucketing fn invoke reconcile breakout and order by bucketing clj query processor middleware auto bucket datetimes auto bucket datetimes fn invoke auto bucket datetimes clj query processor middleware resolve source table resolve source tables fn invoke resolve source table clj query processor middleware parameters substitute parameters fn invoke parameters clj query processor middleware resolve referenced resolve referenced card resources fn invoke resolve referenced clj query processor middleware expand macros expand macros fn invoke expand macros clj query processor middleware add timezone info add timezone info fn invoke add timezone info clj query processor middleware splice params in response splice params in response fn invoke splice params in response clj query processor middleware resolve database and driver resolve database and driver fn fn invoke resolve database and driver clj driver do with driver invokestatic driver clj driver do with driver invoke driver clj query processor middleware resolve database and driver resolve database and driver fn invoke resolve database and driver clj query processor middleware fetch source query resolve card id source tables fn invoke fetch source query clj query processor middleware store initialize store fn fn invoke store clj query processor store do with store invokestatic store clj query processor store do with store invoke store clj query processor middleware store initialize store fn invoke store clj query processor middleware validate validate query fn invoke validate clj query processor middleware normalize query normalize fn invoke normalize query clj query processor middleware add rows truncated add rows truncated fn invoke add rows truncated clj query processor middleware results metadata record and return metadata 
bang fn invoke results metadata clj query processor middleware constraints add default userland constraints fn invoke constraints clj query processor middleware process userland query process userland query fn invoke process userland query clj query processor middleware catch exceptions catch exceptions fn invoke catch exceptions clj query processor reducible async qp qp star thunk invoke reducible clj query processor reducible async qp qp star invoke reducible clj query processor reducible sync qp qp star fn invoke reducible clj query processor reducible sync qp qp star invoke reducible clj query processor process userland query invokestatic query processor clj query processor process userland query doinvoke query processor clj query processor fn process query and save execution bang fn invoke query processor clj query processor fn process query and save execution bang invoke query processor clj query processor fn process query and save with max results constraints bang fn invoke query processor clj query processor fn process query and save with max results constraints bang invoke query processor clj api dataset fn fn invoke dataset clj query processor streaming streaming response star fn fn invoke streaming clj query processor streaming streaming response star fn invoke streaming clj async streaming response do f star invokestatic streaming response clj async streaming response do f star invoke streaming response clj async streaming response do f async fn invoke streaming response clj state json query type query query source table joins fields all source table condition alias products veeerylong veeerylong veeerylong database parameters middleware js int to string true add default userland constraints true native query select from select sample dataset reviews id as id sample dataset reviews body as body sample dataset reviews created at as created at sample dataset reviews product id as product id sample dataset reviews rating as rating sample dataset reviews reviewer as reviewer products veeerylong veeerylong veeerylong id as identifier nrxrxrruxuu products veeerylong veeerylong veeerylong category as identifier nrwyyrqqtxy products veeerylong veeerylong veeerylong created at as identifier rxwvtuwxqr products veeerylong veeerylong veeerylong ean as identifier nrqtzqxwszr products veeerylong veeerylong veeerylong price as identifier rzrwqtuuxq products veeerylong veeerylong veeerylong rating as identifier ruzqtrzwux products veeerylong veeerylong veeerylong title as identifier nrxxxtvqxtw products veeerylong veeerylong veeerylong vendor as identifier nyvwqvvtur from sample dataset reviews left join sample dataset products identifier nyurvuxrux on sample dataset reviews product id products veeerylong veeerylong veeerylong id where rownum params nil status failed class oracle jdbc oracledatabaseexception stacktrace oracle jdbc driver processerror java oracle jdbc driver processerror java oracle jdbc driver processerror java oracle jdbc driver receive java oracle jdbc driver dorpc java oracle jdbc driver dooall java oracle jdbc driver java oracle jdbc driver java oracle jdbc driver executefordescribe java oracle jdbc driver oraclestatement executemaybedescribe oraclestatement java oracle jdbc driver oraclestatement doexecutewithtimeout oraclestatement java oracle jdbc driver oraclepreparedstatement executeinternal oraclepreparedstatement java oracle jdbc driver executeinternal java oracle jdbc driver oraclepreparedstatement executequery oraclepreparedstatement java oracle jdbc driver 
oraclepreparedstatementwrapper executequery oraclepreparedstatementwrapper java com mchange impl newproxypreparedstatement executequery newproxypreparedstatement java driver sql jdbc execute fn invokestatic execute clj driver sql jdbc execute fn invoke execute clj driver sql jdbc execute execute reducible query invokestatic execute clj driver sql jdbc execute execute reducible query invoke execute clj driver sql jdbc execute execute reducible query invokestatic execute clj driver sql jdbc execute execute reducible query invoke execute clj driver sql jdbc fn invokestatic sql jdbc clj driver sql jdbc fn invoke sql jdbc clj driver oracle fn invoke oracle clj query processor context executef invokestatic context clj query processor context executef invoke context clj query processor context default default runf invokestatic default clj query processor context default default runf invoke default clj query processor context runf invokestatic context clj query processor context runf invoke context clj query processor reducible pivot invokestatic reducible clj query processor reducible pivot invoke reducible clj query processor middleware mbql to native mbql gt native fn invoke mbql to native clj query processor middleware check features check features fn invoke check features clj query processor middleware limit limit fn invoke limit clj query processor middleware cache maybe return cached results fn invoke cache clj query processor middleware optimize datetime filters optimize datetime filters fn invoke optimize datetime filters clj query processor middleware auto parse filter values auto parse filter values fn invoke auto parse filter values clj query processor middleware wrap value literals wrap value literals fn invoke wrap value literals clj query processor middleware annotate add column info fn invoke annotate clj query processor middleware permissions check query permissions fn invoke permissions clj query processor middleware pre alias aggregations pre alias aggregations fn invoke pre alias aggregations clj query processor middleware cumulative aggregations handle cumulative aggregations fn invoke cumulative aggregations clj query processor middleware resolve joined fields resolve joined fields fn invoke resolve joined fields clj query processor middleware resolve joins resolve joins fn invoke resolve joins clj query processor middleware add implicit joins add implicit joins fn invoke add implicit joins clj query processor middleware large int id convert id to string fn invoke large int id clj query processor middleware format rows format rows fn invoke format rows clj query processor middleware desugar desugar fn invoke desugar clj query processor middleware binning update binning strategy fn invoke binning clj query processor middleware resolve fields resolve fields fn invoke resolve fields clj query processor middleware add dimension projections add remapping fn invoke add dimension projections clj query processor middleware add implicit clauses add implicit clauses fn invoke add implicit clauses clj query processor middleware upgrade field literals upgrade field literals fn invoke upgrade field literals clj query processor middleware add source metadata add source metadata for source queries fn invoke add source metadata clj query processor middleware reconcile breakout and order by bucketing reconcile breakout and order by bucketing fn invoke reconcile breakout and order by bucketing clj query processor middleware auto bucket datetimes auto bucket datetimes fn invoke auto bucket 
datetimes clj query processor middleware resolve source table resolve source tables fn invoke resolve source table clj query processor middleware parameters substitute parameters fn invoke parameters clj query processor middleware resolve referenced resolve referenced card resources fn invoke resolve referenced clj query processor middleware expand macros expand macros fn invoke expand macros clj query processor middleware add timezone info add timezone info fn invoke add timezone info clj query processor middleware splice params in response splice params in response fn invoke splice params in response clj query processor middleware resolve database and driver resolve database and driver fn fn invoke resolve database and driver clj driver do with driver invokestatic driver clj driver do with driver invoke driver clj query processor middleware resolve database and driver resolve database and driver fn invoke resolve database and driver clj query processor middleware fetch source query resolve card id source tables fn invoke fetch source query clj query processor middleware store initialize store fn fn invoke store clj query processor store do with store invokestatic store clj query processor store do with store invoke store clj query processor middleware store initialize store fn invoke store clj query processor middleware validate validate query fn invoke validate clj query processor middleware normalize query normalize fn invoke normalize query clj query processor middleware add rows truncated add rows truncated fn invoke add rows truncated clj query processor middleware results metadata record and return metadata bang fn invoke results metadata clj query processor middleware constraints add default userland constraints fn invoke constraints clj query processor middleware process userland query process userland query fn invoke process userland query clj query processor middleware catch exceptions catch exceptions fn invoke catch exceptions clj query processor reducible async qp qp star thunk invoke reducible clj query processor reducible async qp qp star invoke reducible clj query processor reducible sync qp qp star fn invoke reducible clj query processor reducible sync qp qp star invoke reducible clj query processor process userland query invokestatic query processor clj query processor process userland query doinvoke query processor clj query processor fn process query and save execution bang fn invoke query processor clj query processor fn process query and save execution bang invoke query processor clj query processor fn process query and save with max results constraints bang fn invoke query processor clj query processor fn process query and save with max results constraints bang invoke query processor clj api dataset fn fn invoke dataset clj query processor streaming streaming response star fn fn invoke streaming clj query processor streaming streaming response star fn invoke streaming clj async streaming response do f star invokestatic streaming response clj async streaming response do f star invoke streaming response clj async streaming response do f async fn invoke streaming response clj context ad hoc error ora products veeerylong veeerylong veeerylong id invalid identifier n row count running time preprocessed type query query source table fields default default joins strategy left join source table condition alias products veeerylong veeerylong veeerylong limit database middleware js int to string true add default userland constraints true info executed by context ad hoc 
nested false query hash constraints max results max results bare rows data rows cols debug middleware log post api dataset s db calls app db connections jetty threads idle queued total active threads queries in flight queued information about your metabase installation tested thru works on for oracle would always fail pre regression since oracle has finally officially released docker images a couple of days ago as an alternative to the ageing that most have used thanks to arrow down please click the reaction instead of leaving a or update comment
| 1
|
20,905
| 27,747,051,108
|
IssuesEvent
|
2023-03-15 17:44:44
|
digitalmethodsinitiative/4cat
|
https://api.github.com/repos/digitalmethodsinitiative/4cat
|
opened
|
Add tokenisation options for more languages
|
enhancement processors
|
Some languages, in particular East-Asian ones, don't use spaces to separate words, so the standard NLTK tokeniser doesn't work for them. There are likely many languages for which this is an issue, but the East-Asian ones are probably the most pressing because they represent a large number of people online.
Support for Chinese tokenisation has been added using [jieba](https://github.com/fxsjy/jieba). There are other languages to consider, [here is a nice overview](https://investigate.ai/text-analysis/splitting-words-in-east-asian-languages/). But the libraries listed there all have dependencies that make them difficult to install, so more work is needed to figure out how best to make them installable with 4CAT.
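As a rough illustration of why whitespace-based tokenisation fails for such languages, here is a minimal Python sketch comparing naive splitting with jieba (assuming `jieba` is installed via `pip install jieba`; the sample sentence is illustrative only):

```python
# Compare whitespace splitting with jieba's dictionary-based segmentation.
import jieba

text = "我来到北京清华大学"  # "I came to Tsinghua University in Beijing"

whitespace_tokens = text.split()   # no spaces, so this yields one giant "token"
jieba_tokens = jieba.lcut(text)    # segments the sentence into words

print(whitespace_tokens)  # ['我来到北京清华大学']
print(jieba_tokens)       # e.g. ['我', '来到', '北京', '清华大学']
```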
|
1.0
|
Add tokenisation options for more languages - Some languages, in particular East-Asian ones, don't use spaces to separate words, so the standard NLTK tokeniser doesn't work for them. It is likely that there are many languages for which this is an issue but the East-Asian ones are probably the most pressing because they represent a large number of people online.
Support for Chinese tokenisation has been added using [jieba](https://github.com/fxsjy/jieba). There are other languages to consider, [here is a nice overview](https://investigate.ai/text-analysis/splitting-words-in-east-asian-languages/). But the libraries listed there all have dependencies that make them difficult to install, so more work is needed to figure out how to best make them install with 4CAT.
|
process
|
add tokenisation options for more languages some languages in particular east asian ones don t use spaces to separate words so the standard nltk tokeniser doesn t work for them it is likely that there are many languages for which this is an issue but the east asian ones are probably the most pressing because they represent a large number of people online support for chinese tokenisation has been added using there are other languages to consider but the libraries listed there all have dependencies that make them difficult to install so more work is needed to figure out how to best make them install with
| 1
|
263,398
| 23,054,369,443
|
IssuesEvent
|
2022-07-25 02:05:57
|
apache/pulsar
|
https://api.github.com/repos/apache/pulsar
|
closed
|
Flaky-test: org.apache.pulsar.client.impl.PatternTopicsConsumerImplTest.testAutoSubscribePatternConsumer
|
component/test flaky-tests
|
```
Error: testAutoSubscribePatternConsumer(org.apache.pulsar.client.impl.PatternTopicsConsumerImplTest) Time elapsed: 11.211 s <<< FAILURE!
org.awaitility.core.ConditionTimeoutException: Assertion condition defined as a org.apache.pulsar.client.impl.PatternTopicsConsumerImplTest expected [10] but found [6] within 10 seconds.
at org.awaitility.core.ConditionAwaiter.await(ConditionAwaiter.java:167)
at org.awaitility.core.AssertionCondition.await(AssertionCondition.java:119)
at org.awaitility.core.AssertionCondition.await(AssertionCondition.java:31)
at org.awaitility.core.ConditionFactory.until(ConditionFactory.java:985)
at org.awaitility.core.ConditionFactory.untilAsserted(ConditionFactory.java:769)
at org.apache.pulsar.client.impl.PatternTopicsConsumerImplTest.testAutoSubscribePatternConsumer(PatternTopicsConsumerImplTest.java:591)
at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:77)
at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.base/java.lang.reflect.Method.invoke(Method.java:568)
at org.testng.internal.MethodInvocationHelper.invokeMethod(MethodInvocationHelper.java:132)
at org.testng.internal.InvokeMethodRunnable.runOne(InvokeMethodRunnable.java:45)
at org.testng.internal.InvokeMethodRunnable.call(InvokeMethodRunnable.java:73)
at org.testng.internal.InvokeMethodRunnable.call(InvokeMethodRunnable.java:11)
at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:264)
at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1136)
at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635)
at java.base/java.lang.Thread.run(Thread.java:833)
Caused by: java.lang.AssertionError: expected [10] but found [6]
at org.testng.Assert.fail(Assert.java:99)
at org.testng.Assert.failNotEquals(Assert.java:1037)
at org.testng.Assert.assertEqualsImpl(Assert.java:140)
at org.testng.Assert.assertEquals(Assert.java:122)
at org.testng.Assert.assertEquals(Assert.java:907)
at org.testng.Assert.assertEquals(Assert.java:917)
at org.apache.pulsar.client.impl.PatternTopicsConsumerImplTest.lambda$testAutoSubscribePatternConsumer$12(PatternTopicsConsumerImplTest.java:592)
at org.awaitility.core.AssertionCondition.lambda$new$0(AssertionCondition.java:53)
at org.awaitility.core.ConditionAwaiter$ConditionPoller.call(ConditionAwaiter.java:248)
at org.awaitility.core.ConditionAwaiter$ConditionPoller.call(ConditionAwaiter.java:235)
... 4 more
```
|
2.0
|
Flaky-test: org.apache.pulsar.client.impl.PatternTopicsConsumerImplTest.testAutoSubscribePatternConsumer - ```
Error: testAutoSubscribePatternConsumer(org.apache.pulsar.client.impl.PatternTopicsConsumerImplTest) Time elapsed: 11.211 s <<< FAILURE!
org.awaitility.core.ConditionTimeoutException: Assertion condition defined as a org.apache.pulsar.client.impl.PatternTopicsConsumerImplTest expected [10] but found [6] within 10 seconds.
at org.awaitility.core.ConditionAwaiter.await(ConditionAwaiter.java:167)
at org.awaitility.core.AssertionCondition.await(AssertionCondition.java:119)
at org.awaitility.core.AssertionCondition.await(AssertionCondition.java:31)
at org.awaitility.core.ConditionFactory.until(ConditionFactory.java:985)
at org.awaitility.core.ConditionFactory.untilAsserted(ConditionFactory.java:769)
at org.apache.pulsar.client.impl.PatternTopicsConsumerImplTest.testAutoSubscribePatternConsumer(PatternTopicsConsumerImplTest.java:591)
at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:77)
at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.base/java.lang.reflect.Method.invoke(Method.java:568)
at org.testng.internal.MethodInvocationHelper.invokeMethod(MethodInvocationHelper.java:132)
at org.testng.internal.InvokeMethodRunnable.runOne(InvokeMethodRunnable.java:45)
at org.testng.internal.InvokeMethodRunnable.call(InvokeMethodRunnable.java:73)
at org.testng.internal.InvokeMethodRunnable.call(InvokeMethodRunnable.java:11)
at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:264)
at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1136)
at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635)
at java.base/java.lang.Thread.run(Thread.java:833)
Caused by: java.lang.AssertionError: expected [10] but found [6]
at org.testng.Assert.fail(Assert.java:99)
at org.testng.Assert.failNotEquals(Assert.java:1037)
at org.testng.Assert.assertEqualsImpl(Assert.java:140)
at org.testng.Assert.assertEquals(Assert.java:122)
at org.testng.Assert.assertEquals(Assert.java:907)
at org.testng.Assert.assertEquals(Assert.java:917)
at org.apache.pulsar.client.impl.PatternTopicsConsumerImplTest.lambda$testAutoSubscribePatternConsumer$12(PatternTopicsConsumerImplTest.java:592)
at org.awaitility.core.AssertionCondition.lambda$new$0(AssertionCondition.java:53)
at org.awaitility.core.ConditionAwaiter$ConditionPoller.call(ConditionAwaiter.java:248)
at org.awaitility.core.ConditionAwaiter$ConditionPoller.call(ConditionAwaiter.java:235)
... 4 more
```
|
non_process
|
flaky test org apache pulsar client impl patterntopicsconsumerimpltest testautosubscribepatternconsumer error testautosubscribepatternconsumer org apache pulsar client impl patterntopicsconsumerimpltest time elapsed s failure org awaitility core conditiontimeoutexception assertion condition defined as a org apache pulsar client impl patterntopicsconsumerimpltest expected but found within seconds at org awaitility core conditionawaiter await conditionawaiter java at org awaitility core assertioncondition await assertioncondition java at org awaitility core assertioncondition await assertioncondition java at org awaitility core conditionfactory until conditionfactory java at org awaitility core conditionfactory untilasserted conditionfactory java at org apache pulsar client impl patterntopicsconsumerimpltest testautosubscribepatternconsumer patterntopicsconsumerimpltest java at java base jdk internal reflect nativemethodaccessorimpl native method at java base jdk internal reflect nativemethodaccessorimpl invoke nativemethodaccessorimpl java at java base jdk internal reflect delegatingmethodaccessorimpl invoke delegatingmethodaccessorimpl java at java base java lang reflect method invoke method java at org testng internal methodinvocationhelper invokemethod methodinvocationhelper java at org testng internal invokemethodrunnable runone invokemethodrunnable java at org testng internal invokemethodrunnable call invokemethodrunnable java at org testng internal invokemethodrunnable call invokemethodrunnable java at java base java util concurrent futuretask run futuretask java at java base java util concurrent threadpoolexecutor runworker threadpoolexecutor java at java base java util concurrent threadpoolexecutor worker run threadpoolexecutor java at java base java lang thread run thread java caused by java lang assertionerror expected but found at org testng assert fail assert java at org testng assert failnotequals assert java at org testng assert assertequalsimpl assert java at org testng assert assertequals assert java at org testng assert assertequals assert java at org testng assert assertequals assert java at org apache pulsar client impl patterntopicsconsumerimpltest lambda testautosubscribepatternconsumer patterntopicsconsumerimpltest java at org awaitility core assertioncondition lambda new assertioncondition java at org awaitility core conditionawaiter conditionpoller call conditionawaiter java at org awaitility core conditionawaiter conditionpoller call conditionawaiter java more
| 0
|
66,493
| 8,948,464,313
|
IssuesEvent
|
2019-01-25 02:31:23
|
vuetifyjs/vuetify
|
https://api.github.com/repos/vuetifyjs/vuetify
|
closed
|
[Bug Report] v-data-iterator does not support a loading prop but the docs say it does.
|
T: documentation
|
### Versions and Environment
**Vuetify:** 1.4.3
**Vue:** 2.5.22
**Browsers:** Chrome 71.0.3578.98
**OS:** Mac OS 10.14.2
### Steps to reproduce
Look at the v-data-iterator API docs and note that it states that there is a loading prop.
Create a v-data-iterator and set the loading prop to true. Note how there is no loading bar.
### Expected Behavior
Either the loading prop should work on v-data-iterator OR v-data-iterator should have a progress slot OR the prop described in the API should be removed.
### Actual Behavior
Nothing happens when you use the loading prop.
### Reproduction Link
<a href="https://codepen.io/MatthewAry/pen/RvWKQQ" target="_blank">https://codepen.io/MatthewAry/pen/RvWKQQ</a>
### Other Comments
#4008 An issue was created for this problem previously but was closed by the author before it could be addressed by the team.
<!-- generated by vuetify-issue-helper. DO NOT REMOVE -->
|
1.0
|
[Bug Report] v-data-iterator does not support a loading prop but the docs say it does. - ### Versions and Environment
**Vuetify:** 1.4.3
**Vue:** 2.5.22
**Browsers:** Chrome 71.0.3578.98
**OS:** Mac OS 10.14.2
### Steps to reproduce
Look at the v-data-iterator API docs and note that it states that there is a loading prop.
Create a v-data-iterator and set the loading prop to true. Note how there is no loading bar.
### Expected Behavior
Either the loading prop should work on v-data-iterator OR v-data-iterator should have a progress slot OR the prop described in the API should be removed.
### Actual Behavior
Nothing happens when you use the loading prop.
### Reproduction Link
<a href="https://codepen.io/MatthewAry/pen/RvWKQQ" target="_blank">https://codepen.io/MatthewAry/pen/RvWKQQ</a>
### Other Comments
#4008 An issue was created for this problem previously but was closed by the author before it could be addressed by the team.
<!-- generated by vuetify-issue-helper. DO NOT REMOVE -->
|
non_process
|
v data iterator does not support a loading prop but the docs say it does versions and environment vuetify vue browsers chrome os mac os steps to reproduce look at the v data iterator api docs and note that it states that there is a loading prop create a v data iterator and set the loading prop to true note how there is no loading bar expected behavior either the loading prop should work on v data iterator or v data iterator should have a progress slot or the prop described in the api should be removed actual behavior nothing happens when you use the loading prop reproduction link other comments an issue was created for this problem previously but was closed by the author before it could be addressed by the team
| 0
|
16,129
| 20,381,559,030
|
IssuesEvent
|
2022-02-21 22:44:30
|
prisma/prisma
|
https://api.github.com/repos/prisma/prisma
|
opened
|
Suggest auto() in VSCode autocomplete
|
process/candidate kind/improvement team/migrations topic: mongodb
|
## Problem
Current `auto()` doesn't show up as a choice in autocomplete:
<img width="759" alt="CleanShot 2022-02-21 at 16 42 55@2x" src="https://user-images.githubusercontent.com/170299/155034560-ee3bb0e9-c403-4bb2-9705-194d76f18289.png">
## Suggested solution
Make it show up in autocomplete
|
1.0
|
Suggest auto() in VSCode autocomplete - ## Problem
Current `auto()` doesn't show up as a choice in autocomplete:
<img width="759" alt="CleanShot 2022-02-21 at 16 42 55@2x" src="https://user-images.githubusercontent.com/170299/155034560-ee3bb0e9-c403-4bb2-9705-194d76f18289.png">
## Suggested solution
Make it show up in autocomplete
|
process
|
suggest auto in vscode autocomplete problem current auto doesn t show up as a choice in autocomplete img width alt cleanshot at src suggested solution make it show up in autocomplete
| 1
|
12,474
| 14,942,292,444
|
IssuesEvent
|
2021-01-25 21:04:16
|
ClickHouse/ClickHouse
|
https://api.github.com/repos/ClickHouse/ClickHouse
|
closed
|
Extremes transform was already added to pipeline
|
bug comp-processors fuzz prio-minor v20.10-affected v20.8-affected v20.9-affected
|
```
2020.08.26 09:13:35.081818 [ 97 ] {94018202-348d-482a-a507-d911dfe36336} <Debug> executeQuery: (from [::1]:36574) SELECT * FROM b
2020.08.26 09:13:35.084392 [ 97 ] {94018202-348d-482a-a507-d911dfe36336} <Trace> ContextAccess (default): Access granted: SELECT(id1, id2, valA, val1, val2) ON default.b
2020.08.26 09:13:35.087388 [ 97 ] {94018202-348d-482a-a507-d911dfe36336} <Debug> HashJoin: Right sample block: id1 UInt32 UInt32(size = 0), val1 UInt8 UInt8(size = 0)
2020.08.26 09:13:35.091278 [ 97 ] {94018202-348d-482a-a507-d911dfe36336} <Debug> HashJoin: Right sample block: id1 UInt32 UInt32(size = 0), val1 UInt8 UInt8(size = 0)
2020.08.26 09:13:35.094011 [ 97 ] {94018202-348d-482a-a507-d911dfe36336} <Debug> HashJoin: Right sample block: id2 UInt32 UInt32(size = 0), val2 UInt8 UInt8(size = 0)
2020.08.26 09:13:35.095734 [ 97 ] {94018202-348d-482a-a507-d911dfe36336} <Trace> InterpreterSelectQuery: FetchColumns -> Complete
2020.08.26 09:13:35.096322 [ 97 ] {94018202-348d-482a-a507-d911dfe36336} <Trace> InterpreterSelectQuery: FetchColumns -> Complete
2020.08.26 09:13:35.102741 [ 97 ] {94018202-348d-482a-a507-d911dfe36336} <Trace> InterpreterSelectQuery: FetchColumns -> Complete
2020.08.26 09:13:35.104310 [ 97 ] {94018202-348d-482a-a507-d911dfe36336} <Error> : Logical error: 'Extremes transform was already added to pipeline.'.
clickhouse-server: ../src/Common/Exception.cpp:45: DB::Exception::Exception(const std::string &, int): Assertion `false' failed.
2020.08.26 09:13:35.104721 [ 64 ] {} <Trace> BaseDaemon: Received signal 6
2020.08.26 09:13:35.105005 [ 248 ] {} <Fatal> BaseDaemon: ########################################
2020.08.26 09:13:35.105422 [ 248 ] {} <Fatal> BaseDaemon: (version 20.8.1.4470, build id: 79B2358424232F7A) (from thread 97) (query_id: 94018202-348d-482a-a507-d911dfe36336) Received signal Aborted (6)
2020.08.26 09:13:35.105601 [ 248 ] {} <Fatal> BaseDaemon:
2020.08.26 09:13:35.105797 [ 248 ] {} <Fatal> BaseDaemon: Stack trace: 0x7f8f73bf3f47 0x7f8f73bf58b1 0x7f8f73be542a 0x7f8f73be54a2 0x25c981c1 0x30ea5f41 0x311cc2a9 0x311df4cf 0x311fa8c3 0x30316bf8 0x304d5617 0x304d460a 0x30dfccc6 0x30e04158 0x34a8945c 0x34a89c6c 0x34bcba73 0x34bc89ad 0x34bc7838 0x7f8f743b96db 0x7f8f73cd6a3f
2020.08.26 09:13:35.106176 [ 248 ] {} <Fatal> BaseDaemon: 4. /build/glibc-2ORdQG/glibc-2.27/signal/../sysdeps/unix/sysv/linux/raise.c:51: raise @ 0x3ef47 in /usr/lib/debug/lib/x86_64-linux-gnu/libc-2.27.so
2020.08.26 09:13:35.106441 [ 248 ] {} <Fatal> BaseDaemon: 5. /build/glibc-2ORdQG/glibc-2.27/stdlib/abort.c:81: abort @ 0x408b1 in /usr/lib/debug/lib/x86_64-linux-gnu/libc-2.27.so
2020.08.26 09:13:35.106686 [ 248 ] {} <Fatal> BaseDaemon: 6. /build/glibc-2ORdQG/glibc-2.27/assert/assert.c:89: __assert_fail_base @ 0x3042a in /usr/lib/debug/lib/x86_64-linux-gnu/libc-2.27.so
2020.08.26 09:13:35.107084 [ 248 ] {} <Fatal> BaseDaemon: 7. ? @ 0x304a2 in /usr/lib/debug/lib/x86_64-linux-gnu/libc-2.27.so
2020.08.26 09:13:35.107460 [ 248 ] {} <Fatal> BaseDaemon: 8. /build/obj-x86_64-linux-gnu/../src/Common/Exception.cpp:48: DB::Exception::Exception(std::__1::basic_string<char, std::__1::char_traits<char>, std::__1::allocator<char> > const&, int) @ 0x25c981c1 in /workspace/clickhouse
2020.08.26 09:13:35.167617 [ 248 ] {} <Fatal> BaseDaemon: 9. /build/obj-x86_64-linux-gnu/../src/Processors/QueryPipeline.cpp:191: DB::QueryPipeline::addExtremesTransform() @ 0x30ea5f41 in /workspace/clickhouse
2020.08.26 09:13:35.230240 [ 248 ] {} <Fatal> BaseDaemon: 10. /build/obj-x86_64-linux-gnu/../src/Processors/QueryPlan/ExtremesStep.cpp:31: DB::ExtremesStep::transformPipeline(DB::QueryPipeline&) @ 0x311cc2a9 in /workspace/clickhouse
2020.08.26 09:13:35.292104 [ 248 ] {} <Fatal> BaseDaemon: 11. /build/obj-x86_64-linux-gnu/../src/Processors/QueryPlan/ITransformingStep.cpp:44: DB::ITransformingStep::updatePipeline(std::__1::vector<std::__1::unique_ptr<DB::QueryPipeline, std::__1::default_delete<DB::QueryPipeline> >, std::__1::allocator<std::__1::unique_ptr<DB::QueryPipeline, std::__1::default_delete<DB::QueryPipeline> > > >) @ 0x311df4cf in /workspace/clickhouse
2020.08.26 09:13:35.354345 [ 248 ] {} <Fatal> BaseDaemon: 12. /build/obj-x86_64-linux-gnu/../src/Processors/QueryPlan/QueryPlan.cpp:169: DB::QueryPlan::buildQueryPipeline() @ 0x311fa8c3 in /workspace/clickhouse
2020.08.26 09:13:35.403571 [ 248 ] {} <Fatal> BaseDaemon: 13. /build/obj-x86_64-linux-gnu/../src/Interpreters/InterpreterSelectWithUnionQuery.cpp:208: DB::InterpreterSelectWithUnionQuery::execute() @ 0x30316bf8 in /workspace/clickhouse
2020.08.26 09:13:35.455289 [ 248 ] {} <Fatal> BaseDaemon: 14. /build/obj-x86_64-linux-gnu/../src/Interpreters/executeQuery.cpp:389: DB::executeQueryImpl(char const*, char const*, DB::Context&, bool, DB::QueryProcessingStage::Enum, bool, DB::ReadBuffer*) @ 0x304d5617 in /workspace/clickhouse
2020.08.26 09:13:35.506682 [ 248 ] {} <Fatal> BaseDaemon: 15. /build/obj-x86_64-linux-gnu/../src/Interpreters/executeQuery.cpp:675: DB::executeQuery(std::__1::basic_string<char, std::__1::char_traits<char>, std::__1::allocator<char> > const&, DB::Context&, bool, DB::QueryProcessingStage::Enum, bool) @ 0x304d460a in /workspace/clickhouse
2020.08.26 09:13:35.565299 [ 248 ] {} <Fatal> BaseDaemon: 16. /build/obj-x86_64-linux-gnu/../src/Server/TCPHandler.cpp:253: DB::TCPHandler::runImpl() @ 0x30dfccc6 in /workspace/clickhouse
2020.08.26 09:13:35.623803 [ 248 ] {} <Fatal> BaseDaemon: 17. /build/obj-x86_64-linux-gnu/../src/Server/TCPHandler.cpp:1213: DB::TCPHandler::run() @ 0x30e04158 in /workspace/clickhouse
2020.08.26 09:13:35.690771 [ 248 ] {} <Fatal> BaseDaemon: 18. /build/obj-x86_64-linux-gnu/../contrib/poco/Net/src/TCPServerConnection.cpp:43: Poco::Net::TCPServerConnection::start() @ 0x34a8945c in /workspace/clickhouse
2020.08.26 09:13:35.762628 [ 248 ] {} <Fatal> BaseDaemon: 19. /build/obj-x86_64-linux-gnu/../contrib/poco/Net/src/TCPServerDispatcher.cpp:114: Poco::Net::TCPServerDispatcher::run() @ 0x34a89c6c in /workspace/clickhouse
2020.08.26 09:13:35.831167 [ 248 ] {} <Fatal> BaseDaemon: 20. /build/obj-x86_64-linux-gnu/../contrib/poco/Foundation/src/ThreadPool.cpp:199: Poco::PooledThread::run() @ 0x34bcba73 in /workspace/clickhouse
2020.08.26 09:13:35.877334 [ 70 ] {} <Trace> SystemLog (system.trace_log): Flushing system log, 9 entries to flush
2020.08.26 09:13:35.879794 [ 70 ] {} <Debug> DiskLocal: Reserving 1.00 MiB on disk `default`, having unreserved 433.76 GiB.
2020.08.26 09:13:35.899046 [ 248 ] {} <Fatal> BaseDaemon: 21. /build/obj-x86_64-linux-gnu/../contrib/poco/Foundation/src/Thread.cpp:56: Poco::(anonymous namespace)::RunnableHolder::run() @ 0x34bc89ad in /workspace/clickhouse
```
I found only the CREATE VIEW statement for b
``` sql
2020.08.26 09:13:30.636180 [ 97 ] {22290044-f0f2-4f9d-a8ec-44bccd31e716} <Debug> executeQuery: (from [::1]:36574) CREATE VIEW b AS SELECT * FROM (SELECT * FROM a ANY LEFT JOIN id1 USING (id1)) AS js1 ANY LEFT JOIN id2 USING (id2)
```
https://clickhouse-test-reports.s3.yandex.net/13860/dc9ca2a878fcdae624318994a3a8c306dc74fb2a/fuzzer/server.log
https://clickhouse-test-reports.s3.yandex.net/13860/dc9ca2a878fcdae624318994a3a8c306dc74fb2a/fuzzer/fuzzer.log
https://clickhouse-test-reports.s3.yandex.net/13860/dc9ca2a878fcdae624318994a3a8c306dc74fb2a/fuzzer/main.log
|
1.0
|
Extremes transform was already added to pipeline - ```
2020.08.26 09:13:35.081818 [ 97 ] {94018202-348d-482a-a507-d911dfe36336} <Debug> executeQuery: (from [::1]:36574) SELECT * FROM b
2020.08.26 09:13:35.084392 [ 97 ] {94018202-348d-482a-a507-d911dfe36336} <Trace> ContextAccess (default): Access granted: SELECT(id1, id2, valA, val1, val2) ON default.b
2020.08.26 09:13:35.087388 [ 97 ] {94018202-348d-482a-a507-d911dfe36336} <Debug> HashJoin: Right sample block: id1 UInt32 UInt32(size = 0), val1 UInt8 UInt8(size = 0)
2020.08.26 09:13:35.091278 [ 97 ] {94018202-348d-482a-a507-d911dfe36336} <Debug> HashJoin: Right sample block: id1 UInt32 UInt32(size = 0), val1 UInt8 UInt8(size = 0)
2020.08.26 09:13:35.094011 [ 97 ] {94018202-348d-482a-a507-d911dfe36336} <Debug> HashJoin: Right sample block: id2 UInt32 UInt32(size = 0), val2 UInt8 UInt8(size = 0)
2020.08.26 09:13:35.095734 [ 97 ] {94018202-348d-482a-a507-d911dfe36336} <Trace> InterpreterSelectQuery: FetchColumns -> Complete
2020.08.26 09:13:35.096322 [ 97 ] {94018202-348d-482a-a507-d911dfe36336} <Trace> InterpreterSelectQuery: FetchColumns -> Complete
2020.08.26 09:13:35.102741 [ 97 ] {94018202-348d-482a-a507-d911dfe36336} <Trace> InterpreterSelectQuery: FetchColumns -> Complete
2020.08.26 09:13:35.104310 [ 97 ] {94018202-348d-482a-a507-d911dfe36336} <Error> : Logical error: 'Extremes transform was already added to pipeline.'.
clickhouse-server: ../src/Common/Exception.cpp:45: DB::Exception::Exception(const std::string &, int): Assertion `false' failed.
2020.08.26 09:13:35.104721 [ 64 ] {} <Trace> BaseDaemon: Received signal 6
2020.08.26 09:13:35.105005 [ 248 ] {} <Fatal> BaseDaemon: ########################################
2020.08.26 09:13:35.105422 [ 248 ] {} <Fatal> BaseDaemon: (version 20.8.1.4470, build id: 79B2358424232F7A) (from thread 97) (query_id: 94018202-348d-482a-a507-d911dfe36336) Received signal Aborted (6)
2020.08.26 09:13:35.105601 [ 248 ] {} <Fatal> BaseDaemon:
2020.08.26 09:13:35.105797 [ 248 ] {} <Fatal> BaseDaemon: Stack trace: 0x7f8f73bf3f47 0x7f8f73bf58b1 0x7f8f73be542a 0x7f8f73be54a2 0x25c981c1 0x30ea5f41 0x311cc2a9 0x311df4cf 0x311fa8c3 0x30316bf8 0x304d5617 0x304d460a 0x30dfccc6 0x30e04158 0x34a8945c 0x34a89c6c 0x34bcba73 0x34bc89ad 0x34bc7838 0x7f8f743b96db 0x7f8f73cd6a3f
2020.08.26 09:13:35.106176 [ 248 ] {} <Fatal> BaseDaemon: 4. /build/glibc-2ORdQG/glibc-2.27/signal/../sysdeps/unix/sysv/linux/raise.c:51: raise @ 0x3ef47 in /usr/lib/debug/lib/x86_64-linux-gnu/libc-2.27.so
2020.08.26 09:13:35.106441 [ 248 ] {} <Fatal> BaseDaemon: 5. /build/glibc-2ORdQG/glibc-2.27/stdlib/abort.c:81: abort @ 0x408b1 in /usr/lib/debug/lib/x86_64-linux-gnu/libc-2.27.so
2020.08.26 09:13:35.106686 [ 248 ] {} <Fatal> BaseDaemon: 6. /build/glibc-2ORdQG/glibc-2.27/assert/assert.c:89: __assert_fail_base @ 0x3042a in /usr/lib/debug/lib/x86_64-linux-gnu/libc-2.27.so
2020.08.26 09:13:35.107084 [ 248 ] {} <Fatal> BaseDaemon: 7. ? @ 0x304a2 in /usr/lib/debug/lib/x86_64-linux-gnu/libc-2.27.so
2020.08.26 09:13:35.107460 [ 248 ] {} <Fatal> BaseDaemon: 8. /build/obj-x86_64-linux-gnu/../src/Common/Exception.cpp:48: DB::Exception::Exception(std::__1::basic_string<char, std::__1::char_traits<char>, std::__1::allocator<char> > const&, int) @ 0x25c981c1 in /workspace/clickhouse
2020.08.26 09:13:35.167617 [ 248 ] {} <Fatal> BaseDaemon: 9. /build/obj-x86_64-linux-gnu/../src/Processors/QueryPipeline.cpp:191: DB::QueryPipeline::addExtremesTransform() @ 0x30ea5f41 in /workspace/clickhouse
2020.08.26 09:13:35.230240 [ 248 ] {} <Fatal> BaseDaemon: 10. /build/obj-x86_64-linux-gnu/../src/Processors/QueryPlan/ExtremesStep.cpp:31: DB::ExtremesStep::transformPipeline(DB::QueryPipeline&) @ 0x311cc2a9 in /workspace/clickhouse
2020.08.26 09:13:35.292104 [ 248 ] {} <Fatal> BaseDaemon: 11. /build/obj-x86_64-linux-gnu/../src/Processors/QueryPlan/ITransformingStep.cpp:44: DB::ITransformingStep::updatePipeline(std::__1::vector<std::__1::unique_ptr<DB::QueryPipeline, std::__1::default_delete<DB::QueryPipeline> >, std::__1::allocator<std::__1::unique_ptr<DB::QueryPipeline, std::__1::default_delete<DB::QueryPipeline> > > >) @ 0x311df4cf in /workspace/clickhouse
2020.08.26 09:13:35.354345 [ 248 ] {} <Fatal> BaseDaemon: 12. /build/obj-x86_64-linux-gnu/../src/Processors/QueryPlan/QueryPlan.cpp:169: DB::QueryPlan::buildQueryPipeline() @ 0x311fa8c3 in /workspace/clickhouse
2020.08.26 09:13:35.403571 [ 248 ] {} <Fatal> BaseDaemon: 13. /build/obj-x86_64-linux-gnu/../src/Interpreters/InterpreterSelectWithUnionQuery.cpp:208: DB::InterpreterSelectWithUnionQuery::execute() @ 0x30316bf8 in /workspace/clickhouse
2020.08.26 09:13:35.455289 [ 248 ] {} <Fatal> BaseDaemon: 14. /build/obj-x86_64-linux-gnu/../src/Interpreters/executeQuery.cpp:389: DB::executeQueryImpl(char const*, char const*, DB::Context&, bool, DB::QueryProcessingStage::Enum, bool, DB::ReadBuffer*) @ 0x304d5617 in /workspace/clickhouse
2020.08.26 09:13:35.506682 [ 248 ] {} <Fatal> BaseDaemon: 15. /build/obj-x86_64-linux-gnu/../src/Interpreters/executeQuery.cpp:675: DB::executeQuery(std::__1::basic_string<char, std::__1::char_traits<char>, std::__1::allocator<char> > const&, DB::Context&, bool, DB::QueryProcessingStage::Enum, bool) @ 0x304d460a in /workspace/clickhouse
2020.08.26 09:13:35.565299 [ 248 ] {} <Fatal> BaseDaemon: 16. /build/obj-x86_64-linux-gnu/../src/Server/TCPHandler.cpp:253: DB::TCPHandler::runImpl() @ 0x30dfccc6 in /workspace/clickhouse
2020.08.26 09:13:35.623803 [ 248 ] {} <Fatal> BaseDaemon: 17. /build/obj-x86_64-linux-gnu/../src/Server/TCPHandler.cpp:1213: DB::TCPHandler::run() @ 0x30e04158 in /workspace/clickhouse
2020.08.26 09:13:35.690771 [ 248 ] {} <Fatal> BaseDaemon: 18. /build/obj-x86_64-linux-gnu/../contrib/poco/Net/src/TCPServerConnection.cpp:43: Poco::Net::TCPServerConnection::start() @ 0x34a8945c in /workspace/clickhouse
2020.08.26 09:13:35.762628 [ 248 ] {} <Fatal> BaseDaemon: 19. /build/obj-x86_64-linux-gnu/../contrib/poco/Net/src/TCPServerDispatcher.cpp:114: Poco::Net::TCPServerDispatcher::run() @ 0x34a89c6c in /workspace/clickhouse
2020.08.26 09:13:35.831167 [ 248 ] {} <Fatal> BaseDaemon: 20. /build/obj-x86_64-linux-gnu/../contrib/poco/Foundation/src/ThreadPool.cpp:199: Poco::PooledThread::run() @ 0x34bcba73 in /workspace/clickhouse
2020.08.26 09:13:35.877334 [ 70 ] {} <Trace> SystemLog (system.trace_log): Flushing system log, 9 entries to flush
2020.08.26 09:13:35.879794 [ 70 ] {} <Debug> DiskLocal: Reserving 1.00 MiB on disk `default`, having unreserved 433.76 GiB.
2020.08.26 09:13:35.899046 [ 248 ] {} <Fatal> BaseDaemon: 21. /build/obj-x86_64-linux-gnu/../contrib/poco/Foundation/src/Thread.cpp:56: Poco::(anonymous namespace)::RunnableHolder::run() @ 0x34bc89ad in /workspace/clickhouse
```
I found only the CREATE VIEW statement for b
``` sql
2020.08.26 09:13:30.636180 [ 97 ] {22290044-f0f2-4f9d-a8ec-44bccd31e716} <Debug> executeQuery: (from [::1]:36574) CREATE VIEW b AS SELECT * FROM (SELECT * FROM a ANY LEFT JOIN id1 USING (id1)) AS js1 ANY LEFT JOIN id2 USING (id2)
```
https://clickhouse-test-reports.s3.yandex.net/13860/dc9ca2a878fcdae624318994a3a8c306dc74fb2a/fuzzer/server.log
https://clickhouse-test-reports.s3.yandex.net/13860/dc9ca2a878fcdae624318994a3a8c306dc74fb2a/fuzzer/fuzzer.log
https://clickhouse-test-reports.s3.yandex.net/13860/dc9ca2a878fcdae624318994a3a8c306dc74fb2a/fuzzer/main.log
|
process
|
extremes transform was already added to pipeline executequery from select from b contextaccess default access granted select vala on default b hashjoin right sample block size size hashjoin right sample block size size hashjoin right sample block size size interpreterselectquery fetchcolumns complete interpreterselectquery fetchcolumns complete interpreterselectquery fetchcolumns complete logical error extremes transform was already added to pipeline clickhouse server src common exception cpp db exception exception const std string int assertion false failed basedaemon received signal basedaemon basedaemon version build id from thread query id received signal aborted basedaemon basedaemon stack trace basedaemon build glibc glibc signal sysdeps unix sysv linux raise c raise in usr lib debug lib linux gnu libc so basedaemon build glibc glibc stdlib abort c abort in usr lib debug lib linux gnu libc so basedaemon build glibc glibc assert assert c assert fail base in usr lib debug lib linux gnu libc so basedaemon in usr lib debug lib linux gnu libc so basedaemon build obj linux gnu src common exception cpp db exception exception std basic string std allocator const int in workspace clickhouse basedaemon build obj linux gnu src processors querypipeline cpp db querypipeline addextremestransform in workspace clickhouse basedaemon build obj linux gnu src processors queryplan extremesstep cpp db extremesstep transformpipeline db querypipeline in workspace clickhouse basedaemon build obj linux gnu src processors queryplan itransformingstep cpp db itransformingstep updatepipeline std vector std allocator in workspace clickhouse basedaemon build obj linux gnu src processors queryplan queryplan cpp db queryplan buildquerypipeline in workspace clickhouse basedaemon build obj linux gnu src interpreters interpreterselectwithunionquery cpp db interpreterselectwithunionquery execute in workspace clickhouse basedaemon build obj linux gnu src interpreters executequery cpp db executequeryimpl char const char const db context bool db queryprocessingstage enum bool db readbuffer in workspace clickhouse basedaemon build obj linux gnu src interpreters executequery cpp db executequery std basic string std allocator const db context bool db queryprocessingstage enum bool in workspace clickhouse basedaemon build obj linux gnu src server tcphandler cpp db tcphandler runimpl in workspace clickhouse basedaemon build obj linux gnu src server tcphandler cpp db tcphandler run in workspace clickhouse basedaemon build obj linux gnu contrib poco net src tcpserverconnection cpp poco net tcpserverconnection start in workspace clickhouse basedaemon build obj linux gnu contrib poco net src tcpserverdispatcher cpp poco net tcpserverdispatcher run in workspace clickhouse basedaemon build obj linux gnu contrib poco foundation src threadpool cpp poco pooledthread run in workspace clickhouse systemlog system trace log flushing system log entries to flush disklocal reserving mib on disk default having unreserved gib basedaemon build obj linux gnu contrib poco foundation src thread cpp poco anonymous namespace runnableholder run in workspace clickhouse i found only the create view statement for b sql executequery from create view b as select from select from a any left join using as any left join using
| 1
|
641,354
| 20,824,883,042
|
IssuesEvent
|
2022-03-18 19:29:50
|
MoveOnOrg/Spoke
|
https://api.github.com/repos/MoveOnOrg/Spoke
|
closed
|
Feature Request: Indication of Which Organization You Are In
|
A-Admin UI/UX O-NYCET priority
|
**Problem**
With a Multi-Org set up, once you select an org from the drop down, there is no indication of which org you are in besides the organization id in the url. Looking at the UI itself, however, I can't tell which organization I'm in.
**Solution**
Have a header that says the name of the org at the top or highlight the org you are in within the drop down menu.
**Context**
Header could be above the "Campaigns" title or some marker in the drop down menu
<img width="1435" alt="Screen Shot 2020-11-09 at 12 06 31 PM" src="https://user-images.githubusercontent.com/67773100/98572877-2400ed00-2284-11eb-97f1-e26aea25d097.png">
<img width="562" alt="Screen Shot 2020-11-09 at 12 06 41 PM" src="https://user-images.githubusercontent.com/67773100/98572878-2400ed00-2284-11eb-8414-e2d013a8a81d.png">
|
1.0
|
Feature Request: Indication of Which Organization You Are In - **Problem**
With a Multi-Org set up, once you select an org from the drop down, there is no indication of which org you are in besides the organization id in the url. Looking at the UI itself, however, I can't tell which organization I'm in.
**Solution**
Have a header that says the name of the org at the top or highlight the org you are in within the drop down menu.
**Context**
Header could be above the "Campaigns" title or some marker in the drop down menu
<img width="1435" alt="Screen Shot 2020-11-09 at 12 06 31 PM" src="https://user-images.githubusercontent.com/67773100/98572877-2400ed00-2284-11eb-97f1-e26aea25d097.png">
<img width="562" alt="Screen Shot 2020-11-09 at 12 06 41 PM" src="https://user-images.githubusercontent.com/67773100/98572878-2400ed00-2284-11eb-8414-e2d013a8a81d.png">
|
non_process
|
feature request indication of which organization you are in problem with a multi org set up once you select an org from the drop down there is no indication of which org you are in besides the organization id in the url looking at the ui itself however i can t tell which organization i m in solution have a header that says the name of the org at the top or highlight the org you are in within the drop down menu context header could be above the campaigns title or some marker in the drop down menu img width alt screen shot at pm src img width alt screen shot at pm src
| 0
|
15,814
| 20,014,020,865
|
IssuesEvent
|
2022-02-01 10:09:53
|
alphagov/govuk-design-system
|
https://api.github.com/repos/alphagov/govuk-design-system
|
closed
|
Run a team retro post-Design System Day
|
🕔 hours process
|
## What
After Design System Day, run a retro with organisers (Design System team) to reflect on the event.
## Why
To celebrate the successes and identify improvements for future events.
## Who needs to know about this
Community Manager
## Done when
- [x] Retro organised
- [x] Retro run
- [ ] Improvements identified
|
1.0
|
Run a team retro post-Design System Day - ## What
After Design System Day, run a retro with organisers (Design System team) to reflect on the event.
## Why
To celebrate the successes and identify improvements for future events.
## Who needs to know about this
Community Manager
## Done when
- [x] Retro organised
- [x] Retro run
- [ ] Improvements identified
|
process
|
run a team retro post design system day what after design system day run a retro with organisers design system team to reflect on the event why to celebrate the successes and identify improvements for future events who needs to know about this community manager done when retro organised retro run improvements identified
| 1
|
131,820
| 12,491,177,744
|
IssuesEvent
|
2020-06-01 03:07:19
|
Spedcord/issue-tracker
|
https://api.github.com/repos/Spedcord/issue-tracker
|
reopened
|
Credits
|
documentation project: client project: discord-bot project: server
|
This issue addresses everyone who (kind of) helped Spedcord by either contributing or by providing a library.
RealCerus: Developed basically everything (lol)\
[Lukaesebrot](https://github.com/Lukaesebrot): Made a few web pages and made the [javalin-api-library](https://github.com/Lukaesebrot/javalin-api-library)\
[HayateLaTech](https://github.com/HayateLaTech): Made the [OAuth2Discord](https://github.com/HayateLaTech/OAuth2Discord) library\
[JohnnyJayJay](https://github.com/JohnnyJayJay): Made the [discord-api-command](https://github.com/JohnnyJayJay/discord-api-command) library\
[DV8FromTheWorld](https://github.com/DV8FromTheWorld): Made the [JDA](https://github.com/DV8FromTheWorld/JDA) library\
[Google](https://github.com/google): Made the [gson](https://github.com/google/gson) library\
[IgnaceMaes](https://github.com/IgnaceMaes): Made the [MaterialSkin](https://github.com/IgnaceMaes/MaterialSkin) NuGet package\
[Newtonsoft](https://www.newtonsoft.com): Made the [Json.NET](https://www.newtonsoft.com/json) NuGet package\
[restsharp](https://github.com/restsharp): Made the [RestSharp](https://github.com/restsharp/RestSharp) NuGet package
|
1.0
|
Credits - This issue addresses everyone who (kind of) helped Spedcord by either contributing or by providing a library.
RealCerus: Developed basically everything (lol)\
[Lukaesebrot](https://github.com/Lukaesebrot): Made a few web pages and made the [javalin-api-library](https://github.com/Lukaesebrot/javalin-api-library)\
[HayateLaTech](https://github.com/HayateLaTech): Made the [OAuth2Discord](https://github.com/HayateLaTech/OAuth2Discord) library\
[JohnnyJayJay](https://github.com/JohnnyJayJay): Made the [discord-api-command](https://github.com/JohnnyJayJay/discord-api-command) library\
[DV8FromTheWorld](https://github.com/DV8FromTheWorld): Made the [JDA](https://github.com/DV8FromTheWorld/JDA) library\
[Google](https://github.com/google): Made the [gson](https://github.com/google/gson) library\
[IgnaceMaes](https://github.com/IgnaceMaes): Made the [MaterialSkin](https://github.com/IgnaceMaes/MaterialSkin) NuGet package\
[Newtonsoft](https://www.newtonsoft.com): Made the [Json.NET](https://www.newtonsoft.com/json) NuGet package\
[restsharp](https://github.com/restsharp): Made the [RestSharp](https://github.com/restsharp/RestSharp) NuGet package
|
non_process
|
credits this issue addresses everyone who kind of helped spedcord by either contributing or by providing a library realcerus developed basically everything lol made a few web pages and made the made the library made the library made the library made the library made the nuget package made the nuget package made the nuget package
| 0
|
17,749
| 23,663,156,031
|
IssuesEvent
|
2022-08-26 17:42:38
|
radis/radis
|
https://api.github.com/repos/radis/radis
|
reopened
|
add Raman spectra
|
enhancement post-process
|
Use RADIS to post-process Raman spectra:
- [ ] Add Raman as a spectral quantity
- [ ] Add a `raman_spectrum` function
- [ ] Add docs
- [ ] Add examples
A more complex project would be to generate Raman spectra from tabulated data. See discussion on Gitter:
[](https://gitter.im/radis-radiation/community?at=5d641310c8228962accb0e98)
|
1.0
|
add Raman spectra - Use RADIS to post-process Raman spectra:
- [ ] Add Raman as a spectral quantity
- [ ] Add a `raman_spectrum` function
- [ ] Add docs
- [ ] Add examples
A more complex project would be to generate Raman spectra from tabulated data. See discussion on Gitter:
[](https://gitter.im/radis-radiation/community?at=5d641310c8228962accb0e98)
|
process
|
add raman spectra use radis to post process raman spectra add raman as a spectral quantity add a raman spectrum function add docs add examples a more complex project would be to generate raman spectra from tabulated data see discussion on gitter
| 1
|
15,962
| 20,176,951,428
|
IssuesEvent
|
2022-02-10 15:14:21
|
ooi-data/CE09OSSM-SBD12-05-WAVSSA000-telemetered-wavss_a_dcl_motion
|
https://api.github.com/repos/ooi-data/CE09OSSM-SBD12-05-WAVSSA000-telemetered-wavss_a_dcl_motion
|
opened
|
🛑 Processing failed: ValueError
|
process
|
## Overview
`ValueError` found in `processing_task` task during run ended on 2022-02-10T15:14:21.047591.
## Details
Flow name: `CE09OSSM-SBD12-05-WAVSSA000-telemetered-wavss_a_dcl_motion`
Task name: `processing_task`
Error type: `ValueError`
Error message: shape of data to append is not compatible with the array; all dimensions must match except for the dimension being appended
<details>
<summary>Traceback</summary>
```
Traceback (most recent call last):
File "/srv/conda/envs/notebook/lib/python3.9/site-packages/ooi_harvester/processor/pipeline.py", line 157, in processing
process_dataset(
File "/srv/conda/envs/notebook/lib/python3.9/site-packages/ooi_harvester/processor/__init__.py", line 147, in process_dataset
append_to_zarr(mod_ds, store, enc, logger=logger)
File "/srv/conda/envs/notebook/lib/python3.9/site-packages/ooi_harvester/processor/__init__.py", line 357, in append_to_zarr
_append_zarr(store, mod_ds)
File "/srv/conda/envs/notebook/lib/python3.9/site-packages/ooi_harvester/processor/utils.py", line 187, in _append_zarr
existing_arr.append(var_data.values)
File "/srv/conda/envs/notebook/lib/python3.9/site-packages/zarr/core.py", line 2305, in append
return self._write_op(self._append_nosync, data, axis=axis)
File "/srv/conda/envs/notebook/lib/python3.9/site-packages/zarr/core.py", line 2211, in _write_op
return self._synchronized_op(f, *args, **kwargs)
File "/srv/conda/envs/notebook/lib/python3.9/site-packages/zarr/core.py", line 2201, in _synchronized_op
result = f(*args, **kwargs)
File "/srv/conda/envs/notebook/lib/python3.9/site-packages/zarr/core.py", line 2319, in _append_nosync
raise ValueError('shape of data to append is not compatible with the array; '
ValueError: shape of data to append is not compatible with the array; all dimensions must match except for the dimension being appended
```
</details>
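For context (added here, not part of the original failure report), the zarr constraint named in the error can be reproduced in isolation; the array shapes below are made up and unrelated to the OOI dataset:

```python
# Illustrative only: reproduces the zarr append shape constraint from the
# traceback. Shapes are invented and unrelated to the OOI data.
import numpy as np
import zarr

arr = zarr.zeros((0, 4), chunks=(1000, 4), dtype="f8")
arr.append(np.ones((10, 4)))      # OK: only the appended (first) axis differs
try:
    arr.append(np.ones((10, 5)))  # second axis mismatch -> ValueError
except ValueError as err:
    print(err)
```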
|
1.0
|
🛑 Processing failed: ValueError - ## Overview
`ValueError` found in `processing_task` task during run ended on 2022-02-10T15:14:21.047591.
## Details
Flow name: `CE09OSSM-SBD12-05-WAVSSA000-telemetered-wavss_a_dcl_motion`
Task name: `processing_task`
Error type: `ValueError`
Error message: shape of data to append is not compatible with the array; all dimensions must match except for the dimension being appended
<details>
<summary>Traceback</summary>
```
Traceback (most recent call last):
File "/srv/conda/envs/notebook/lib/python3.9/site-packages/ooi_harvester/processor/pipeline.py", line 157, in processing
process_dataset(
File "/srv/conda/envs/notebook/lib/python3.9/site-packages/ooi_harvester/processor/__init__.py", line 147, in process_dataset
append_to_zarr(mod_ds, store, enc, logger=logger)
File "/srv/conda/envs/notebook/lib/python3.9/site-packages/ooi_harvester/processor/__init__.py", line 357, in append_to_zarr
_append_zarr(store, mod_ds)
File "/srv/conda/envs/notebook/lib/python3.9/site-packages/ooi_harvester/processor/utils.py", line 187, in _append_zarr
existing_arr.append(var_data.values)
File "/srv/conda/envs/notebook/lib/python3.9/site-packages/zarr/core.py", line 2305, in append
return self._write_op(self._append_nosync, data, axis=axis)
File "/srv/conda/envs/notebook/lib/python3.9/site-packages/zarr/core.py", line 2211, in _write_op
return self._synchronized_op(f, *args, **kwargs)
File "/srv/conda/envs/notebook/lib/python3.9/site-packages/zarr/core.py", line 2201, in _synchronized_op
result = f(*args, **kwargs)
File "/srv/conda/envs/notebook/lib/python3.9/site-packages/zarr/core.py", line 2319, in _append_nosync
raise ValueError('shape of data to append is not compatible with the array; '
ValueError: shape of data to append is not compatible with the array; all dimensions must match except for the dimension being appended
```
</details>
|
process
|
🛑 processing failed valueerror overview valueerror found in processing task task during run ended on details flow name telemetered wavss a dcl motion task name processing task error type valueerror error message shape of data to append is not compatible with the array all dimensions must match except for the dimension being appended traceback traceback most recent call last file srv conda envs notebook lib site packages ooi harvester processor pipeline py line in processing process dataset file srv conda envs notebook lib site packages ooi harvester processor init py line in process dataset append to zarr mod ds store enc logger logger file srv conda envs notebook lib site packages ooi harvester processor init py line in append to zarr append zarr store mod ds file srv conda envs notebook lib site packages ooi harvester processor utils py line in append zarr existing arr append var data values file srv conda envs notebook lib site packages zarr core py line in append return self write op self append nosync data axis axis file srv conda envs notebook lib site packages zarr core py line in write op return self synchronized op f args kwargs file srv conda envs notebook lib site packages zarr core py line in synchronized op result f args kwargs file srv conda envs notebook lib site packages zarr core py line in append nosync raise valueerror shape of data to append is not compatible with the array valueerror shape of data to append is not compatible with the array all dimensions must match except for the dimension being appended
| 1
|
11,468
| 14,290,241,326
|
IssuesEvent
|
2020-11-23 20:34:08
|
panther-labs/panther
|
https://api.github.com/repos/panther-labs/panther
|
opened
|
Missing input validation for the AWS.S3.Bucket Read and Right Cloud Security Policies
|
bug team:data processing
|
### Describe the bug
The AWS.S3.Bucket Read and Right Cloud Security policies are missing input validations which is causing an error in a customers (DropBox) deployment as their resources are missing key "grants"
### Steps to reproduce
Steps to reproduce the behavior:
1. Add S3 bucket without key grants as a resource
2. See errors generated from AWS.S3.Bucket Read and Right Cloud Security policies
### Expected behavior
`if 'Grants' in resource and resource['Grants'] is not None:`
```
{
  "Arn": "arn:aws:s3:::something-prodsec",
  "LoggingPolicy": null,
  "Policy": null,
  "MFADelete": null,
  "ObjectLockConfiguration": null,
  "LifecycleRules": null,
  "Grants": null,
  "Versioning": null,
  "EncryptionRules": null,
  "PublicAccessBlockConfiguration": null,
  "AccountId": "123456789",
  "Tags": {
    "something": "ttd",
    "user": "prodsec",
    "service": "-",
    "owner": "-",
    "team": "-"
  },
  "Region": "us-west-2",
  "ResourceType": "AWS.S3.Bucket",
  "Name": "something-prodsec",
  "ResourceId": "arn:aws:s3:::something-prodsec",
  "TimeCreated": "2020-06-27T00:11:11Z",
  "Owner": {
    "DisplayName": "aws-corp",
    "ID": "15e2a86ec386eb589bcad5377520a52e0e0717cdc970d2a643d3b60374984dda"
  }
}
```
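A minimal sketch (added for illustration, not Panther's actual policy code) of the guard suggested above; the `policy` function, its return values, and the example grant check are assumptions:

```python
# Hypothetical guard for the AWS.S3.Bucket read/write policies: skip the
# grant checks when the resource carries no usable "Grants" value instead
# of raising. Function name, return values and the check are assumptions.
def policy(resource):
    if 'Grants' not in resource or resource['Grants'] is None:
        return True  # nothing to evaluate; pass rather than error out
    return all(
        grant.get('Permission') != 'FULL_CONTROL'  # example check only
        for grant in resource['Grants']
    )

print(policy({"Arn": "arn:aws:s3:::something-prodsec", "Grants": None}))  # True
```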
### Environment
- Panther version v1.12
### Screenshots


|
1.0
|
Missing input validation for the AWS.S3.Bucket Read and Right Cloud Security Policies - ### Describe the bug
The AWS.S3.Bucket Read and Right Cloud Security policies are missing input validations which is causing an error in a customers (DropBox) deployment as their resources are missing key "grants"
### Steps to reproduce
Steps to reproduce the behavior:
1. Add S3 bucket without key grants as a resource
2. See errors generated from AWS.S3.Bucket Read and Right Cloud Security policies
### Expected behavior
`if 'Grants' in resource and resource['Grants'] is not None:`
```
{
  "Arn": "arn:aws:s3:::something-prodsec",
  "LoggingPolicy": null,
  "Policy": null,
  "MFADelete": null,
  "ObjectLockConfiguration": null,
  "LifecycleRules": null,
  "Grants": null,
  "Versioning": null,
  "EncryptionRules": null,
  "PublicAccessBlockConfiguration": null,
  "AccountId": "123456789",
  "Tags": {
    "something": "ttd",
    "user": "prodsec",
    "service": "-",
    "owner": "-",
    "team": "-"
  },
  "Region": "us-west-2",
  "ResourceType": "AWS.S3.Bucket",
  "Name": "something-prodsec",
  "ResourceId": "arn:aws:s3:::something-prodsec",
  "TimeCreated": "2020-06-27T00:11:11Z",
  "Owner": {
    "DisplayName": "aws-corp",
    "ID": "15e2a86ec386eb589bcad5377520a52e0e0717cdc970d2a643d3b60374984dda"
  }
}
```
### Environment
- Panther version v1.12
### Screenshots


|
process
|
missing input validation for the aws bucket read and right cloud security policies describe the bug the aws bucket read and right cloud security policies are missing input validations which is causing an error in a customers dropbox deployment as their resources are missing key grants steps to reproduce steps to reproduce the behavior add bucket without key grants as a resource see errors generated from aws bucket read and right cloud security policies expected behavior if grants in resource and resource is not none “arn” “arn aws something prodsec” “loggingpolicy” null “policy” null “mfadelete” null “objectlockconfiguration” null “lifecyclerules” null “grants” null “versioning” null “encryptionrules” null “publicaccessblockconfiguration” null “accountid” “ ” “tags” “something” “ttd” “user” “prodsec” “service” “ ” “owner” “ ” “team” “ ” “region” “us west “resourcetype” “aws bucket” “name” “something prodsec” “resourceid” “arn aws something prodsec” “timecreated” “ ” “owner” “displayname” “aws corp” “id” “ ” environment panther version screenshots
| 1
|
1,241
| 3,779,220,109
|
IssuesEvent
|
2016-03-18 06:46:49
|
imperial-photonics/FLIMfit
|
https://api.github.com/repos/imperial-photonics/FLIMfit
|
closed
|
Fails to handle German characters in file names (as reported by Berlin team)
|
bug GlobalProcessingFrontEnd trivial
|
We did a test with the OME format FLIM collected today our LaVision with Convallaria. Your program will not load this format at all, please see the error message below:
Invalid character was detected.
Error using imfinfo (line 100)
Unable to open file "/Users/zoltancseresnyes/Documents/FLIMfit/FLIM-data-Jannike/130702_FLIM fu r Zoltan - OME-Tiff_13-28-39/FLIM fA r Zoltan - OME-Tiff_13-28-39_TDC_C00_xyz-Table Z0000_Time Time0000.ome.tif" for reading.
Error in load_flim_file (line 60)
Error in flim_data_series/load_single (line 68)
Error in flim_data_series_controller/load_single (line 160)
Error in front_end_menu_controller/menu_file_load_single_callback (line 480)
Error in front_end_menu_controller/set_callbacks/@(varargin)obj.menu_file_load_single_callback(varargin{:})
Error using waitfor
Error while evaluating uimenu Callback
Error using flim_data_series/load_data_series (line 42)
Path does not exist
Error in flim_data_series_controller/load_data_series (line 107)
Error in front_end_menu_controller/menu_file_load_tcspc_callback (line 500)
Error in front_end_menu_controller/set_callbacks/@(varargin)obj.menu_file_load_tcspc_callback(varargin{:})
Error using waitfor
Error while evaluating uimenu Callback
Error using imfinfo (line 100)
Unable to open file "/Users/zoltancseresnyes/Documents/FLIMfit/FLIM-data-Jannike/130702_FLIM fu r Zoltan - OME-Tiff_13-28-39/FLIM fA r Zoltan - OME-Tiff_13-28-39_TDC_C00_xyz-Table Z0000_Time Time0000.ome.tif" for reading.
Error in load_flim_file (line 60)
Error in flim_data_series/load_single (line 68)
Error in flim_data_series_controller/load_single (line 160)
Error in front_end_menu_controller/menu_file_load_single_callback (line 480)
Error in front_end_menu_controller/set_callbacks/@(varargin)obj.menu_file_load_single_callback(varargin{:})
Error using waitfor
Error while evaluating uimenu Callback
**Note**: This issue has been automatically migrated from Bitbucket
Created by @imunro on 2013-07-09 16:19:22+00:00, last updated: 2013-08-04 13:42:26+00:00
|
1.0
|
Fails to handle German characters in file names (as reported by Berlin team) - We did a test with the OME format FLIM collected today our LaVision with Convallaria. Your program will not load this format at all, please see the error message below:
Invalid character was detected.
Error using imfinfo (line 100)
Unable to open file "/Users/zoltancseresnyes/Documents/FLIMfit/FLIM-data-Jannike/130702_FLIM fu r Zoltan - OME-Tiff_13-28-39/FLIM fA r Zoltan - OME-Tiff_13-28-39_TDC_C00_xyz-Table Z0000_Time Time0000.ome.tif" for reading.
Error in load_flim_file (line 60)
Error in flim_data_series/load_single (line 68)
Error in flim_data_series_controller/load_single (line 160)
Error in front_end_menu_controller/menu_file_load_single_callback (line 480)
Error in front_end_menu_controller/set_callbacks/@(varargin)obj.menu_file_load_single_callback(varargin{:})
Error using waitfor
Error while evaluating uimenu Callback
Error using flim_data_series/load_data_series (line 42)
Path does not exist
Error in flim_data_series_controller/load_data_series (line 107)
Error in front_end_menu_controller/menu_file_load_tcspc_callback (line 500)
Error in front_end_menu_controller/set_callbacks/@(varargin)obj.menu_file_load_tcspc_callback(varargin{:})
Error using waitfor
Error while evaluating uimenu Callback
Error using imfinfo (line 100)
Unable to open file "/Users/zoltancseresnyes/Documents/FLIMfit/FLIM-data-Jannike/130702_FLIM fu r Zoltan - OME-Tiff_13-28-39/FLIM fA r Zoltan - OME-Tiff_13-28-39_TDC_C00_xyz-Table Z0000_Time Time0000.ome.tif" for reading.
Error in load_flim_file (line 60)
Error in flim_data_series/load_single (line 68)
Error in flim_data_series_controller/load_single (line 160)
Error in front_end_menu_controller/menu_file_load_single_callback (line 480)
Error in front_end_menu_controller/set_callbacks/@(varargin)obj.menu_file_load_single_callback(varargin{:})
Error using waitfor
Error while evaluating uimenu Callback
**Note**: This issue has been automatically migrated from Bitbucket
Created by @imunro on 2013-07-09 16:19:22+00:00, last updated: 2013-08-04 13:42:26+00:00
|
process
|
fails to handle german characters in file names as reported by berlin team we did a test with the ome format flim collected today our lavision with convallaria your program will not load this format at all please see the error message below invalid character was detected error using imfinfo line unable to open file users zoltancseresnyes documents flimfit flim data jannike flim fu r zoltan ome tiff flim fa r zoltan ome tiff tdc xyz table time ome tif for reading error in load flim file line error in flim data series load single line error in flim data series controller load single line error in front end menu controller menu file load single callback line error in front end menu controller set callbacks varargin obj menu file load single callback varargin error using waitfor error while evaluating uimenu callback error using flim data series load data series line path does not exist error in flim data series controller load data series line error in front end menu controller menu file load tcspc callback line error in front end menu controller set callbacks varargin obj menu file load tcspc callback varargin error using waitfor error while evaluating uimenu callback error using imfinfo line unable to open file users zoltancseresnyes documents flimfit flim data jannike flim fu r zoltan ome tiff flim fa r zoltan ome tiff tdc xyz table time ome tif for reading error in load flim file line error in flim data series load single line error in flim data series controller load single line error in front end menu controller menu file load single callback line error in front end menu controller set callbacks varargin obj menu file load single callback varargin error using waitfor error while evaluating uimenu callback note this issue has been automatically migrated from bitbucket created by imunro on last updated
| 1
|
84,013
| 10,346,995,161
|
IssuesEvent
|
2019-09-04 16:21:52
|
Azure/Azure-Functions
|
https://api.github.com/repos/Azure/Azure-Functions
|
closed
|
Cost of pre-warmed instances in premium plan
|
documentation
|
Let's say I chose 2 pre-warmed instances for app scale out. I understand I would be billed for these 2 instance throughout the month. Now, let's assume that the load has increased, and it scales to 5. Now, the [documentation ](https://docs.microsoft.com/en-us/azure/azure-functions/functions-premium-plan#pre-warmed-instances)mentions that it starts warming up next instance in anticipation of next scale operation.
> As the app scales out, it first scales into the pre-warmed instances. Additional instances continue to buffer out and warm immediately in preparation for the next scale operation. By having a buffer of pre-warmed instances, you can effectively avoid cold start latencies.
So, (I think) a 6th instance will start warming up in anticipation (while only 5 are serving actual user requests). Does this 6th instance get charged at this point? OR does it get charged when this is brought into service to serve user requests?
I am trying to estimate the cost, hence the question.
I have read the following docs about pricing of premium plan:
https://azure.microsoft.com/en-us/pricing/details/functions/
https://docs.microsoft.com/en-us/azure/azure-functions/functions-premium-plan
https://github.com/Azure/Azure-Functions/blob/master/functions-premium-plan/overview.md
|
1.0
|
Cost of pre-warmed instances in premium plan - Let's say I chose 2 pre-warmed instances for app scale out. I understand I would be billed for these 2 instance throughout the month. Now, let's assume that the load has increased, and it scales to 5. Now, the [documentation ](https://docs.microsoft.com/en-us/azure/azure-functions/functions-premium-plan#pre-warmed-instances)mentions that it starts warming up next instance in anticipation of next scale operation.
> As the app scales out, it first scales into the pre-warmed instances. Additional instances continue to buffer out and warm immediately in preparation for the next scale operation. By having a buffer of pre-warmed instances, you can effectively avoid cold start latencies.
So, (I think) a 6th instance will start warming up in anticipation (while only 5 are serving actual user requests). Does this 6th instance get charged at this point? OR does it get charged when this is brought into service to serve user requests?
I am trying to estimate the cost, hence the question.
I have read the following docs about pricing of premium plan:
https://azure.microsoft.com/en-us/pricing/details/functions/
https://docs.microsoft.com/en-us/azure/azure-functions/functions-premium-plan
https://github.com/Azure/Azure-Functions/blob/master/functions-premium-plan/overview.md
|
non_process
|
cost of pre warmed instances in premium plan let s say i chose pre warmed instances for app scale out i understand i would be billed for these instance throughout the month now let s assume that the load has increased and it scales to now the that it starts warming up next instance in anticipation of next scale operation as the app scales out it first scales into the pre warmed instances additional instances continue to buffer out and warm immediately in preparation for the next scale operation by having a buffer of pre warmed instances you can effectively avoid cold start latencies so i think a instance will start warming up in anticipation while only are serving actual user requests does this instance get charged at this point or does it get charged when this is brought into service to serve user requests i am trying to estimate the cost hence the question i have read the following docs about pricing of premium plan
| 0
|
20,782
| 27,519,166,891
|
IssuesEvent
|
2023-03-06 14:01:30
|
camunda/issues
|
https://api.github.com/repos/camunda/issues
|
opened
|
Delete selected version of a process definition
|
component:connectors component:operate component:zeebe component:zeebe-process-automation public kind:epic feature-parity potential:8.3 riskAssessment:completed
|
### Value Proposition Statement
Delete Process Definition to free up storage, declutter user interface and prevent errors.
### User Problem
- I have multiple versions of Process Definitions that are not being used anymore. I cannot delete them, which can cause a process to be started accidentally in an older version.
- A bloated database of process data takes too much space
- I cannot stop creation of new process instances when there is a timer start event (cycle)
- Currently, there is no way to delete a process definition from a Zeebe cluster
- During development, I have multiple versions with bugs as I was testing the process
### User Stories
- As a Developer, I can delete a selected version of deployed process definition and all process instances of this version of process definition (1 operation) via Operate UI.
- As a Developer, while using this feature, I can read basic information, what will be deleted
- As a Developer, I can see the progress of deletion in Operations Panel
- As a Developer, I can read the documentation, explaining what is going to happen when I delete a version of process definition
### Implementation Notes
- Together with Process Definition deletion, all dependent data of this process definition version like tasks and decision instances will be deleted - only instances, not definitions. It will affect only child processes, call activity and business rule tasks - nothing in parent instances.
- With this iteration, we'll deliver simplified frontend - the enhancements will be covered in the 4th iteration.
**Why this scope:**
- Delete Process Definition has dependencies with Decisions and Tasks objects so it's more complex to build that
- Frontend part is split into 2 iterations, so we're able to ship it with the simplified frontend and then enhance it to final version
This is the second, out of 4 iterations, to implement the whole feature of **Delete Process and Decision Definition**. More details in [Miro](https://miro.com/app/board/uXjVPNlXkFg=/)
**Iterations:**
1. https://github.com/camunda/product-hub/issues/94
2. https://github.com/camunda/product-hub/issues/615
3. https://github.com/camunda/product-hub/issues/619
4. https://github.com/camunda/product-hub/issues/620
### Breakdown
> This section links to various sub-issues / -tasks contributing to respective epic phase or phase results where appropriate.
#### Discovery phase ##
<!-- Example: link to "Conduct customer interview with xyz" -->
#### Define phase ##
#### Additional security testing
* Test for security login and monitoring failure
* Ensure current permission management remains in effect
*
<!-- Consider: UI, UX, technical design, documentation design -->
<!-- Example: link to "Define User-Journey Flow" or "Define target architecture" -->
Design Planning
* Designer assigned: yes
* Assignee: @gastonpillet01
* Design Brief - https://docs.google.com/document/d/1yFi75aIIUAw6aMcGNGsxGGgkCRD2Sq4IvZxbJOLCXmI/edit#heading=h.c4qtk4282c6g
* Research Brief - https://docs.google.com/document/d/1yFi75aIIUAw6aMcGNGsxGGgkCRD2Sq4IvZxbJOLCXmI/edit
Design Deliverables
* [Design handover](https://www.figma.com/file/S0VWipy5r3KlZ0H1zR1mkb/Delete-Definition?node-id=0%3A1)
* Figma File for HFW - https://www.figma.com/file/S0VWipy5r3KlZ0H1zR1mkb/Delete-Definition?node-id=402%3A2459
* Still to add iterative breakdown
Documentation Planning
<!-- Complex changes must be reviewed during the Define phase by the DRI of Documentation or technical writer. -->
<!-- Briefly describe the anticipated impact to documentation. -->
<!-- Example: "Creates structural changes in docs as UX is reworked." _Add docs reviewer to Epic for feedback._ -->
Risk Management <!-- add link to risk management issue -->
* Risk Class: <!-- e.g. very low | low | medium | high | very high -->
* Risk Treatment: <!-- e.g. avoid | mitigate | transfer | accept -->
#### Implement phase ##
<!-- Example: link to "Implement User Story xyz". Should not only include core implementation, but also documentation. -->
Zeebe Epic
* DRI: @remcowesterhoud
* https://github.com/camunda/zeebe/issues/9576
|
1.0
|
Delete selected version of a process definition - ### Value Proposition Statement
Delete Process Definition to free up storage, declutter user interface and prevent errors.
### User Problem
- I have multiple versions of Process Definitions that are not being used anymore. I cannot delete them, which can cause a process to be started accidentally in an older version.
- A bloated database of process data takes too much space
- I cannot stop creation of new process instances when there is a timer start event (cycle)
- Currently, there is no way to delete a process definition from a Zeebe cluster
- During development, I have multiple versions with bugs as I was testing the process
### User Stories
- As a Developer, I can delete a selected version of deployed process definition and all process instances of this version of process definition (1 operation) via Operate UI.
- As a Developer, while using this feature, I can read basic information, what will be deleted
- As a Developer, I can see the progress of deletion in Operations Panel
- As a Developer, I can read the documentation, explaining what is going to happen when I delete a version of process definition
### Implementation Notes
- Together with Process Definition deletion, all dependent data of this process definition version like tasks and decision instances will be deleted - only instances, not definitions. It will affect only child processes, call activity and business rule tasks - nothing in parent instances.
- With this iteration, we'll deliver simplified frontend - the enhancements will be covered in the 4th iteration.
**Why this scope:**
- Delete Process Definition has dependencies with Decisions and Tasks objects so it's more complex to build that
- Frontend part is split into 2 iterations, so we're able to ship it with the simplified frontend and then enhance it to final version
This is the second, out of 4 iterations, to implement the whole feature of **Delete Process and Decision Definition**. More details in [Miro](https://miro.com/app/board/uXjVPNlXkFg=/)
**Iterations:**
1. https://github.com/camunda/product-hub/issues/94
2. https://github.com/camunda/product-hub/issues/615
3. https://github.com/camunda/product-hub/issues/619
4. https://github.com/camunda/product-hub/issues/620
### Breakdown
> This section links to various sub-issues / -tasks contributing to respective epic phase or phase results where appropriate.
#### Discovery phase ##
<!-- Example: link to "Conduct customer interview with xyz" -->
#### Define phase ##
#### Additional security testing
* Test for security login and monitoring failure
* Ensure current permission management remains in effect
*
<!-- Consider: UI, UX, technical design, documentation design -->
<!-- Example: link to "Define User-Journey Flow" or "Define target architecture" -->
Design Planning
* Designer assigned: yes
* Assignee: @gastonpillet01
* Design Brief - https://docs.google.com/document/d/1yFi75aIIUAw6aMcGNGsxGGgkCRD2Sq4IvZxbJOLCXmI/edit#heading=h.c4qtk4282c6g
* Research Brief - https://docs.google.com/document/d/1yFi75aIIUAw6aMcGNGsxGGgkCRD2Sq4IvZxbJOLCXmI/edit
Design Deliverables
* [Design handover](https://www.figma.com/file/S0VWipy5r3KlZ0H1zR1mkb/Delete-Definition?node-id=0%3A1)
* Figma File for HFW - https://www.figma.com/file/S0VWipy5r3KlZ0H1zR1mkb/Delete-Definition?node-id=402%3A2459
* Still to add iterative breakdown
Documentation Planning
<!-- Complex changes must be reviewed during the Define phase by the DRI of Documentation or technical writer. -->
<!-- Briefly describe the anticipated impact to documentation. -->
<!-- Example: "Creates structural changes in docs as UX is reworked." _Add docs reviewer to Epic for feedback._ -->
Risk Management <!-- add link to risk management issue -->
* Risk Class: <!-- e.g. very low | low | medium | high | very high -->
* Risk Treatment: <!-- e.g. avoid | mitigate | transfer | accept -->
#### Implement phase ##
<!-- Example: link to "Implement User Story xyz". Should not only include core implementation, but also documentation. -->
Zeebe Epic
* DRI: @remcowesterhoud
* https://github.com/camunda/zeebe/issues/9576
|
process
|
delete selected version of a process definition value proposition statement delete process definition to free up storage declutter user interface and prevent errors user problem i have multiple versions of process definitions that are not being used anymore i cannot delete them what can cause accidentally start process in the older version a bloated database of process data takes too much space i cannot stop creation of the new process instances when there is a timer start event cycle currently there is no way to delete process definition form zeebe cluster during development i have multiple versions with bugs as i was testing the process user stories as a developer i can delete a selected version of deployed process definition and all process instances of this version of process definition operation via operate ui as a developer while using this feature i can read basic information what will be deleted as a developer i can see the progress of deletion in operations panel as a developer i can read the documentation explaining what is going to happen when i delete a version of process definition implementation notes together with process definition deletion all dependent data of this process definition version like tasks and decision instances will be deleted only instances not definitions it will affect only child processes call activity and business rule tasks nothing in parent instances with this iteration we ll deliver simplified frontend the enhancements will be covered in the iteration why this scope delete process definition has dependencies with decisions and tasks objects so it s more complex to build that frontend part is split into iterations so we re able to ship it with the simplified frontend and then enhance it to final version this is the second out of iterations to implement the whole feature of delete process and decision definition more details in iterations breakdown this section links to various sub issues tasks contributing to respective epic phase or phase results where appropriate discovery phase define phase additional security testing test for security login and monitoring failure ensure current permission management remains in effect design planning designer assigned yes assignee design brief research brief design deliverables figma file for hfw still to add iterative breakdown documentation planning risk management risk class risk treatment implement phase zeebe epic dri remcowesterhoud
| 1
|
168,916
| 14,176,142,061
|
IssuesEvent
|
2020-11-12 22:55:32
|
sugarlabs/musicblocks
|
https://api.github.com/repos/sugarlabs/musicblocks
|
closed
|
guide artwork is deprecated
|
Issue-Documentation
|
The artwork associated with the widgets in the guide (https://github.com/sugarlabs/musicblocks/blob/master/guide/README.md) is in need of updating.
- [x] drum5.svg
- [x] drum6.svg
- [x] matrix2.svg
- [x] matrix3.svg
- [x] matrix8.svg
- [x] matrix10.svg
- [x] matrix13.svg
- [x] meter2.svg
- [x] mode2.svg
- [x] mode3.svg
- [x] mode4.svg
- [x] mode5.svg
- [x] pitchslider1.svg
- [x] pitchslider2.svg
- [x] pitchslider3.svg
- [x] pitchstaircase1.svg
- [x] pitchstaircase2.svg
- [x] pitchstaircase3.svg
- [x] rhythm2.svg
- [x] rhythm3.svg
- [x] rhythm4.svg
- [x] rhythm8.svg
- [x] status2.svg
- [x] temperament2.svg
- [x] temperament3.svg
- [x] temperament4.svg
- [x] temperament5.svg
- [x] temperament6.svg
- [x] temperament7.svg
- [x] tempo1.svg
- [x] timbre2.svg
- [x] timbre3.svg
- [x] timbre4.svg
- [x] timbre5.svg
- [x] timbre6.svg
- [x] timbre6a.svg
- [x] timbre7.svg
|
1.0
|
guide artwork is deprecated - The artwork associated with the widgets in the guide (https://github.com/sugarlabs/musicblocks/blob/master/guide/README.md) is in need of updating.
- [x] drum5.svg
- [x] drum6.svg
- [x] matrix2.svg
- [x] matrix3.svg
- [x] matrix8.svg
- [x] matrix10.svg
- [x] matrix13.svg
- [x] meter2.svg
- [x] mode2.svg
- [x] mode3.svg
- [x] mode4.svg
- [x] mode5.svg
- [x] pitchslider1.svg
- [x] pitchslider2.svg
- [x] pitchslider3.svg
- [x] pitchstaircase1.svg
- [x] pitchstaircase2.svg
- [x] pitchstaircase3.svg
- [x] rhythm2.svg
- [x] rhythm3.svg
- [x] rhythm4.svg
- [x] rhythm8.svg
- [x] status2.svg
- [x] temperament2.svg
- [x] temperament3.svg
- [x] temperament4.svg
- [x] temperament5.svg
- [x] temperament6.svg
- [x] temperament7.svg
- [x] tempo1.svg
- [x] timbre2.svg
- [x] timbre3.svg
- [x] timbre4.svg
- [x] timbre5.svg
- [x] timbre6.svg
- [x] timbre6a.svg
- [x] timbre7.svg
|
non_process
|
guide artwork is deprecated the artwork associated with the widgets in the guide is in need of updating svg svg svg svg svg svg svg svg svg svg svg svg svg svg svg svg svg svg svg svg svg svg svg svg svg svg svg svg svg svg svg svg svg svg svg svg svg
| 0
|
21,016
| 27,963,020,363
|
IssuesEvent
|
2023-03-24 17:03:31
|
apache/arrow-rs
|
https://api.github.com/repos/apache/arrow-rs
|
closed
|
Release 36.0.0 of arrow/arrow-flight/parquet/parquet-derive (next release after 35.0.0)
|
enhancement development-process
|
Follow on from https://github.com/apache/arrow-rs/issues/3830
- Planned Release Candidate: 2023-03-25
- Planned Release and Publish to crates.io: 2023-03-27
Items (from [dev/release/README.md](https://github.com/apache/arrow-rs/blob/master/dev/release/README.md)):
- [ ] PR to update version and CHANGELOG:
- [ ] Release candidate created:
- [ ] Release candidate approved:
- [ ] Release to crates.io:
- [ ] Make ticket for next release
See full list here:
https://github.com/apache/arrow-rs/compare/35.0.0...master
cc @alamb @tustvold @viirya @iajoiner
|
1.0
|
Release 36.0.0 of arrow/arrow-flight/parquet/parquet-derive (next release after 35.0.0) - Follow on from https://github.com/apache/arrow-rs/issues/3830
- Planned Release Candidate: 2023-03-25
- Planned Release and Publish to crates.io: 2023-03-27
Items (from [dev/release/README.md](https://github.com/apache/arrow-rs/blob/master/dev/release/README.md)):
- [ ] PR to update version and CHANGELOG:
- [ ] Release candidate created:
- [ ] Release candidate approved:
- [ ] Release to crates.io:
- [ ] Make ticket for next release
See full list here:
https://github.com/apache/arrow-rs/compare/35.0.0...master
cc @alamb @tustvold @viirya @iajoiner
|
process
|
release of arrow arrow flight parquet parquet derive next release after follow on from planned release candidate planned release and publish to crates io items from pr to update version and changelog release candidate created release candidate approved release to crates io make ticket for next release see full list here cc alamb tustvold viirya iajoiner
| 1
|
11,935
| 14,706,592,391
|
IssuesEvent
|
2021-01-04 20:07:07
|
modi-w/AutoVersionsDB
|
https://api.github.com/repos/modi-w/AutoVersionsDB
|
opened
|
The process percentage gets to 120% when a rollback is executed
|
area-Core good first issue process-ready-for-implementation type-bug up-for-grab
|
**Describe the bug**
The process percentage gets to 120% when a rollback is executed.
**To Reproduce**
Steps to reproduce the behavior:
1. Create a syntax error in one of the script files.
2. Run sync process
3. The system should run with error and should execute the rollback step.
4. Look at the percentage of the last step, should be above 100%.
|
1.0
|
The process percentage gets to 120% when a rollback is executed - **Describe the bug**
The process percentage gets to 120% when a rollback is executed.
**To Reproduce**
Steps to reproduce the behavior:
1. Create a syntax error in one of the script files.
2. Run sync process
3. The system should run with error and should execute the rollback step.
4. Look at the percentage of the last step, should be above 100%.
|
process
|
the process percentage gets to when a rollback is executed describe the bug the process percentage gets to when a rollback is executed to reproduce steps to reproduce the behavior create a syntax error in one of the script files run sync process the system should run with error and should execute the rollback step look at the percentage of the last step should be above
| 1
|
17,114
| 22,635,049,396
|
IssuesEvent
|
2022-06-30 18:05:04
|
metabase/metabase
|
https://api.github.com/repos/metabase/metabase
|
closed
|
Cannot filter ObjectId column by "Is (not) empty" unless Field Type is "Entity Key"
|
Type:Bug Priority:P3 Database/Mongo Querying/Processor Querying/Parameters & Variables Querying/GUI Drainable:Yes
|
**Describe the bug**
Cannot filter MongoDB ObjectId column by "Is empty" or "Not empty" unless Field Type is "Entity Key".
**Workaround**: Since 0.39.0 it is now possible to create a Custom Expression with the function `isnull([column_name])`
**To Reproduce**
1. Admin > Data Model > MongoDB Sample > Orders > change the Field Type of the `_id` column to "No semantic type"
2. Simple question > Mongo Sample > Orders > filter the `_id` column with "Is empty" or "Not empty" - query fails with error `invalid hexadecimal representation of an ObjectId: []` because it tries to filter by both null and string="", but the string comparison is not allowed.
<details><summary>Full stacktrace</summary>
```
2021-04-24 10:55:36,308 ERROR middleware.catch-exceptions :: Error processing query: null
{:database_id 36,
:started_at #t "2021-04-24T10:55:35.888808+02:00[Europe/Copenhagen]",
:json_query
{:type "query",
:query {:source-table 1376, :filter ["is-empty" ["field" 20263 nil]]},
:database 36,
:parameters [],
:middleware {:js-int-to-string? true, :add-default-userland-constraints? true}},
:native nil,
:status :failed,
:class java.lang.IllegalArgumentException,
:stacktrace
["org.bson.types.ObjectId.parseHexString(ObjectId.java:550)"
"org.bson.types.ObjectId.<init>(ObjectId.java:239)"
"--> driver.mongo.query_processor$eval858$fn__860.invoke(query_processor.clj:230)"
"driver.mongo.query_processor$filter_expr.invokeStatic(query_processor.clj:327)"
"driver.mongo.query_processor$filter_expr.invoke(query_processor.clj:325)"
"driver.mongo.query_processor$eval944$fn__946.invoke(query_processor.clj:340)"
"driver.mongo.query_processor$eval1000$fn__1002.invoke(query_processor.clj:355)"
"driver.mongo.query_processor$handle_filter.invokeStatic(query_processor.clj:379)"
"driver.mongo.query_processor$handle_filter.invoke(query_processor.clj:376)"
"driver.mongo.query_processor$eval1447$generate_aggregation_pipeline__1452$fn__1453$fn__1454.invoke(query_processor.clj:643)"
"driver.mongo.query_processor$eval1447$generate_aggregation_pipeline__1452$fn__1453.invoke(query_processor.clj:642)"
"driver.mongo.query_processor$eval1447$generate_aggregation_pipeline__1452.invoke(query_processor.clj:639)"
"driver.mongo.query_processor$mbql__GT_native.invokeStatic(query_processor.clj:674)"
"driver.mongo.query_processor$mbql__GT_native.invoke(query_processor.clj:667)"
"driver.mongo$eval2738$fn__2739.invoke(mongo.clj:220)"
"query_processor.middleware.mbql_to_native$query__GT_native_form.invokeStatic(mbql_to_native.clj:14)"
"query_processor.middleware.mbql_to_native$query__GT_native_form.invoke(mbql_to_native.clj:9)"
"query_processor.middleware.mbql_to_native$mbql__GT_native$fn__47220.invoke(mbql_to_native.clj:22)"
"query_processor.middleware.check_features$check_features$fn__46462.invoke(check_features.clj:39)"
"query_processor.middleware.limit$limit$fn__47206.invoke(limit.clj:37)"
"query_processor.middleware.cache$maybe_return_cached_results$fn__45914.invoke(cache.clj:211)"
"query_processor.middleware.optimize_temporal_filters$optimize_temporal_filters$fn__47466.invoke(optimize_temporal_filters.clj:204)"
"query_processor.middleware.validate_temporal_bucketing$validate_temporal_bucketing$fn__49396.invoke(validate_temporal_bucketing.clj:50)"
"query_processor.middleware.auto_parse_filter_values$auto_parse_filter_values$fn__45033.invoke(auto_parse_filter_values.clj:43)"
"query_processor.middleware.wrap_value_literals$wrap_value_literals$fn__41378.invoke(wrap_value_literals.clj:161)"
"query_processor.middleware.annotate$add_column_info$fn__41253.invoke(annotate.clj:598)"
"query_processor.middleware.permissions$check_query_permissions$fn__46334.invoke(permissions.clj:81)"
"query_processor.middleware.pre_alias_aggregations$pre_alias_aggregations$fn__48324.invoke(pre_alias_aggregations.clj:40)"
"query_processor.middleware.cumulative_aggregations$handle_cumulative_aggregations$fn__46535.invoke(cumulative_aggregations.clj:60)"
"query_processor.middleware.resolve_joined_fields$resolve_joined_fields$fn__48623.invoke(resolve_joined_fields.clj:102)"
"query_processor.middleware.resolve_joins$resolve_joins$fn__48936.invoke(resolve_joins.clj:171)"
"query_processor.middleware.add_implicit_joins$add_implicit_joins$fn__44609.invoke(add_implicit_joins.clj:190)"
"query_processor.middleware.large_int_id$convert_id_to_string$fn__47170.invoke(large_int_id.clj:59)"
"query_processor.middleware.format_rows$format_rows$fn__47151.invoke(format_rows.clj:74)"
"query_processor.middleware.desugar$desugar$fn__46601.invoke(desugar.clj:21)"
"query_processor.middleware.binning$update_binning_strategy$fn__45420.invoke(binning.clj:227)"
"query_processor.middleware.resolve_fields$resolve_fields$fn__46137.invoke(resolve_fields.clj:34)"
"query_processor.middleware.add_dimension_projections$add_remapping$fn__44258.invoke(add_dimension_projections.clj:314)"
"query_processor.middleware.add_implicit_clauses$add_implicit_clauses$fn__44487.invoke(add_implicit_clauses.clj:147)"
"query_processor.middleware.upgrade_field_literals$upgrade_field_literals$fn__49345.invoke(upgrade_field_literals.clj:40)"
"query_processor.middleware.add_source_metadata$add_source_metadata_for_source_queries$fn__44772.invoke(add_source_metadata.clj:123)"
"query_processor.middleware.reconcile_breakout_and_order_by_bucketing$reconcile_breakout_and_order_by_bucketing$fn__48498.invoke(reconcile_breakout_and_order_by_bucketing.clj:100)"
"query_processor.middleware.auto_bucket_datetimes$auto_bucket_datetimes$fn__44980.invoke(auto_bucket_datetimes.clj:147)"
"query_processor.middleware.resolve_source_table$resolve_source_tables$fn__46184.invoke(resolve_source_table.clj:45)"
"query_processor.middleware.parameters$substitute_parameters$fn__48306.invoke(parameters.clj:111)"
"query_processor.middleware.resolve_referenced$resolve_referenced_card_resources$fn__46236.invoke(resolve_referenced.clj:79)"
"query_processor.middleware.expand_macros$expand_macros$fn__46857.invoke(expand_macros.clj:155)"
"query_processor.middleware.add_timezone_info$add_timezone_info$fn__44781.invoke(add_timezone_info.clj:15)"
"query_processor.middleware.splice_params_in_response$splice_params_in_response$fn__49298.invoke(splice_params_in_response.clj:32)"
"query_processor.middleware.resolve_database_and_driver$resolve_database_and_driver$fn__48509$fn__48513.invoke(resolve_database_and_driver.clj:31)"
"driver$do_with_driver.invokeStatic(driver.clj:60)"
"driver$do_with_driver.invoke(driver.clj:56)"
"query_processor.middleware.resolve_database_and_driver$resolve_database_and_driver$fn__48509.invoke(resolve_database_and_driver.clj:25)"
"query_processor.middleware.fetch_source_query$resolve_card_id_source_tables$fn__47097.invoke(fetch_source_query.clj:274)"
"query_processor.middleware.store$initialize_store$fn__49307$fn__49308.invoke(store.clj:11)"
"query_processor.store$do_with_store.invokeStatic(store.clj:44)"
"query_processor.store$do_with_store.invoke(store.clj:38)"
"query_processor.middleware.store$initialize_store$fn__49307.invoke(store.clj:10)"
"query_processor.middleware.validate$validate_query$fn__49352.invoke(validate.clj:10)"
"query_processor.middleware.normalize_query$normalize$fn__47233.invoke(normalize_query.clj:22)"
"query_processor.middleware.add_rows_truncated$add_rows_truncated$fn__44627.invoke(add_rows_truncated.clj:35)"
"query_processor.middleware.results_metadata$record_and_return_metadata_BANG_$fn__49283.invoke(results_metadata.clj:147)"
"query_processor.middleware.constraints$add_default_userland_constraints$fn__46478.invoke(constraints.clj:42)"
"query_processor.middleware.process_userland_query$process_userland_query$fn__48395.invoke(process_userland_query.clj:135)"
"query_processor.middleware.catch_exceptions$catch_exceptions$fn__46418.invoke(catch_exceptions.clj:173)"
"query_processor.reducible$async_qp$qp_STAR___37956$thunk__37957.invoke(reducible.clj:103)"
"query_processor.reducible$async_qp$qp_STAR___37956.invoke(reducible.clj:109)"
"query_processor.reducible$sync_qp$qp_STAR___37965$fn__37968.invoke(reducible.clj:135)"
"query_processor.reducible$sync_qp$qp_STAR___37965.invoke(reducible.clj:134)"
"query_processor$process_userland_query.invokeStatic(query_processor.clj:239)"
"query_processor$process_userland_query.doInvoke(query_processor.clj:235)"
"query_processor$fn__49442$process_query_and_save_execution_BANG___49451$fn__49454.invoke(query_processor.clj:251)"
"query_processor$fn__49442$process_query_and_save_execution_BANG___49451.invoke(query_processor.clj:243)"
"query_processor$fn__49486$process_query_and_save_with_max_results_constraints_BANG___49495$fn__49498.invoke(query_processor.clj:263)"
"query_processor$fn__49486$process_query_and_save_with_max_results_constraints_BANG___49495.invoke(query_processor.clj:256)"
"api.dataset$run_query_async$fn__55699.invoke(dataset.clj:56)"
"query_processor.streaming$streaming_response_STAR_$fn__55678$fn__55679.invoke(streaming.clj:72)"
"query_processor.streaming$streaming_response_STAR_$fn__55678.invoke(streaming.clj:71)"
"async.streaming_response$do_f_STAR_.invokeStatic(streaming_response.clj:65)"
"async.streaming_response$do_f_STAR_.invoke(streaming_response.clj:63)"
"async.streaming_response$do_f_async$fn__16071.invoke(streaming_response.clj:84)"],
:context :ad-hoc,
:error "invalid hexadecimal representation of an ObjectId: []",
:row_count 0,
:running_time 0,
:preprocessed
{:type :query,
:query
{:source-table 1376,
:filter
[:or
[:=
[:field 20263 nil]
[:value
nil
{:base_type :type/MongoBSONID,
:effective_type :type/MongoBSONID,
:coercion_strategy nil,
:semantic_type nil,
:database_type "org.bson.types.ObjectId",
:name "_id"}]]
[:=
[:field 20263 nil]
[:value
""
{:base_type :type/MongoBSONID,
:effective_type :type/MongoBSONID,
:coercion_strategy nil,
:semantic_type nil,
:database_type "org.bson.types.ObjectId",
:name "_id"}]]],
:fields
[[:field 20263 nil]
[:field 20269 nil]
[:field 20267 nil]
[:field 20264 nil]
[:field 20268 nil]
[:field 20261 nil]
[:field 20266 nil]
[:field 20260 nil]
[:field 20265 nil]
[:field 20262 {:temporal-unit :default}]],
:limit 2000},
:database 36,
:middleware {:js-int-to-string? true, :add-default-userland-constraints? true},
:info
{:executed-by 1,
:context :ad-hoc,
:nested? false,
:query-hash
[-125, -26, -60, 22, -23, -32, -106, -110, 78, 16, -96, -65, -40, 7, 4, 72, -125, -88, -84, -68, 106, 43, 70, 82,
-79, -12, 27, -26, -30, -35, 12, 8]},
:constraints {:max-results 10000, :max-results-bare-rows 2000}},
:data {:rows [], :cols []}}
2021-04-24 10:55:36,317 DEBUG middleware.log :: POST /api/dataset 202 [ASYNC: completed] 432.4 ms (19 DB calls) App DB connections: 0/13 Jetty threads: 3/50 (4 idle, 0 queued) (62 total active threads) Queries in flight: 0 (0 queued)
```
</details>
**Information about your Metabase Installation:**
Tested 0.36.8 thru 0.39.0.1
For history #11134
:arrow_down: Please click the :+1: reaction instead of leaving a `+1` or `update?` comment
|
1.0
|
Cannot filter ObjectId column by "Is (not) empty" unless Field Type is "Entity Key" - **Describe the bug**
Cannot filter MongoDB ObjectId column by "Is empty" or "Not empty" unless Field Type is "Entity Key".
**Workaround**: Since 0.39.0 it is now possible to create a Custom Expression with the function `isnull([column_name])`
**To Reproduce**
1. Admin > Data Model > MongoDB Sample > Orders > change the Field Type of the `_id` column to "No semantic type"
2. Simple question > Mongo Sample > Orders > filter the `_id` column with "Is empty" or "Not empty" - query fails with error `invalid hexadecimal representation of an ObjectId: []` because it tries to filter by both null and string="", but the string comparison is not allowed.
<details><summary>Full stacktrace</summary>
```
2021-04-24 10:55:36,308 ERROR middleware.catch-exceptions :: Error processing query: null
{:database_id 36,
:started_at #t "2021-04-24T10:55:35.888808+02:00[Europe/Copenhagen]",
:json_query
{:type "query",
:query {:source-table 1376, :filter ["is-empty" ["field" 20263 nil]]},
:database 36,
:parameters [],
:middleware {:js-int-to-string? true, :add-default-userland-constraints? true}},
:native nil,
:status :failed,
:class java.lang.IllegalArgumentException,
:stacktrace
["org.bson.types.ObjectId.parseHexString(ObjectId.java:550)"
"org.bson.types.ObjectId.<init>(ObjectId.java:239)"
"--> driver.mongo.query_processor$eval858$fn__860.invoke(query_processor.clj:230)"
"driver.mongo.query_processor$filter_expr.invokeStatic(query_processor.clj:327)"
"driver.mongo.query_processor$filter_expr.invoke(query_processor.clj:325)"
"driver.mongo.query_processor$eval944$fn__946.invoke(query_processor.clj:340)"
"driver.mongo.query_processor$eval1000$fn__1002.invoke(query_processor.clj:355)"
"driver.mongo.query_processor$handle_filter.invokeStatic(query_processor.clj:379)"
"driver.mongo.query_processor$handle_filter.invoke(query_processor.clj:376)"
"driver.mongo.query_processor$eval1447$generate_aggregation_pipeline__1452$fn__1453$fn__1454.invoke(query_processor.clj:643)"
"driver.mongo.query_processor$eval1447$generate_aggregation_pipeline__1452$fn__1453.invoke(query_processor.clj:642)"
"driver.mongo.query_processor$eval1447$generate_aggregation_pipeline__1452.invoke(query_processor.clj:639)"
"driver.mongo.query_processor$mbql__GT_native.invokeStatic(query_processor.clj:674)"
"driver.mongo.query_processor$mbql__GT_native.invoke(query_processor.clj:667)"
"driver.mongo$eval2738$fn__2739.invoke(mongo.clj:220)"
"query_processor.middleware.mbql_to_native$query__GT_native_form.invokeStatic(mbql_to_native.clj:14)"
"query_processor.middleware.mbql_to_native$query__GT_native_form.invoke(mbql_to_native.clj:9)"
"query_processor.middleware.mbql_to_native$mbql__GT_native$fn__47220.invoke(mbql_to_native.clj:22)"
"query_processor.middleware.check_features$check_features$fn__46462.invoke(check_features.clj:39)"
"query_processor.middleware.limit$limit$fn__47206.invoke(limit.clj:37)"
"query_processor.middleware.cache$maybe_return_cached_results$fn__45914.invoke(cache.clj:211)"
"query_processor.middleware.optimize_temporal_filters$optimize_temporal_filters$fn__47466.invoke(optimize_temporal_filters.clj:204)"
"query_processor.middleware.validate_temporal_bucketing$validate_temporal_bucketing$fn__49396.invoke(validate_temporal_bucketing.clj:50)"
"query_processor.middleware.auto_parse_filter_values$auto_parse_filter_values$fn__45033.invoke(auto_parse_filter_values.clj:43)"
"query_processor.middleware.wrap_value_literals$wrap_value_literals$fn__41378.invoke(wrap_value_literals.clj:161)"
"query_processor.middleware.annotate$add_column_info$fn__41253.invoke(annotate.clj:598)"
"query_processor.middleware.permissions$check_query_permissions$fn__46334.invoke(permissions.clj:81)"
"query_processor.middleware.pre_alias_aggregations$pre_alias_aggregations$fn__48324.invoke(pre_alias_aggregations.clj:40)"
"query_processor.middleware.cumulative_aggregations$handle_cumulative_aggregations$fn__46535.invoke(cumulative_aggregations.clj:60)"
"query_processor.middleware.resolve_joined_fields$resolve_joined_fields$fn__48623.invoke(resolve_joined_fields.clj:102)"
"query_processor.middleware.resolve_joins$resolve_joins$fn__48936.invoke(resolve_joins.clj:171)"
"query_processor.middleware.add_implicit_joins$add_implicit_joins$fn__44609.invoke(add_implicit_joins.clj:190)"
"query_processor.middleware.large_int_id$convert_id_to_string$fn__47170.invoke(large_int_id.clj:59)"
"query_processor.middleware.format_rows$format_rows$fn__47151.invoke(format_rows.clj:74)"
"query_processor.middleware.desugar$desugar$fn__46601.invoke(desugar.clj:21)"
"query_processor.middleware.binning$update_binning_strategy$fn__45420.invoke(binning.clj:227)"
"query_processor.middleware.resolve_fields$resolve_fields$fn__46137.invoke(resolve_fields.clj:34)"
"query_processor.middleware.add_dimension_projections$add_remapping$fn__44258.invoke(add_dimension_projections.clj:314)"
"query_processor.middleware.add_implicit_clauses$add_implicit_clauses$fn__44487.invoke(add_implicit_clauses.clj:147)"
"query_processor.middleware.upgrade_field_literals$upgrade_field_literals$fn__49345.invoke(upgrade_field_literals.clj:40)"
"query_processor.middleware.add_source_metadata$add_source_metadata_for_source_queries$fn__44772.invoke(add_source_metadata.clj:123)"
"query_processor.middleware.reconcile_breakout_and_order_by_bucketing$reconcile_breakout_and_order_by_bucketing$fn__48498.invoke(reconcile_breakout_and_order_by_bucketing.clj:100)"
"query_processor.middleware.auto_bucket_datetimes$auto_bucket_datetimes$fn__44980.invoke(auto_bucket_datetimes.clj:147)"
"query_processor.middleware.resolve_source_table$resolve_source_tables$fn__46184.invoke(resolve_source_table.clj:45)"
"query_processor.middleware.parameters$substitute_parameters$fn__48306.invoke(parameters.clj:111)"
"query_processor.middleware.resolve_referenced$resolve_referenced_card_resources$fn__46236.invoke(resolve_referenced.clj:79)"
"query_processor.middleware.expand_macros$expand_macros$fn__46857.invoke(expand_macros.clj:155)"
"query_processor.middleware.add_timezone_info$add_timezone_info$fn__44781.invoke(add_timezone_info.clj:15)"
"query_processor.middleware.splice_params_in_response$splice_params_in_response$fn__49298.invoke(splice_params_in_response.clj:32)"
"query_processor.middleware.resolve_database_and_driver$resolve_database_and_driver$fn__48509$fn__48513.invoke(resolve_database_and_driver.clj:31)"
"driver$do_with_driver.invokeStatic(driver.clj:60)"
"driver$do_with_driver.invoke(driver.clj:56)"
"query_processor.middleware.resolve_database_and_driver$resolve_database_and_driver$fn__48509.invoke(resolve_database_and_driver.clj:25)"
"query_processor.middleware.fetch_source_query$resolve_card_id_source_tables$fn__47097.invoke(fetch_source_query.clj:274)"
"query_processor.middleware.store$initialize_store$fn__49307$fn__49308.invoke(store.clj:11)"
"query_processor.store$do_with_store.invokeStatic(store.clj:44)"
"query_processor.store$do_with_store.invoke(store.clj:38)"
"query_processor.middleware.store$initialize_store$fn__49307.invoke(store.clj:10)"
"query_processor.middleware.validate$validate_query$fn__49352.invoke(validate.clj:10)"
"query_processor.middleware.normalize_query$normalize$fn__47233.invoke(normalize_query.clj:22)"
"query_processor.middleware.add_rows_truncated$add_rows_truncated$fn__44627.invoke(add_rows_truncated.clj:35)"
"query_processor.middleware.results_metadata$record_and_return_metadata_BANG_$fn__49283.invoke(results_metadata.clj:147)"
"query_processor.middleware.constraints$add_default_userland_constraints$fn__46478.invoke(constraints.clj:42)"
"query_processor.middleware.process_userland_query$process_userland_query$fn__48395.invoke(process_userland_query.clj:135)"
"query_processor.middleware.catch_exceptions$catch_exceptions$fn__46418.invoke(catch_exceptions.clj:173)"
"query_processor.reducible$async_qp$qp_STAR___37956$thunk__37957.invoke(reducible.clj:103)"
"query_processor.reducible$async_qp$qp_STAR___37956.invoke(reducible.clj:109)"
"query_processor.reducible$sync_qp$qp_STAR___37965$fn__37968.invoke(reducible.clj:135)"
"query_processor.reducible$sync_qp$qp_STAR___37965.invoke(reducible.clj:134)"
"query_processor$process_userland_query.invokeStatic(query_processor.clj:239)"
"query_processor$process_userland_query.doInvoke(query_processor.clj:235)"
"query_processor$fn__49442$process_query_and_save_execution_BANG___49451$fn__49454.invoke(query_processor.clj:251)"
"query_processor$fn__49442$process_query_and_save_execution_BANG___49451.invoke(query_processor.clj:243)"
"query_processor$fn__49486$process_query_and_save_with_max_results_constraints_BANG___49495$fn__49498.invoke(query_processor.clj:263)"
"query_processor$fn__49486$process_query_and_save_with_max_results_constraints_BANG___49495.invoke(query_processor.clj:256)"
"api.dataset$run_query_async$fn__55699.invoke(dataset.clj:56)"
"query_processor.streaming$streaming_response_STAR_$fn__55678$fn__55679.invoke(streaming.clj:72)"
"query_processor.streaming$streaming_response_STAR_$fn__55678.invoke(streaming.clj:71)"
"async.streaming_response$do_f_STAR_.invokeStatic(streaming_response.clj:65)"
"async.streaming_response$do_f_STAR_.invoke(streaming_response.clj:63)"
"async.streaming_response$do_f_async$fn__16071.invoke(streaming_response.clj:84)"],
:context :ad-hoc,
:error "invalid hexadecimal representation of an ObjectId: []",
:row_count 0,
:running_time 0,
:preprocessed
{:type :query,
:query
{:source-table 1376,
:filter
[:or
[:=
[:field 20263 nil]
[:value
nil
{:base_type :type/MongoBSONID,
:effective_type :type/MongoBSONID,
:coercion_strategy nil,
:semantic_type nil,
:database_type "org.bson.types.ObjectId",
:name "_id"}]]
[:=
[:field 20263 nil]
[:value
""
{:base_type :type/MongoBSONID,
:effective_type :type/MongoBSONID,
:coercion_strategy nil,
:semantic_type nil,
:database_type "org.bson.types.ObjectId",
:name "_id"}]]],
:fields
[[:field 20263 nil]
[:field 20269 nil]
[:field 20267 nil]
[:field 20264 nil]
[:field 20268 nil]
[:field 20261 nil]
[:field 20266 nil]
[:field 20260 nil]
[:field 20265 nil]
[:field 20262 {:temporal-unit :default}]],
:limit 2000},
:database 36,
:middleware {:js-int-to-string? true, :add-default-userland-constraints? true},
:info
{:executed-by 1,
:context :ad-hoc,
:nested? false,
:query-hash
[-125, -26, -60, 22, -23, -32, -106, -110, 78, 16, -96, -65, -40, 7, 4, 72, -125, -88, -84, -68, 106, 43, 70, 82,
-79, -12, 27, -26, -30, -35, 12, 8]},
:constraints {:max-results 10000, :max-results-bare-rows 2000}},
:data {:rows [], :cols []}}
2021-04-24 10:55:36,317 DEBUG middleware.log :: POST /api/dataset 202 [ASYNC: completed] 432.4 ms (19 DB calls) App DB connections: 0/13 Jetty threads: 3/50 (4 idle, 0 queued) (62 total active threads) Queries in flight: 0 (0 queued)
```
</details>
**Information about your Metabase Installation:**
Tested 0.36.8 thru 0.39.0.1
For history #11134
:arrow_down: Please click the :+1: reaction instead of leaving a `+1` or `update?` comment
|
process
|
cannot filter objectid column by is not empty unless field type is entity key describe the bug cannot filter mongodb objectid column by is empty or not empty unless field type is entity key workaround since it is now possible to create a custom expression with the function isnull to reproduce admin data model mongodb sample orders change the field type of the id column to no semantic type simple question mongo sample orders filter the id column with is empty or not empty query fails with error invalid hexadecimal representation of an objectid because it tries to filter by both null and string but the string comparison is not allowed full stacktrace error middleware catch exceptions error processing query null database id started at t json query type query query source table filter database parameters middleware js int to string true add default userland constraints true native nil status failed class java lang illegalargumentexception stacktrace org bson types objectid parsehexstring objectid java org bson types objectid objectid java driver mongo query processor fn invoke query processor clj driver mongo query processor filter expr invokestatic query processor clj driver mongo query processor filter expr invoke query processor clj driver mongo query processor fn invoke query processor clj driver mongo query processor fn invoke query processor clj driver mongo query processor handle filter invokestatic query processor clj driver mongo query processor handle filter invoke query processor clj driver mongo query processor generate aggregation pipeline fn fn invoke query processor clj driver mongo query processor generate aggregation pipeline fn invoke query processor clj driver mongo query processor generate aggregation pipeline invoke query processor clj driver mongo query processor mbql gt native invokestatic query processor clj driver mongo query processor mbql gt native invoke query processor clj driver mongo fn invoke mongo clj query processor middleware mbql to native query gt native form invokestatic mbql to native clj query processor middleware mbql to native query gt native form invoke mbql to native clj query processor middleware mbql to native mbql gt native fn invoke mbql to native clj query processor middleware check features check features fn invoke check features clj query processor middleware limit limit fn invoke limit clj query processor middleware cache maybe return cached results fn invoke cache clj query processor middleware optimize temporal filters optimize temporal filters fn invoke optimize temporal filters clj query processor middleware validate temporal bucketing validate temporal bucketing fn invoke validate temporal bucketing clj query processor middleware auto parse filter values auto parse filter values fn invoke auto parse filter values clj query processor middleware wrap value literals wrap value literals fn invoke wrap value literals clj query processor middleware annotate add column info fn invoke annotate clj query processor middleware permissions check query permissions fn invoke permissions clj query processor middleware pre alias aggregations pre alias aggregations fn invoke pre alias aggregations clj query processor middleware cumulative aggregations handle cumulative aggregations fn invoke cumulative aggregations clj query processor middleware resolve joined fields resolve joined fields fn invoke resolve joined fields clj query processor middleware resolve joins resolve joins fn invoke resolve joins clj query processor middleware add implicit joins add 
implicit joins fn invoke add implicit joins clj query processor middleware large int id convert id to string fn invoke large int id clj query processor middleware format rows format rows fn invoke format rows clj query processor middleware desugar desugar fn invoke desugar clj query processor middleware binning update binning strategy fn invoke binning clj query processor middleware resolve fields resolve fields fn invoke resolve fields clj query processor middleware add dimension projections add remapping fn invoke add dimension projections clj query processor middleware add implicit clauses add implicit clauses fn invoke add implicit clauses clj query processor middleware upgrade field literals upgrade field literals fn invoke upgrade field literals clj query processor middleware add source metadata add source metadata for source queries fn invoke add source metadata clj query processor middleware reconcile breakout and order by bucketing reconcile breakout and order by bucketing fn invoke reconcile breakout and order by bucketing clj query processor middleware auto bucket datetimes auto bucket datetimes fn invoke auto bucket datetimes clj query processor middleware resolve source table resolve source tables fn invoke resolve source table clj query processor middleware parameters substitute parameters fn invoke parameters clj query processor middleware resolve referenced resolve referenced card resources fn invoke resolve referenced clj query processor middleware expand macros expand macros fn invoke expand macros clj query processor middleware add timezone info add timezone info fn invoke add timezone info clj query processor middleware splice params in response splice params in response fn invoke splice params in response clj query processor middleware resolve database and driver resolve database and driver fn fn invoke resolve database and driver clj driver do with driver invokestatic driver clj driver do with driver invoke driver clj query processor middleware resolve database and driver resolve database and driver fn invoke resolve database and driver clj query processor middleware fetch source query resolve card id source tables fn invoke fetch source query clj query processor middleware store initialize store fn fn invoke store clj query processor store do with store invokestatic store clj query processor store do with store invoke store clj query processor middleware store initialize store fn invoke store clj query processor middleware validate validate query fn invoke validate clj query processor middleware normalize query normalize fn invoke normalize query clj query processor middleware add rows truncated add rows truncated fn invoke add rows truncated clj query processor middleware results metadata record and return metadata bang fn invoke results metadata clj query processor middleware constraints add default userland constraints fn invoke constraints clj query processor middleware process userland query process userland query fn invoke process userland query clj query processor middleware catch exceptions catch exceptions fn invoke catch exceptions clj query processor reducible async qp qp star thunk invoke reducible clj query processor reducible async qp qp star invoke reducible clj query processor reducible sync qp qp star fn invoke reducible clj query processor reducible sync qp qp star invoke reducible clj query processor process userland query invokestatic query processor clj query processor process userland query doinvoke query processor clj query processor fn process 
query and save execution bang fn invoke query processor clj query processor fn process query and save execution bang invoke query processor clj query processor fn process query and save with max results constraints bang fn invoke query processor clj query processor fn process query and save with max results constraints bang invoke query processor clj api dataset run query async fn invoke dataset clj query processor streaming streaming response star fn fn invoke streaming clj query processor streaming streaming response star fn invoke streaming clj async streaming response do f star invokestatic streaming response clj async streaming response do f star invoke streaming response clj async streaming response do f async fn invoke streaming response clj context ad hoc error invalid hexadecimal representation of an objectid row count running time preprocessed type query query source table filter or value nil base type type mongobsonid effective type type mongobsonid coercion strategy nil semantic type nil database type org bson types objectid name id value base type type mongobsonid effective type type mongobsonid coercion strategy nil semantic type nil database type org bson types objectid name id fields limit database middleware js int to string true add default userland constraints true info executed by context ad hoc nested false query hash constraints max results max results bare rows data rows cols debug middleware log post api dataset ms db calls app db connections jetty threads idle queued total active threads queries in flight queued information about your metabase installation tested thru for history arrow down please click the reaction instead of leaving a or update comment
| 1
|
8,110
| 11,300,983,657
|
IssuesEvent
|
2020-01-17 14:43:59
|
geneontology/go-ontology
|
https://api.github.com/repos/geneontology/go-ontology
|
closed
|
too many penetration peg terms
|
multi-species process
|
GO:0075057 initiation of symbiont penetration peg
GO:0075058 modulation of symbiont penetration peg initiation
GO:0075060 negative regulation of symbiont penetration peg initiation
GO:0075059 positive regulation of symbiont penetration peg initiation
GO:0075053 formation of symbiont penetration peg for entry into host
GO:0075054 modulation of symbiont penetration peg formation for entry into host
GO:0075055 positive regulation of symbiont penetration peg formation for entry into host
GO:0075056 negative regulation of symbiont penetration peg formation for entry into host
please merge "initiation" into formation for entry into host
You only form a penetration peg for entry into host.
We have 2 sets of terms for forming it...
|
1.0
|
too many penetration peg terms -
GO:0075057 initiation of symbiont penetration peg
GO:0075058 modulation of symbiont penetration peg initiation
GO:0075060 negative regulation of symbiont penetration peg initiation
GO:0075059 positive regulation of symbiont penetration peg initiation
GO:0075053 formation of symbiont penetration peg for entry into host
GO:0075054 modulation of symbiont penetration peg formation for entry into host
GO:0075055 positive regulation of symbiont penetration peg formation for entry into host
GO:0075056 negative regulation of symbiont penetration peg formation for entry into host
please merge "initiation" into formation for entry into host
You only form a penetration peg for entry into host.
We have 2 sets of terms for forming it...
|
process
|
too many penetration peg terms go initiation of symbiont penetration peg go modulation of symbiont penetration peg initiation go negative regulation of symbiont penetration peg initiation go positive regulation of symbiont penetration peg initiation go formation of symbiont penetration peg for entry into host go modulation of symbiont penetration peg formation for entry into host go positive regulation of symbiont penetration peg formation for entry into host go negative regulation of symbiont penetration peg formation for entry into host please merge initiation into formation for entry into host you only form a penetration peg for entry into host we have sets of terms for forming it
| 1
|
8,507
| 11,686,393,434
|
IssuesEvent
|
2020-03-05 10:47:10
|
tzaiyang/blog-comments
|
https://api.github.com/repos/tzaiyang/blog-comments
|
opened
|
Processes and Threads (进程与线程)
|
/2017/03/05/Processes-and-Threads/ Gitalk
|
https://tzaiyang.me/2017/03/05/Processes-and-Threads/
Basic concepts: a process is the encapsulation of a program at run time; it is the basic unit by which the system schedules and allocates resources, and it is what gives the operating system its concurrency. A thread is a subtask of a process and the basic unit of CPU scheduling and dispatch; it is used to keep a program responsive and provides concurrency inside a process. A thread is the smallest unit of execution and scheduling that the operating system can recognize. Each thread occupies its own virtual processor: its own register set, instruction counter, and processor state. Each thread carries out a different task, but all threads share the same address space (that is, the same dynamic memory, mapped files, object code, and so on) …
|
1.0
|
Processes and Threads (进程与线程) - https://tzaiyang.me/2017/03/05/Processes-and-Threads/
Basic concepts: a process is the encapsulation of a program at run time; it is the basic unit by which the system schedules and allocates resources, and it is what gives the operating system its concurrency. A thread is a subtask of a process and the basic unit of CPU scheduling and dispatch; it is used to keep a program responsive and provides concurrency inside a process. A thread is the smallest unit of execution and scheduling that the operating system can recognize. Each thread occupies its own virtual processor: its own register set, instruction counter, and processor state. Each thread carries out a different task, but all threads share the same address space (that is, the same dynamic memory, mapped files, object code, and so on) …
|
process
|
进程与线程 基本概念进程是对运行时程序的封装,是系统进行资源调度和分配的的基本单位,实现了操作系统的并发; 线程是进程的子任务,是cpu调度和分派的基本单位,用于保证程序的实时性,实现进程内部的并发;线程是操作系统可识别的最小执行和调度单位。每个线程都独自占用一个虚拟处理器:独自的寄存器组,指令计数器和处理器状态。每个线程完成不同的任务,但是共享同一地址空间(也就是同样的动态内存,映射文件,目标代码等等),打
| 1
|
268,306
| 20,266,669,188
|
IssuesEvent
|
2022-02-15 12:44:37
|
fga-eps-mds/Projeto01
|
https://api.github.com/repos/fga-eps-mds/Projeto01
|
closed
|
Sprint 2 - Documentation - Functional and Non-Functional Requirements
|
documentation Grupo 3 Grupo 1 Grupo 4
|
# Description
Define the project requirements, both functional and non-functional.
# Tasks
Carry out the following steps:
- [x] Define the Functional Requirements.
- [x] Define the Non-Functional Requirements.
- [ ] All teams review and agree on all the requirements.
# Acceptance criteria
- [ ] Post the Requirements Document.
- [ ] Explain the Requirements where necessary.
|
1.0
|
Sprint 2 - Documentation - Functional and Non-Functional Requirements - # Description
Define the project requirements, both functional and non-functional.
# Tasks
Carry out the following steps:
- [x] Define the Functional Requirements.
- [x] Define the Non-Functional Requirements.
- [ ] All teams review and agree on all the requirements.
# Acceptance criteria
- [ ] Post the Requirements Document.
- [ ] Explain the Requirements where necessary.
|
non_process
|
sprint documentação requisitos funcionais e não funcionais descrição definir os requisitos do projeto tanto funcionais como não funcionais tarefas realizar os seguintes passos definir os requisitos funcionais definir os requisitos não funcionais todas as equipes revisarem e estarem de acordo com todos os requisitos critérios de aceitação postagem do documento de requisitos caso seja necessário explicar os requisitos
| 0
|
7,752
| 10,866,411,238
|
IssuesEvent
|
2019-11-14 21:11:19
|
pacificclimate/climate-explorer-data-prep
|
https://api.github.com/repos/pacificclimate/climate-explorer-data-prep
|
opened
|
Fill in missing HadGEM2 climatologies
|
process new data
|
nchelpers has been updated to handle the fact that HadGEM2 datasets stop on December 30, 2099 instead of December 30 2100. It is not possible to generate some missing degree day and climdex climatologies.
|
1.0
|
Fill in missing HadGEM2 climatologies - nchelpers has been updated to handle the fact that HadGEM2 datasets stop on December 30, 2099 instead of December 30 2100. It is not possible to generate some missing degree day and climdex climatologies.
|
process
|
fill in missing climatologies nchelpers has been updated to handle the fact that datasets stop on december instead of december it is not possible to generate some missing degree day and climdex climatologies
| 1
|
5,546
| 8,392,998,463
|
IssuesEvent
|
2018-10-09 19:13:59
|
googleapis/google-cloud-python
|
https://api.github.com/repos/googleapis/google-cloud-python
|
closed
|
Logging: 'DeadlineExceeded' when tearing down logger in systests
|
api: logging flaky testing type: process
|
From: https://circleci.com/gh/GoogleCloudPlatform/google-cloud-python/8431 (multiple occurrences, only first shown):
```python
__________________________ TestLogging.test_log_text ___________________________
self = <test_system.TestLogging testMethod=test_log_text>
def tearDown(self):
retry = RetryErrors((NotFound, TooManyRequests), max_tries=9)
for doomed in self.to_delete:
try:
> retry(doomed.delete)()
tests/system/test_system.py:110:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
../test_utils/test_utils/retry.py:95: in wrapped_function
return to_wrap(*args, **kwargs)
google/cloud/logging/logger.py:353: in delete
client.logging_api.logger_delete(self.project, self.name)
google/cloud/logging/_gapic.py:136: in logger_delete
self._gapic_api.delete_log(path)
google/cloud/logging_v2/gapic/logging_service_v2_client.py:222: in delete_log
request, retry=retry, timeout=timeout, metadata=metadata)
../api_core/google/api_core/gapic_v1/method.py:139: in __call__
return wrapped_func(*args, **kwargs)
../api_core/google/api_core/retry.py:260: in retry_wrapped_func
on_error=on_error,
../api_core/google/api_core/retry.py:195: in retry_target
last_exc)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
value = RetryError(u'Deadline of 90.0s exceeded while calling <functools.partial object at 0x7f0be1c5ba48>',)
from_value = InternalServerError('Internal error encountered.',)
def raise_from(value, from_value):
> raise value
E RetryError: Deadline of 90.0s exceeded while calling <functools.partial object at 0x7f0be1c5ba48>, last exception: 500 Internal error encountered.
../.nox/sys-2-7/lib/python2.7/site-packages/six.py:737: RetryError
___________________ TestLogging.test_log_text_with_resource ____________________
self = <test_system.TestLogging testMethod=test_log_text_with_resource>
def tearDown(self):
retry = RetryErrors((NotFound, TooManyRequests), max_tries=9)
for doomed in self.to_delete:
try:
> retry(doomed.delete)()
tests/system/test_system.py:110:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
../test_utils/test_utils/retry.py:95: in wrapped_function
return to_wrap(*args, **kwargs)
google/cloud/logging/logger.py:353: in delete
client.logging_api.logger_delete(self.project, self.name)
google/cloud/logging/_gapic.py:136: in logger_delete
self._gapic_api.delete_log(path)
google/cloud/logging_v2/gapic/logging_service_v2_client.py:222: in delete_log
request, retry=retry, timeout=timeout, metadata=metadata)
../api_core/google/api_core/gapic_v1/method.py:139: in __call__
return wrapped_func(*args, **kwargs)
../api_core/google/api_core/retry.py:260: in retry_wrapped_func
on_error=on_error,
../api_core/google/api_core/retry.py:195: in retry_target
last_exc)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
value = RetryError(u'Deadline of 90.0s exceeded while calling <functools.partial object at 0x7f0be1b09a48>',)
from_value = DeadlineExceeded('Deadline Exceeded',)
def raise_from(value, from_value):
> raise value
E RetryError: Deadline of 90.0s exceeded while calling <functools.partial object at 0x7f0be1b09a48>, last exception: 504 Deadline Exceeded
../.nox/sys-2-7/lib/python2.7/site-packages/six.py:737: RetryError
```
|
1.0
|
Logging: 'DeadlineExceeded' when tearing down logger in systests - From: https://circleci.com/gh/GoogleCloudPlatform/google-cloud-python/8431 (multiple occurrences, only first shown):
```python
__________________________ TestLogging.test_log_text ___________________________
self = <test_system.TestLogging testMethod=test_log_text>
def tearDown(self):
retry = RetryErrors((NotFound, TooManyRequests), max_tries=9)
for doomed in self.to_delete:
try:
> retry(doomed.delete)()
tests/system/test_system.py:110:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
../test_utils/test_utils/retry.py:95: in wrapped_function
return to_wrap(*args, **kwargs)
google/cloud/logging/logger.py:353: in delete
client.logging_api.logger_delete(self.project, self.name)
google/cloud/logging/_gapic.py:136: in logger_delete
self._gapic_api.delete_log(path)
google/cloud/logging_v2/gapic/logging_service_v2_client.py:222: in delete_log
request, retry=retry, timeout=timeout, metadata=metadata)
../api_core/google/api_core/gapic_v1/method.py:139: in __call__
return wrapped_func(*args, **kwargs)
../api_core/google/api_core/retry.py:260: in retry_wrapped_func
on_error=on_error,
../api_core/google/api_core/retry.py:195: in retry_target
last_exc)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
value = RetryError(u'Deadline of 90.0s exceeded while calling <functools.partial object at 0x7f0be1c5ba48>',)
from_value = InternalServerError('Internal error encountered.',)
def raise_from(value, from_value):
> raise value
E RetryError: Deadline of 90.0s exceeded while calling <functools.partial object at 0x7f0be1c5ba48>, last exception: 500 Internal error encountered.
../.nox/sys-2-7/lib/python2.7/site-packages/six.py:737: RetryError
___________________ TestLogging.test_log_text_with_resource ____________________
self = <test_system.TestLogging testMethod=test_log_text_with_resource>
def tearDown(self):
retry = RetryErrors((NotFound, TooManyRequests), max_tries=9)
for doomed in self.to_delete:
try:
> retry(doomed.delete)()
tests/system/test_system.py:110:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
../test_utils/test_utils/retry.py:95: in wrapped_function
return to_wrap(*args, **kwargs)
google/cloud/logging/logger.py:353: in delete
client.logging_api.logger_delete(self.project, self.name)
google/cloud/logging/_gapic.py:136: in logger_delete
self._gapic_api.delete_log(path)
google/cloud/logging_v2/gapic/logging_service_v2_client.py:222: in delete_log
request, retry=retry, timeout=timeout, metadata=metadata)
../api_core/google/api_core/gapic_v1/method.py:139: in __call__
return wrapped_func(*args, **kwargs)
../api_core/google/api_core/retry.py:260: in retry_wrapped_func
on_error=on_error,
../api_core/google/api_core/retry.py:195: in retry_target
last_exc)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
value = RetryError(u'Deadline of 90.0s exceeded while calling <functools.partial object at 0x7f0be1b09a48>',)
from_value = DeadlineExceeded('Deadline Exceeded',)
def raise_from(value, from_value):
> raise value
E RetryError: Deadline of 90.0s exceeded while calling <functools.partial object at 0x7f0be1b09a48>, last exception: 504 Deadline Exceeded
../.nox/sys-2-7/lib/python2.7/site-packages/six.py:737: RetryError
```
|
process
|
logging deadlineexceded when tearing down logger in systests from multiple occurences only first shown python testlogging test log text self def teardown self retry retryerrors notfound toomanyrequests max tries for doomed in self to delete try retry doomed delete tests system test system py test utils test utils retry py in wrapped function return to wrap args kwargs google cloud logging logger py in delete client logging api logger delete self project self name google cloud logging gapic py in logger delete self gapic api delete log path google cloud logging gapic logging service client py in delete log request retry retry timeout timeout metadata metadata api core google api core gapic method py in call return wrapped func args kwargs api core google api core retry py in retry wrapped func on error on error api core google api core retry py in retry target last exc value retryerror u deadline of exceeded while calling from value internalservererror internal error encountered def raise from value from value raise value e retryerror deadline of exceeded while calling last exception internal error encountered nox sys lib site packages six py retryerror testlogging test log text with resource self def teardown self retry retryerrors notfound toomanyrequests max tries for doomed in self to delete try retry doomed delete tests system test system py test utils test utils retry py in wrapped function return to wrap args kwargs google cloud logging logger py in delete client logging api logger delete self project self name google cloud logging gapic py in logger delete self gapic api delete log path google cloud logging gapic logging service client py in delete log request retry retry timeout timeout metadata metadata api core google api core gapic method py in call return wrapped func args kwargs api core google api core retry py in retry wrapped func on error on error api core google api core retry py in retry target last exc value retryerror u deadline of exceeded while calling from value deadlineexceeded deadline exceeded def raise from value from value raise value e retryerror deadline of exceeded while calling last exception deadline exceeded nox sys lib site packages six py retryerror
| 1
|
215,135
| 16,592,743,745
|
IssuesEvent
|
2021-06-01 09:42:00
|
scikit-learn/scikit-learn
|
https://api.github.com/repos/scikit-learn/scikit-learn
|
closed
|
Sparse PCA optimization task
|
Documentation module:decomposition
|
#### Describe the issue linked to the documentation
1) In user guide for [Sparse PCA](https://scikit-learn.org/stable/modules/decomposition.html#sparse-principal-components-analysis-sparsepca-and-minibatchsparsepca) and [Dictionary learning](https://scikit-learn.org/stable/modules/decomposition.html#generic-dictionary-learning) we minimize 1/2 ||X-UV||_2^2 + \alpha ||V||_1. Do we really use the Euclidean norm in the first term? It seems that it is the Frobenius norm, not the Euclidean norm.
2) Also I wonder do we really penalize matrix V in sparse PCA? Because it seems for me more logical to penalize V^T because L_1 norm penalizes columns (and principal component directions are located in rows of matrix V).
#### Suggest a potential alternative/fix
1) Change ||X-UV||_2^2 to ||X-UV||\_{\text{Fro}}^2 in documentation for Sparse PCA and Dictionary learning.
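For reference, a sketch of the suggested notation, using the same symbols X, U, V and \alpha as above (the Frobenius norm is simply the entry-wise sum of squares):
```latex
% Suggested form of the objective with the Frobenius norm made explicit
\min_{U,V}\; \frac{1}{2}\,\|X - UV\|_{\text{Fro}}^{2} + \alpha \|V\|_{1},
\qquad
\|X - UV\|_{\text{Fro}}^{2} = \sum_{i,j}\bigl(X_{ij} - (UV)_{ij}\bigr)^{2}.
```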
|
1.0
|
Sparse PCA optimization task - #### Describe the issue linked to the documentation
1) In user guide for [Sparse PCA](https://scikit-learn.org/stable/modules/decomposition.html#sparse-principal-components-analysis-sparsepca-and-minibatchsparsepca) and [Dictionary learning](https://scikit-learn.org/stable/modules/decomposition.html#generic-dictionary-learning) we minimize 1/2 ||X-UV||_2^2 + \alpha ||V||_1. Do we really use the Euclidean norm in the first term? It seems that it is the Frobenius norm, not the Euclidean norm.
2) Also I wonder do we really penalize matrix V in sparse PCA? Because it seems for me more logical to penalize V^T because L_1 norm penalizes columns (and principal component directions are located in rows of matrix V).
#### Suggest a potential alternative/fix
1) Change ||X-UV||_2^2 to ||X-UV||\_{\text{Fro}}^2 in documentation for Sparse PCA and Dictionary learning.
|
non_process
|
sparse pca optimization task describe the issue linked to the documentation in user guide for and we minimize x uv alpha v do we really use euclidian norm in the first term it seems that it is frobenius norm not euclidian also i wonder do we really penalize matrix v in sparse pca because it seems for me more logical to penalize v t because l norm penalizes columns and principal component directions are located in rows of matrix v suggest a potential alternative fix change x uv to x uv text fro in documentation for sparse pca and dictionary learning
| 0
|
382
| 2,823,574,407
|
IssuesEvent
|
2015-05-21 09:39:50
|
austundag/testing
|
https://api.github.com/repos/austundag/testing
|
closed
|
Add a grunt task to generate a http-server'able directory from ADK/eHMP-UI
|
enhancement in process
|
It should be possible to serve the eHMP-UI static web pages from a local machine using node http-server. Circumvents VM creation/update in development environment.
|
1.0
|
Add a grunt task to generate a http-server'able directory from ADK/eHMP-UI - It should be possible to serve the eHMP-UI static web pages from a local machine using node http-server. Circumvents VM creation/update in development environment.
|
process
|
add a grunt task to generate a http server able directory from adk ehmp ui it should possible to serve the ehmp ui static web pages from local machine using node http server circumvents vm creation update in development environment
| 1
|
21,058
| 28,005,934,657
|
IssuesEvent
|
2023-03-27 15:15:35
|
metabase/metabase
|
https://api.github.com/repos/metabase/metabase
|
closed
|
Serialization v1 errors when trying to load an instance with actions
|
Type:Bug Priority:P1 .Regression Operation/Serialization .Team/QueryProcessor :hammer_and_wrench:
|
**Describe the bug**
If an instance has actions in dashboards, the load fails because of a FK constraint.
**To Reproduce**
1. New instance, connect to PostgresDB, turn on actions, create a model, create an action, add to a dashboard.
2. Dump this instance
3. Try load it in a fresh instance
4. See error
**Expected behavior**
As Serdes v1 doesn't support Actions, we should skip it.
**Logs**
```
FOREIGN KEY(ACTION_ID) REFERENCES PUBLIC.ACTION(ID) (15)"; SQL statement:
INSERT INTO "REPORT_DASHBOARDCARD" ("SIZE_X", "ACTION_ID", "UPDATED_AT", "COL", "PARAMETER_MAPPINGS", "CARD_ID", "ENTITY_ID", "VISUALIZATION_SETTINGS", "SIZE_Y", "DASHBOARD_ID", "CREATED_AT", "ROW") VALUES (?, ?, NOW(), ?, ?, NULL, ?, ?, ?, ?, NOW(), ?) [23506-212]
clojure.lang.ExceptionInfo: Referential integrity constraint violation: "FK_REPORT_DASHBOARDCARD_REF_ACTION_ID: PUBLIC.REPORT_DASHBOARDCARD FOREIGN KEY(ACTION_ID) REFERENCES PUBLIC.ACTION(ID) (15)"; SQL statement:
INSERT INTO "REPORT_DASHBOARDCARD" ("SIZE_X", "ACTION_ID", "UPDATED_AT", "COL", "PARAMETER_MAPPINGS", "CARD_ID", "ENTITY_ID", "VISUALIZATION_SETTINGS", "SIZE_Y", "DASHBOARD_ID", "CREATED_AT", "ROW") VALUES (?, ?, NOW(), ?, ?, NULL, ?, ?, ?, ?, NOW(), ?) [23506-212] {:toucan2/context-trace [["execute SQL with class com.mchange.v2.c3p0.impl.NewProxyConnection" {:toucan2.jdbc.query/sql-args ["INSERT INTO \"REPORT_DASHBOARDCARD\" (\"SIZE_X\", \"ACTION_ID\", \"UPDATED_AT\", \"COL\", \"PARAMETER_MAPPINGS\", \"CARD_ID\", \"ENTITY_ID\", \"VISUALIZATION_SETTINGS\", \"SIZE_Y\", \"DASHBOARD_ID\", \"CREATED_AT\", \"ROW\") VALUES (?, ?, NOW(), ?, ?, NULL, ?, ?, ?, ?, NOW(), ?)" 2 15 15 "[{\"parameter_id\":\"453696fb\",\"target\":[\"variable\",[\"template-tag\",\"id\"]]}]" "2wlq06krPsREPgSaVdWet" "{\"actionDisplayType\":\"button\",\"virtual_card\":{\"name\":null,\"display\":\"action\",\"visualization_settings\":{},\"dataset_query\":{},\"archived\":false},\"button.label\":\"Update plan\"}" 1 1 0]}] ["resolve connection" {:toucan2.connection/connectable org.h2.jdbc.JdbcConnection}] ["resolve connection" {:toucan2.connection/connectable nil}] {:toucan2.pipeline/rf #object[clojure.core$map$fn__5931$fn__5932 0x5c5df333 "clojure.core$map$fn__5931$fn__5932@5c5df333"]} ["with compiled query" {:toucan2.pipeline/compiled-query ["INSERT INTO \"REPORT_DASHBOARDCARD\" (\"SIZE_X\", \"ACTION_ID\", \"UPDATED_AT\", \"COL\", \"PARAMETER_MAPPINGS\", \"CARD_ID\", \"ENTITY_ID\", \"VISUALIZATION_SETTINGS\", \"SIZE_Y\", \"DASHBOARD_ID\", \"CREATED_AT\", \"ROW\") VALUES (?, ?, NOW(), ?, ?, NULL, ?, ?, ?, ?, NOW(), ?)" 2 15 15 "[{\"parameter_id\":\"453696fb\",\"target\":[\"variable\",[\"template-tag\",\"id\"]]}]" "2wlq06krPsREPgSaVdWet" "{\"actionDisplayType\":\"button\",\"virtual_card\":{\"name\":null,\"display\":\"action\",\"visualization_settings\":{},\"dataset_query\":{},\"archived\":false},\"button.label\":\"Update plan\"}" 1 1 0]}] ["with built query" {:toucan2.pipeline/built-query {:insert-into [:report_dashboardcard], :values ((toucan2.instance/instance :metabase.models.dashboard-card/DashboardCard {:size_x 2, :action_id 15, :updated_at [:metabase.util.honey-sql-2/typed :%now {:metabase.util.honeysql-extensions/database-type "timestamp"}], :col 15, :parameter_mappings "[{\"parameter_id\":\"453696fb\",\"target\":[\"variable\",[\"template-tag\",\"id\"]]}]", :card_id nil, :entity_id "2wlq06krPsREPgSaVdWet", :visualization_settings "{\"actionDisplayType\":\"button\",\"virtual_card\":{\"name\":null,\"display\":\"action\",\"visualization_settings\":{},\"dataset_query\":{},\"archived\":false},\"button.label\":\"Update plan\"}", :size_y 1, :dashboard_id 1, :created_at [:metabase.util.honey-sql-2/typed :%now {:metabase.util.honeysql-extensions/database-type "timestamp"}], :row 0}))}}] ["resolve connection" {:toucan2.connection/connectable metabase.db.connection.ApplicationDB}] ["resolve connection" {:toucan2.connection/connectable :default}] ["resolve connection" {:toucan2.connection/connectable nil}] ["with resolved query" {:toucan2.pipeline/resolved-query {}}] ["with parsed args" {:toucan2.pipeline/query-type :toucan.query-type/insert.instances, :toucan2.pipeline/parsed-args {:rows [#ordered/map ([:visualization_settings #ordered/map ([:actionDisplayType "button"] [:virtual_card #ordered/map ([:name nil] [:display "action"] [:visualization_settings #ordered/map nil] [:dataset_query #ordered/map nil] [:archived false])] [:button.label "Update plan"])] [:parameter_mappings (#ordered/map ([:parameter_id "453696fb"] [:target 
[:variable [:template-tag "id"]]]))] [:card_id nil] [:size_x 2] [:size_y 1] [:col 15] [:row 0] [:action_id 15] [:dashboard_id 1])]}}] ["with model" {:toucan2.pipeline/model :metabase.models.dashboard-card/DashboardCard}] ["with unparsed args" {:toucan2.pipeline/query-type :toucan.query-type/insert.instances, :toucan2.pipeline/unparsed-args (:metabase.models.dashboard-card/DashboardCard #ordered/map ([:visualization_settings #ordered/map ([:actionDisplayType "button"] [:virtual_card #ordered/map ([:name nil] [:display "action"] [:visualization_settings #ordered/map nil] [:dataset_query #ordered/map nil] [:archived false])] [:button.label "Update plan"])] [:parameter_mappings (#ordered/map ([:parameter_id "453696fb"] [:target [:variable [:template-tag "id"]]]))] [:card_id nil] [:size_x 2] [:size_y 1] [:col 15] [:row 0] [:action_id 15] [:dashboard_id 1]))}]]}
```
1.46.0-RC4
**Severity**
P1 creates errors that shouldn't exist, impacting the `--on-error` flag in serialization.
|
1.0
|
Serialization v1 errors when trying to load an instance with actions - **Describe the bug**
If an instance has actions in dashboards, the load fails because of a FK constraint.
**To Reproduce**
1. New instance, connect to PostgresDB, turn on actions, create a model, create an action, add to a dashboard.
2. Dump this instance
3. Try load it in a fresh instance
4. See error
**Expected behavior**
As Serdes v1 doesn't support Actions, we should skip it.
**Logs**
```
FOREIGN KEY(ACTION_ID) REFERENCES PUBLIC.ACTION(ID) (15)"; SQL statement:
INSERT INTO "REPORT_DASHBOARDCARD" ("SIZE_X", "ACTION_ID", "UPDATED_AT", "COL", "PARAMETER_MAPPINGS", "CARD_ID", "ENTITY_ID", "VISUALIZATION_SETTINGS", "SIZE_Y", "DASHBOARD_ID", "CREATED_AT", "ROW") VALUES (?, ?, NOW(), ?, ?, NULL, ?, ?, ?, ?, NOW(), ?) [23506-212]
clojure.lang.ExceptionInfo: Referential integrity constraint violation: "FK_REPORT_DASHBOARDCARD_REF_ACTION_ID: PUBLIC.REPORT_DASHBOARDCARD FOREIGN KEY(ACTION_ID) REFERENCES PUBLIC.ACTION(ID) (15)"; SQL statement:
INSERT INTO "REPORT_DASHBOARDCARD" ("SIZE_X", "ACTION_ID", "UPDATED_AT", "COL", "PARAMETER_MAPPINGS", "CARD_ID", "ENTITY_ID", "VISUALIZATION_SETTINGS", "SIZE_Y", "DASHBOARD_ID", "CREATED_AT", "ROW") VALUES (?, ?, NOW(), ?, ?, NULL, ?, ?, ?, ?, NOW(), ?) [23506-212] {:toucan2/context-trace [["execute SQL with class com.mchange.v2.c3p0.impl.NewProxyConnection" {:toucan2.jdbc.query/sql-args ["INSERT INTO \"REPORT_DASHBOARDCARD\" (\"SIZE_X\", \"ACTION_ID\", \"UPDATED_AT\", \"COL\", \"PARAMETER_MAPPINGS\", \"CARD_ID\", \"ENTITY_ID\", \"VISUALIZATION_SETTINGS\", \"SIZE_Y\", \"DASHBOARD_ID\", \"CREATED_AT\", \"ROW\") VALUES (?, ?, NOW(), ?, ?, NULL, ?, ?, ?, ?, NOW(), ?)" 2 15 15 "[{\"parameter_id\":\"453696fb\",\"target\":[\"variable\",[\"template-tag\",\"id\"]]}]" "2wlq06krPsREPgSaVdWet" "{\"actionDisplayType\":\"button\",\"virtual_card\":{\"name\":null,\"display\":\"action\",\"visualization_settings\":{},\"dataset_query\":{},\"archived\":false},\"button.label\":\"Update plan\"}" 1 1 0]}] ["resolve connection" {:toucan2.connection/connectable org.h2.jdbc.JdbcConnection}] ["resolve connection" {:toucan2.connection/connectable nil}] {:toucan2.pipeline/rf #object[clojure.core$map$fn__5931$fn__5932 0x5c5df333 "clojure.core$map$fn__5931$fn__5932@5c5df333"]} ["with compiled query" {:toucan2.pipeline/compiled-query ["INSERT INTO \"REPORT_DASHBOARDCARD\" (\"SIZE_X\", \"ACTION_ID\", \"UPDATED_AT\", \"COL\", \"PARAMETER_MAPPINGS\", \"CARD_ID\", \"ENTITY_ID\", \"VISUALIZATION_SETTINGS\", \"SIZE_Y\", \"DASHBOARD_ID\", \"CREATED_AT\", \"ROW\") VALUES (?, ?, NOW(), ?, ?, NULL, ?, ?, ?, ?, NOW(), ?)" 2 15 15 "[{\"parameter_id\":\"453696fb\",\"target\":[\"variable\",[\"template-tag\",\"id\"]]}]" "2wlq06krPsREPgSaVdWet" "{\"actionDisplayType\":\"button\",\"virtual_card\":{\"name\":null,\"display\":\"action\",\"visualization_settings\":{},\"dataset_query\":{},\"archived\":false},\"button.label\":\"Update plan\"}" 1 1 0]}] ["with built query" {:toucan2.pipeline/built-query {:insert-into [:report_dashboardcard], :values ((toucan2.instance/instance :metabase.models.dashboard-card/DashboardCard {:size_x 2, :action_id 15, :updated_at [:metabase.util.honey-sql-2/typed :%now {:metabase.util.honeysql-extensions/database-type "timestamp"}], :col 15, :parameter_mappings "[{\"parameter_id\":\"453696fb\",\"target\":[\"variable\",[\"template-tag\",\"id\"]]}]", :card_id nil, :entity_id "2wlq06krPsREPgSaVdWet", :visualization_settings "{\"actionDisplayType\":\"button\",\"virtual_card\":{\"name\":null,\"display\":\"action\",\"visualization_settings\":{},\"dataset_query\":{},\"archived\":false},\"button.label\":\"Update plan\"}", :size_y 1, :dashboard_id 1, :created_at [:metabase.util.honey-sql-2/typed :%now {:metabase.util.honeysql-extensions/database-type "timestamp"}], :row 0}))}}] ["resolve connection" {:toucan2.connection/connectable metabase.db.connection.ApplicationDB}] ["resolve connection" {:toucan2.connection/connectable :default}] ["resolve connection" {:toucan2.connection/connectable nil}] ["with resolved query" {:toucan2.pipeline/resolved-query {}}] ["with parsed args" {:toucan2.pipeline/query-type :toucan.query-type/insert.instances, :toucan2.pipeline/parsed-args {:rows [#ordered/map ([:visualization_settings #ordered/map ([:actionDisplayType "button"] [:virtual_card #ordered/map ([:name nil] [:display "action"] [:visualization_settings #ordered/map nil] [:dataset_query #ordered/map nil] [:archived false])] [:button.label "Update plan"])] [:parameter_mappings (#ordered/map ([:parameter_id "453696fb"] [:target 
[:variable [:template-tag "id"]]]))] [:card_id nil] [:size_x 2] [:size_y 1] [:col 15] [:row 0] [:action_id 15] [:dashboard_id 1])]}}] ["with model" {:toucan2.pipeline/model :metabase.models.dashboard-card/DashboardCard}] ["with unparsed args" {:toucan2.pipeline/query-type :toucan.query-type/insert.instances, :toucan2.pipeline/unparsed-args (:metabase.models.dashboard-card/DashboardCard #ordered/map ([:visualization_settings #ordered/map ([:actionDisplayType "button"] [:virtual_card #ordered/map ([:name nil] [:display "action"] [:visualization_settings #ordered/map nil] [:dataset_query #ordered/map nil] [:archived false])] [:button.label "Update plan"])] [:parameter_mappings (#ordered/map ([:parameter_id "453696fb"] [:target [:variable [:template-tag "id"]]]))] [:card_id nil] [:size_x 2] [:size_y 1] [:col 15] [:row 0] [:action_id 15] [:dashboard_id 1]))}]]}
```
1.46.0-RC4
**Severity**
P1 creates errors that shouldn't exist, impacting the `--on-error` flag in serialization.
|
process
|
serialization errors when trying to load an instance with actions describe the bug if an instance has actions in dashboards the load fails because of a fk constraint to reproduce new instance connect to postgresdb turn on actions create a model create an action add to a dashboard dump this instance try load it in a fresh instance see error expected behavior as serdes doesn t support actions we should skip it logs foreign key action id references public action id sql statement insert into report dashboardcard size x action id updated at col parameter mappings card id entity id visualization settings size y dashboard id created at row values now null now clojure lang exceptioninfo referential integrity constraint violation fk report dashboardcard ref action id public report dashboardcard foreign key action id references public action id sql statement insert into report dashboardcard size x action id updated at col parameter mappings card id entity id visualization settings size y dashboard id created at row values now null now context trace actiondisplaytype button virtual card name null display action visualization settings dataset query archived false button label update plan pipeline rf object actiondisplaytype button virtual card name null display action visualization settings dataset query archived false button label update plan values instance instance metabase models dashboard card dashboardcard size x action id updated at col parameter mappings card id nil entity id visualization settings actiondisplaytype button virtual card name null display action visualization settings dataset query archived false button label update plan size y dashboard id created at row severity creates errors that shouldn t exist impacting the on error flag in serialization
| 1
|
12,303
| 14,857,695,680
|
IssuesEvent
|
2021-01-18 15:46:03
|
panther-labs/panther
|
https://api.github.com/repos/panther-labs/panther
|
opened
|
Manage Schemas in an external repository
|
team:data processing
|
### Description
Our current management of 'native' log types is bound to the releases of Panther itself.
We want to manage our schemas in the same way we manage our rules, in a repository.
### RFC
[Under review](https://github.com/panther-labs/panther-enterprise/pull/1914)
### Designs
N/A
### Acceptance Criteria
Backend:
- [ ] LogTypesAPI can handle both 'custom' and 'managed' schemas
- [ ] GitHub repository to store schemas with release and CI automations
- [ ] Panther can get release version from GitHub
- [ ] LogTypesAPI has `CheckSchemaUpdates` and `UpdateManagedSchemas` endpoints
- [ ] Panther defines a minimum version compatibility for every release and updates schemas if needed during deployment
Frontend:
- [ ] Filters in the schema listing
- [ ] Schema listing page displays both Managed and Custom schemas
- [ ] Clone And Edit functionality on managed schemas
- [ ] Wording follows the concept of 'unified' schemas handling
- [ ] Details and listing view includes 'Managed' and 'Disabled' info.
- [ ] Settings section has a 'Schema Updates' section
A concise list of specific user stories that qualify this story as done and tell the user journey.
|
1.0
|
Manage Schemas in an external repository - ### Description
Our current management of 'native' log types is bound to the releases of Panther itself.
We want to manage our schemas in the same way we manage our rules, in a repository.
### RFC
[Under review](https://github.com/panther-labs/panther-enterprise/pull/1914)
### Designs
N/A
### Acceptance Criteria
Backend:
- [ ] LogTypesAPI can handle both 'custom' and 'managed' schemas
- [ ] GitHub repository to store schemas with release and CI automations
- [ ] Panther can get release version from GitHub
- [ ] LogTypesAPI has `CheckSchemaUpdates` and `UpdateManagedSchemas` endpoints
- [ ] Panther defines a minimum version compatibility for every release and updates schemas if needed during deployment
Frontend:
- [ ] Filters in the schema listing
- [ ] Schema listing page displays both Managed and Custom schemas
- [ ] Clone And Edit functionality on managed schemas
- [ ] Wording follows the concept of 'unified' schemas handling
- [ ] Details and listing view includes 'Managed' and 'Disabled' info.
- [ ] Settings section has a 'Schema Updates' section
A concise list of specific user stories that qualify this story as done and tell the user journey.
|
process
|
manage schemas in an external repository description our current management of native log types is bound to the releases of panther itself we want to manage our schemas in the same way we manage our rules in a repository rfc designs n a acceptance criteria backend logtypesapi can handle both custom and managed schemas github repository to store schemas with release and ci automations panther can get release version from github logtypesapi has checkschemaupdates and updatemanagedschemas endpoints panther defines a minimum version compatibility for every release and udpates schemas if needed during deployment frontend filters in the schema listing schema listing page displays both managed and custom schemas clone and edit functionality on managed schemas wording follows the concept of unified schemas handling details and listing view includes managed and disabled info settings section has a schema updates section a concise list of specific user stories that qualify this story as done and tell the user journey
| 1
|
58,865
| 6,621,947,698
|
IssuesEvent
|
2017-09-21 21:11:56
|
Princeton-CDH/cdh-web
|
https://api.github.com/repos/Princeton-CDH/cdh-web
|
closed
|
As an admin, I want to create and edit staff profiles so I can publish information about staff research and roles.
|
awaiting testing
|
## Notes for testing
- edit under profiles -> people
- the profile section defaults to collapsed but I haven't been able to figure out an easy way to change that, so we'll just have to live with that for now
- update 9/20 added username to edit form, should fix the problem encountered before
---
Detailed items that this should include are:
- [x] large photo for detail page
- [x] thumbnail photo for listing
- [x] job title
- [x] education section
- [x] contact info
- [x] paragraph text
|
1.0
|
As an admin, I want to create and edit staff profiles so I can publish information about staff research and roles. - ## Notes for testing
- edit under profiles -> people
- the profile section defaults to collapsed but I haven't been able to figure out an easy way to change that, so we'll just have to live with that for now
- update 9/20 added username to edit form, should fix the problem encountered before
---
Detailed items that this should include are:
- [x] large photo for detail page
- [x] thumbnail photo for listing
- [x] job title
- [x] education section
- [x] contact info
- [x] paragraph text
|
non_process
|
as an admin i want to create and edit staff profiles so i can publish information about staff research and roles notes for testing edit under profiles people the profile section defaults to collapsed but i haven t been able to figure out an easy way to change that so we ll just have to live with that for now update added username to edit form should fix the problem encountered before detailed items that this should include are large photo for detail page thumbnail photo for listing job title education section contact info paragraph text
| 0
|
164,162
| 6,220,084,074
|
IssuesEvent
|
2017-07-09 19:31:34
|
pytorch/pytorch
|
https://api.github.com/repos/pytorch/pytorch
|
closed
|
Batch normalization needs input dim checks
|
1hr dependency bug medium priority
|
Right now PyTorch batch_norm accepts any weight and running_mean shapes (eg 128 and 512) and doesn't complain at all, these checks https://github.com/torch/nn/blob/master/BatchNormalization.lua#L78 should probably be moved to C/CUDA side?
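A minimal Python-level sketch of the kind of shape check being asked for — purely illustrative, since the issue suggests the real check belongs on the C/CUDA side; it assumes a channels-first input layout and the `torch.nn.functional.batch_norm` signature:
```python
# Hypothetical Python-side guard illustrating the requested check; the actual fix
# would live in the C/CUDA implementation as the issue suggests.
import torch
import torch.nn.functional as F

def checked_batch_norm(x, running_mean, running_var, weight=None, bias=None,
                       training=False, momentum=0.1, eps=1e-5):
    num_features = x.shape[1]  # channels-first layout assumed: (N, C, ...)
    for name, t in (("running_mean", running_mean), ("running_var", running_var),
                    ("weight", weight), ("bias", bias)):
        if t is not None and t.shape != (num_features,):
            raise ValueError(
                f"{name} has shape {tuple(t.shape)}, expected ({num_features},)")
    return F.batch_norm(x, running_mean, running_var, weight, bias,
                        training, momentum, eps)
```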
|
1.0
|
Batch normalization needs input dim checks - Right now PyTorch batch_norm accepts any weight and running_mean shapes (eg 128 and 512) and doesn't complain at all, these checks https://github.com/torch/nn/blob/master/BatchNormalization.lua#L78 should probably be moved to C/CUDA side?
|
non_process
|
batch normalization needs input dim checks right now pytorch batch norm accepts any weight and running mean shapes eg and and doesn t complain at all these checks should probably be moved to c cuda side
| 0
|
12,199
| 3,257,437,835
|
IssuesEvent
|
2015-10-20 17:52:11
|
kubernetes/kubernetes
|
https://api.github.com/repos/kubernetes/kubernetes
|
closed
|
kubernetes-upgrade-gke and kubernetes-upgrade-gce should run stable -> HEAD, not latest -> HEAD
|
area/cluster-lifecycle area/test area/upgrade priority/P1 team/control-plane
|
This is just a tracking bug so I don't forget to do this.
|
1.0
|
kubernetes-upgrade-gke and kubernetes-upgrade-gce should run stable -> HEAD, not latest -> HEAD - This is just a tracking bug so I don't forget to do this.
|
non_process
|
kubernetes upgrade gke and kubernetes upgrade gce should run stable head not latest head this is just a tracking bug so i don t forget to do this
| 0
|
245,284
| 26,539,973,297
|
IssuesEvent
|
2023-01-19 18:25:24
|
shaneclarke-whitesource/multi-juicer
|
https://api.github.com/repos/shaneclarke-whitesource/multi-juicer
|
closed
|
CVE-2021-44906 (High) detected in minimist-1.2.5.tgz - autoclosed
|
security vulnerability
|
## CVE-2021-44906 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>minimist-1.2.5.tgz</b></p></summary>
<p>parse argument options</p>
<p>Library home page: <a href="https://registry.npmjs.org/minimist/-/minimist-1.2.5.tgz">https://registry.npmjs.org/minimist/-/minimist-1.2.5.tgz</a></p>
<p>Path to dependency file: /juice-balancer/package.json</p>
<p>Path to vulnerable library: /juice-balancer/node_modules/minimist/package.json,/juice-balancer/ui/node_modules/minimist/package.json,/cleaner/node_modules/minimist/package.json</p>
<p>
Dependency Hierarchy:
- react-scripts-4.0.3.tgz (Root Library)
- eslint-plugin-import-2.22.1.tgz
- tsconfig-paths-3.9.0.tgz
- :x: **minimist-1.2.5.tgz** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/shaneclarke-whitesource/multi-juicer/commit/0e0ec522551978737ae1ae4ffa66e0f7292e0fc7">0e0ec522551978737ae1ae4ffa66e0f7292e0fc7</a></p>
<p>Found in base branch: <b>master</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
Minimist <=1.2.5 is vulnerable to Prototype Pollution via file index.js, function setKey() (lines 69-95).
<p>Publish Date: 2022-03-17
<p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2021-44906>CVE-2021-44906</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>9.8</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: High
- Integrity Impact: High
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Release Date: 2022-03-17</p>
<p>Fix Resolution (minimist): 1.2.6</p>
<p>Direct dependency fix Resolution (react-scripts): 5.0.1</p>
</p>
</details>
<p></p>
***
:rescue_worker_helmet: Automatic Remediation is available for this issue
|
True
|
CVE-2021-44906 (High) detected in minimist-1.2.5.tgz - autoclosed - ## CVE-2021-44906 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>minimist-1.2.5.tgz</b></p></summary>
<p>parse argument options</p>
<p>Library home page: <a href="https://registry.npmjs.org/minimist/-/minimist-1.2.5.tgz">https://registry.npmjs.org/minimist/-/minimist-1.2.5.tgz</a></p>
<p>Path to dependency file: /juice-balancer/package.json</p>
<p>Path to vulnerable library: /juice-balancer/node_modules/minimist/package.json,/juice-balancer/ui/node_modules/minimist/package.json,/cleaner/node_modules/minimist/package.json</p>
<p>
Dependency Hierarchy:
- react-scripts-4.0.3.tgz (Root Library)
- eslint-plugin-import-2.22.1.tgz
- tsconfig-paths-3.9.0.tgz
- :x: **minimist-1.2.5.tgz** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/shaneclarke-whitesource/multi-juicer/commit/0e0ec522551978737ae1ae4ffa66e0f7292e0fc7">0e0ec522551978737ae1ae4ffa66e0f7292e0fc7</a></p>
<p>Found in base branch: <b>master</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
Minimist <=1.2.5 is vulnerable to Prototype Pollution via file index.js, function setKey() (lines 69-95).
<p>Publish Date: 2022-03-17
<p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2021-44906>CVE-2021-44906</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>9.8</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: High
- Integrity Impact: High
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Release Date: 2022-03-17</p>
<p>Fix Resolution (minimist): 1.2.6</p>
<p>Direct dependency fix Resolution (react-scripts): 5.0.1</p>
</p>
</details>
<p></p>
***
:rescue_worker_helmet: Automatic Remediation is available for this issue
|
non_process
|
cve high detected in minimist tgz autoclosed cve high severity vulnerability vulnerable library minimist tgz parse argument options library home page a href path to dependency file juice balancer package json path to vulnerable library juice balancer node modules minimist package json juice balancer ui node modules minimist package json cleaner node modules minimist package json dependency hierarchy react scripts tgz root library eslint plugin import tgz tsconfig paths tgz x minimist tgz vulnerable library found in head commit a href found in base branch master vulnerability details minimist is vulnerable to prototype pollution via file index js function setkey lines publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction none scope unchanged impact metrics confidentiality impact high integrity impact high availability impact high for more information on scores click a href suggested fix type upgrade version release date fix resolution minimist direct dependency fix resolution react scripts rescue worker helmet automatic remediation is available for this issue
| 0
|
52,057
| 13,721,740,417
|
IssuesEvent
|
2020-10-03 00:06:56
|
Watemlifts/desktop
|
https://api.github.com/repos/Watemlifts/desktop
|
closed
|
CVE-2020-8203 (High) detected in lodash-3.10.1.tgz
|
no-issue-activity security vulnerability
|
## CVE-2020-8203 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>lodash-3.10.1.tgz</b></p></summary>
<p>The modern build of lodash modular utilities.</p>
<p>Library home page: <a href="https://registry.npmjs.org/lodash/-/lodash-3.10.1.tgz">https://registry.npmjs.org/lodash/-/lodash-3.10.1.tgz</a></p>
<p>Path to dependency file: /tmp/ws-scm/desktop/package.json</p>
<p>Path to vulnerable library: /tmp/ws-scm/desktop/node_modules/ggit/node_modules/lodash/package.json</p>
<p>
Dependency Hierarchy:
- stop-build-1.1.0.tgz (Root Library)
- ggit-1.15.1.tgz
- :x: **lodash-3.10.1.tgz** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/Watemlifts/desktop/commit/80887ae2896bc3d093a116931188bce797ec80d9">80887ae2896bc3d093a116931188bce797ec80d9</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
Prototype pollution attack when using _.zipObjectDeep in lodash <= 4.17.15.
<p>Publish Date: 2020-07-15
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-8203>CVE-2020-8203</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>7.4</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: High
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: High
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://www.npmjs.com/advisories/1523">https://www.npmjs.com/advisories/1523</a></p>
<p>Release Date: 2020-07-23</p>
<p>Fix Resolution: lodash - 4.17.19</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
|
True
|
CVE-2020-8203 (High) detected in lodash-3.10.1.tgz - ## CVE-2020-8203 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>lodash-3.10.1.tgz</b></p></summary>
<p>The modern build of lodash modular utilities.</p>
<p>Library home page: <a href="https://registry.npmjs.org/lodash/-/lodash-3.10.1.tgz">https://registry.npmjs.org/lodash/-/lodash-3.10.1.tgz</a></p>
<p>Path to dependency file: /tmp/ws-scm/desktop/package.json</p>
<p>Path to vulnerable library: /tmp/ws-scm/desktop/node_modules/ggit/node_modules/lodash/package.json</p>
<p>
Dependency Hierarchy:
- stop-build-1.1.0.tgz (Root Library)
- ggit-1.15.1.tgz
- :x: **lodash-3.10.1.tgz** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/Watemlifts/desktop/commit/80887ae2896bc3d093a116931188bce797ec80d9">80887ae2896bc3d093a116931188bce797ec80d9</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
Prototype pollution attack when using _.zipObjectDeep in lodash <= 4.17.15.
<p>Publish Date: 2020-07-15
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-8203>CVE-2020-8203</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>7.4</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: High
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: High
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://www.npmjs.com/advisories/1523">https://www.npmjs.com/advisories/1523</a></p>
<p>Release Date: 2020-07-23</p>
<p>Fix Resolution: lodash - 4.17.19</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
|
non_process
|
cve high detected in lodash tgz cve high severity vulnerability vulnerable library lodash tgz the modern build of lodash modular utilities library home page a href path to dependency file tmp ws scm desktop package json path to vulnerable library tmp ws scm desktop node modules ggit node modules lodash package json dependency hierarchy stop build tgz root library ggit tgz x lodash tgz vulnerable library found in head commit a href vulnerability details prototype pollution attack when using zipobjectdeep in lodash publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity high privileges required none user interaction none scope unchanged impact metrics confidentiality impact none integrity impact high availability impact high for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution lodash step up your open source security game with whitesource
| 0
|
18,643
| 25,960,426,128
|
IssuesEvent
|
2022-12-18 20:38:35
|
ThizThizzyDizzy/tree-feller
|
https://api.github.com/repos/ThizThizzyDizzy/tree-feller
|
closed
|
Bug with MMOCore compatibility
|
bug compatibility
|
Environment:
Paper Spigot 1.16.5
MMOCore 1.7.1
Tree Feller 1.16.2
log when trying to cut down a "debug verified" tree:
**_[13:11:35 WARN]: [TreeFeller] Task #188604 for TreeFeller v1.16.2 generated an exception
java.lang.NoSuchMethodError: 'net.Indyuce.mmocore.api.player.PlayerData net.Indyuce.mmocore.api.player.PlayerData.get(org.bukkit.entity.Player)'
at com.thizthizzydizzy.treefeller.compat.MMOCoreCompat.breakBlock(MMOCoreCompat.java:253) ~[?:?]
at com.thizthizzydizzy.treefeller.compat.TreeFellerCompat.breakBlock(TreeFellerCompat.java:31) ~[?:?]
at com.thizthizzydizzy.treefeller.TreeFeller.breakBlock(TreeFeller.java:848) ~[?:?]
at com.thizthizzydizzy.treefeller.TreeFeller.access$200(TreeFeller.java:41) ~[?:?]
at com.thizthizzydizzy.treefeller.TreeFeller$1.run(TreeFeller.java:197) ~[?:?]
at org.bukkit.craftbukkit.v1_16_R3.scheduler.CraftTask.run(CraftTask.java:100) ~[patched_1.16.5.jar:git-Paper-779]
at org.bukkit.craftbukkit.v1_16_R3.scheduler.CraftScheduler.mainThreadHeartbeat(CraftScheduler.java:468) ~[patched_1.16.5.jar:git-Paper-779]
at net.minecraft.server.v1_16_R3.MinecraftServer.b(MinecraftServer.java:1427) ~[patched_1.16.5.jar:git-Paper-779]
at net.minecraft.server.v1_16_R3.DedicatedServer.b(DedicatedServer.java:436) ~[patched_1.16.5.jar:git-Paper-779]
at net.minecraft.server.v1_16_R3.MinecraftServer.a(MinecraftServer.java:1342) ~[patched_1.16.5.jar:git-Paper-779]
at net.minecraft.server.v1_16_R3.MinecraftServer.w(MinecraftServer.java:1130) ~[patched_1.16.5.jar:git-Paper-779]
at net.minecraft.server.v1_16_R3.MinecraftServer.lambda$a$0(MinecraftServer.java:291) ~[patched_1.16.5.jar:git-Paper-779]
at java.lang.Thread.run(Thread.java:829) [?:?]_**
|
True
|
Bug with MMOCore compatibility - Environment:
Paper Spigot 1.16.5
MMOCore 1.7.1
Tree Feller 1.16.2
log when trying to cut down a "debug verified" tree:
**_[13:11:35 WARN]: [TreeFeller] Task #188604 for TreeFeller v1.16.2 generated an exception
java.lang.NoSuchMethodError: 'net.Indyuce.mmocore.api.player.PlayerData net.Indyuce.mmocore.api.player.PlayerData.get(org.bukkit.entity.Player)'
at com.thizthizzydizzy.treefeller.compat.MMOCoreCompat.breakBlock(MMOCoreCompat.java:253) ~[?:?]
at com.thizthizzydizzy.treefeller.compat.TreeFellerCompat.breakBlock(TreeFellerCompat.java:31) ~[?:?]
at com.thizthizzydizzy.treefeller.TreeFeller.breakBlock(TreeFeller.java:848) ~[?:?]
at com.thizthizzydizzy.treefeller.TreeFeller.access$200(TreeFeller.java:41) ~[?:?]
at com.thizthizzydizzy.treefeller.TreeFeller$1.run(TreeFeller.java:197) ~[?:?]
at org.bukkit.craftbukkit.v1_16_R3.scheduler.CraftTask.run(CraftTask.java:100) ~[patched_1.16.5.jar:git-Paper-779]
at org.bukkit.craftbukkit.v1_16_R3.scheduler.CraftScheduler.mainThreadHeartbeat(CraftScheduler.java:468) ~[patched_1.16.5.jar:git-Paper-779]
at net.minecraft.server.v1_16_R3.MinecraftServer.b(MinecraftServer.java:1427) ~[patched_1.16.5.jar:git-Paper-779]
at net.minecraft.server.v1_16_R3.DedicatedServer.b(DedicatedServer.java:436) ~[patched_1.16.5.jar:git-Paper-779]
at net.minecraft.server.v1_16_R3.MinecraftServer.a(MinecraftServer.java:1342) ~[patched_1.16.5.jar:git-Paper-779]
at net.minecraft.server.v1_16_R3.MinecraftServer.w(MinecraftServer.java:1130) ~[patched_1.16.5.jar:git-Paper-779]
at net.minecraft.server.v1_16_R3.MinecraftServer.lambda$a$0(MinecraftServer.java:291) ~[patched_1.16.5.jar:git-Paper-779]
at java.lang.Thread.run(Thread.java:829) [?:?]_**
|
non_process
|
bug with mmocore compatibility environment paper spigot mmocore tree feller log when trying to cut down a debug verified tree task for treefeller generated an exception java lang nosuchmethoderror net indyuce mmocore api player playerdata net indyuce mmocore api player playerdata get org bukkit entity player at com thizthizzydizzy treefeller compat mmocorecompat breakblock mmocorecompat java at com thizthizzydizzy treefeller compat treefellercompat breakblock treefellercompat java at com thizthizzydizzy treefeller treefeller breakblock treefeller java at com thizthizzydizzy treefeller treefeller access treefeller java at com thizthizzydizzy treefeller treefeller run treefeller java at org bukkit craftbukkit scheduler crafttask run crafttask java at org bukkit craftbukkit scheduler craftscheduler mainthreadheartbeat craftscheduler java at net minecraft server minecraftserver b minecraftserver java at net minecraft server dedicatedserver b dedicatedserver java at net minecraft server minecraftserver a minecraftserver java at net minecraft server minecraftserver w minecraftserver java at net minecraft server minecraftserver lambda a minecraftserver java at java lang thread run thread java
| 0
|
2,506
| 5,281,160,577
|
IssuesEvent
|
2017-02-07 15:53:06
|
MikePopoloski/slang
|
https://api.github.com/repos/MikePopoloski/slang
|
closed
|
Implement __FILE__ and __LINE__ macros
|
area-preprocessor easy
|
The intrinsic __FILE__ and __LINE__ macros are currently recognized but generate empty tokens ("" and 0 respectively). The __LINE__ macro in particular will require making use of the source manager infrastructure for figuring out which line we're looking at.
Note that this plays into the `line directive as well, which changes the "apparent" line number (like #line in C++).
|
1.0
|
Implement __FILE__ and __LINE__ macros - The intrinsic __FILE__ and __LINE__ macros are currently recognized but generate empty tokens ("" and 0 respectively). The __LINE__ macro in particular will require making use of the source manager infrastructure for figuring out which line we're looking at.
Note that this plays into the `line directive as well, which changes the "apparent" line number (like #line in C++).
|
process
|
implement file and line macros the intrinsic file and line macros are currently recognized but generate empty tokens and respectively the line macro in particular will require making use of the source manager infrastructure for figuring out which line we re looking at note that this plays into the line directive as well which changes the apparent line number like line in c
| 1
|
210,408
| 23,754,701,481
|
IssuesEvent
|
2022-09-01 01:08:54
|
vital-ws/JS-WebGoat
|
https://api.github.com/repos/vital-ws/JS-WebGoat
|
opened
|
spring-boot-starter-validation-2.7.1.jar: 1 vulnerabilities (highest severity is: 7.5)
|
security vulnerability
|
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>spring-boot-starter-validation-2.7.1.jar</b></p></summary>
<p></p>
<p>Path to dependency file: /pom.xml</p>
<p>Path to vulnerable library: /home/wss-scanner/.m2/repository/org/yaml/snakeyaml/1.30/snakeyaml-1.30.jar</p>
<p>
</details>
## Vulnerabilities
| CVE | Severity | <img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS | Dependency | Type | Fixed in | Remediation Available |
| ------------- | ------------- | ----- | ----- | ----- | --- | --- |
| [CVE-2022-25857](https://vuln.whitesourcesoftware.com/vulnerability/CVE-2022-25857) | <img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> High | 7.5 | snakeyaml-1.30.jar | Transitive | N/A | ❌ |
## Details
<details>
<summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> CVE-2022-25857</summary>
### Vulnerable Library - <b>snakeyaml-1.30.jar</b></p>
<p>YAML 1.1 parser and emitter for Java</p>
<p>Library home page: <a href="https://bitbucket.org/snakeyaml/snakeyaml">https://bitbucket.org/snakeyaml/snakeyaml</a></p>
<p>Path to dependency file: /pom.xml</p>
<p>Path to vulnerable library: /home/wss-scanner/.m2/repository/org/yaml/snakeyaml/1.30/snakeyaml-1.30.jar</p>
<p>
Dependency Hierarchy:
- spring-boot-starter-validation-2.7.1.jar (Root Library)
- spring-boot-starter-2.7.1.jar
- :x: **snakeyaml-1.30.jar** (Vulnerable Library)
<p>Found in base branch: <b>develop</b></p>
</p>
<p></p>
### Vulnerability Details
<p>
The package org.yaml:snakeyaml from 0 and before 1.31 are vulnerable to Denial of Service (DoS) due missing to nested depth limitation for collections.
<p>Publish Date: 2022-08-30
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2022-25857>CVE-2022-25857</a></p>
</p>
<p></p>
### CVSS 3 Score Details (<b>7.5</b>)
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: None
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
<p></p>
### Suggested Fix
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2022-25857">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2022-25857</a></p>
<p>Release Date: 2022-08-30</p>
<p>Fix Resolution: org.yaml:snakeyaml:1.31</p>
</p>
<p></p>
</details>
|
True
|
spring-boot-starter-validation-2.7.1.jar: 1 vulnerabilities (highest severity is: 7.5) - <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>spring-boot-starter-validation-2.7.1.jar</b></p></summary>
<p></p>
<p>Path to dependency file: /pom.xml</p>
<p>Path to vulnerable library: /home/wss-scanner/.m2/repository/org/yaml/snakeyaml/1.30/snakeyaml-1.30.jar</p>
<p>
</details>
## Vulnerabilities
| CVE | Severity | <img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS | Dependency | Type | Fixed in | Remediation Available |
| ------------- | ------------- | ----- | ----- | ----- | --- | --- |
| [CVE-2022-25857](https://vuln.whitesourcesoftware.com/vulnerability/CVE-2022-25857) | <img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> High | 7.5 | snakeyaml-1.30.jar | Transitive | N/A | ❌ |
## Details
<details>
<summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> CVE-2022-25857</summary>
### Vulnerable Library - <b>snakeyaml-1.30.jar</b></p>
<p>YAML 1.1 parser and emitter for Java</p>
<p>Library home page: <a href="https://bitbucket.org/snakeyaml/snakeyaml">https://bitbucket.org/snakeyaml/snakeyaml</a></p>
<p>Path to dependency file: /pom.xml</p>
<p>Path to vulnerable library: /home/wss-scanner/.m2/repository/org/yaml/snakeyaml/1.30/snakeyaml-1.30.jar</p>
<p>
Dependency Hierarchy:
- spring-boot-starter-validation-2.7.1.jar (Root Library)
- spring-boot-starter-2.7.1.jar
- :x: **snakeyaml-1.30.jar** (Vulnerable Library)
<p>Found in base branch: <b>develop</b></p>
</p>
<p></p>
### Vulnerability Details
<p>
The package org.yaml:snakeyaml from 0 and before 1.31 are vulnerable to Denial of Service (DoS) due missing to nested depth limitation for collections.
<p>Publish Date: 2022-08-30
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2022-25857>CVE-2022-25857</a></p>
</p>
<p></p>
### CVSS 3 Score Details (<b>7.5</b>)
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: None
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
<p></p>
### Suggested Fix
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2022-25857">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2022-25857</a></p>
<p>Release Date: 2022-08-30</p>
<p>Fix Resolution: org.yaml:snakeyaml:1.31</p>
</p>
<p></p>
</details>
|
non_process
|
spring boot starter validation jar vulnerabilities highest severity is vulnerable library spring boot starter validation jar path to dependency file pom xml path to vulnerable library home wss scanner repository org yaml snakeyaml snakeyaml jar vulnerabilities cve severity cvss dependency type fixed in remediation available high snakeyaml jar transitive n a details cve vulnerable library snakeyaml jar yaml parser and emitter for java library home page a href path to dependency file pom xml path to vulnerable library home wss scanner repository org yaml snakeyaml snakeyaml jar dependency hierarchy spring boot starter validation jar root library spring boot starter jar x snakeyaml jar vulnerable library found in base branch develop vulnerability details the package org yaml snakeyaml from and before are vulnerable to denial of service dos due missing to nested depth limitation for collections publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction none scope unchanged impact metrics confidentiality impact none integrity impact none availability impact high for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution org yaml snakeyaml
| 0
|
993
| 2,594,414,975
|
IssuesEvent
|
2015-02-20 03:04:32
|
BALL-Project/ball
|
https://api.github.com/repos/BALL-Project/ball
|
opened
|
PeptideCapProcessor does not generate BALL ResidueChecker approved fragments
|
C: BALL Core P: major T: defect
|
**Reported by akdehof on 29 May 41842109 10:17 UTC**
The resulting structures do not validate with the ResidueChecker:
(something like ResidueChecker: did not find... E:ACE::OXT (template was ACE-M)
did not find atom H of I:NME in the reference residue NME-M -- seems like
the residue checker does not figure out the correct terminality of the residues
(probably incorrectly stored in the FragDB)
|
1.0
|
PeptideCapProcessor does not generate BALL ResidueChecker approved fragments - **Reported by akdehof on 29 May 41842109 10:17 UTC**
The resulting structures do not validate with the ResidueChecker:
(something like ResidueChecker: did not find... E:ACE::OXT (template was ACE-M)
did not find atom H of I:NME in the reference residue NME-M -- seems like
the residue checker does not figure out the correct terminality of the residues
(probably incorrectly stored in the FragDB)
|
non_process
|
peptidecapprocessor does not generate ball residuechecker approved fragments reported by akdehof on may utc the resulting structures do not validate with the residuechecker something like residuechecker did not find e ace oxt template was ace m did not find atom h of i nme in the reference reisude nme m seems like the residue checker does not figure out the correct terminality of the residues probably incorrectly stored in the fragdb
| 0
|
450,151
| 31,884,322,821
|
IssuesEvent
|
2023-09-16 19:08:37
|
lodash/lodash
|
https://api.github.com/repos/lodash/lodash
|
closed
|
Document release process
|
documentation issue bankruptcy
|
There are a lot of intricacies in working with lodash-cli, which does a lot of regex magic to split the monolithic lodash.js file into separate files. This needs to be documented so we can reduce the bus factor of lodash, and ensure that releases are not broken (like the stretch of releases from v4.17.16 to v4.17.18).
|
1.0
|
Document release process - There are a lot of intricacies in working with lodash-cli, which does a lot of regex magic to split the monolithic lodash.js file into separate files. This needs to be documented so we can reduce the bus factor of lodash, and ensure that releases are not broken (like the stretch of releases from v4.17.16 to v4.17.18).
|
non_process
|
document release process there are a lot of intricacies in working with lodash cli which does a lot of regex magic to split the monolithic lodash js file into separate files this needs to be documented so we can reduce the bus factor of lodash and ensure that releases are not broken like the stretch of releases from to
| 0
|
26,603
| 13,065,669,645
|
IssuesEvent
|
2020-07-30 20:13:11
|
paaksing/django-cassiopeia
|
https://api.github.com/repos/paaksing/django-cassiopeia
|
closed
|
An improvement is planned to fasten large calculations
|
enhancement performance
|
# Issue presented by maintainer:
After making some production tests, the overall flow is slowing down when doing large and extensive calculations. I am aware of this issue since today, the reason it is slowing down is due to the fact that the Django-Cache is not a python object, so it needs to be `pickled` to store it and `unpickled` to pull it back. Each time it pickles or unpickles it takes around 0.02 sec (in my own pc), _**but then doing extensive calculations, e.g. when fetching all items+runes+spells+champs+everything in a match it normally has to pull some 300 times, then the 0.02 sec can sum up pretty quickly to around 6 entire seconds.**_
# Temporary Solution (for definitive solution read below):
A temporary solution in case you would need to do this __kind__ of calculations would be to add the standard `Cache` from the original `cassiopeia` (no need to install, it is a dependency) right before the `DjangoCache` *__with a LOW AMOUNT OF EXPIRATION to prevent memory issues, then at the end of your calculations make sure to flush the cache (a lot better if you do it at start too)__*. As follow:
* Change the `CASSIOPEIA_PIPELINE` in your settings.py
```python
CASSIOPEIA_PIPELINE = {
# Include the cassiopeia.core objects that you see is constantly pulling
# e.g. champions, runes, items, summoner spells
# Do not cache datas that are pull once, e.g. Matches in the scenario above
"Cache" : {
"expirations_map" : {
td(minutes=30): ["r", "r+", "i", "i+", "ss", "ss+", "c", "c+"],
0: ["*+"]
},
},
"DjangoCache" : {},
"DDragon": {},
"RiotAPI": {},
}
```
* Now in your extensive calculation functions:
```python
from django_cassiopeia import cassiopeia as cass
def imhuge(args):
cass.configuration.settings.expire_sinks()
# flush at start
# big calculation going here
# flush at end
cass.configuration.settings.expire_sinks()
```
Above is to make sure you don't run into memory issues and also get your performance good temporary
# Definitive Fix is scheduled on version 1.2.0
A fix to this is scheduled on next release, __expected release time around August with the next big boy release: Django 3.1__**. The idea of this fix is to give a temporary python object Cache for each long term extensive calculation (which is most likely done periodically), the mechanism used may be in a form similar to `@method_decorator` which you can pass the needed time to stay inside the object, and its memory usage is aborted after function execution.
*__Currently I am deciding between if this object is accessible globally during the execution of the main function, meaning that other functions that run in parallel will benefit from it too, or just restrict to functions that have the decorator (which needs more work for me to do lol) and possibly roll out a more perfect rate limiter.__*
|
True
|
An improvement is planned to fasten large calculations - # Issue presented by maintainer:
After making some production tests, the overall flow is slowing down when doing large and extensive calculations. I am aware of this issue since today, the reason it is slowing down is due to the fact that the Django-Cache is not a python object, so it needs to be `pickled` to store it and `unpickled` to pull it back. Each time it pickles or unpickles it takes around 0.02 sec (in my own pc), _**but then doing extensive calculations, e.g. when fetching all items+runes+spells+champs+everything in a match it normally has to pull some 300 times, then the 0.02 sec can sum up pretty quickly to around 6 entire seconds.**_
# Temporary Solution (for definitive solution read below):
A temporary solution in case you would need to do this __kind__ of calculations would be to add the standard `Cache` from the original `cassiopeia` (no need to install, it is a dependency) right before the `DjangoCache` *__with a LOW AMOUNT OF EXPIRATION to prevent memory issues, then at the end of your calculations make sure to flush the cache (a lot better if you do it at start too)__*. As follow:
* Change the `CASSIOPEIA_PIPELINE` in your settings.py
```python
CASSIOPEIA_PIPELINE = {
# Include the cassiopeia.core objects that you see is constantly pulling
# e.g. champions, runes, items, summoner spells
# Do not cache datas that are pull once, e.g. Matches in the scenario above
"Cache" : {
"expirations_map" : {
td(minutes=30): ["r", "r+", "i", "i+", "ss", "ss+", "c", "c+"],
0: ["*+"]
},
},
"DjangoCache" : {},
"DDragon": {},
"RiotAPI": {},
}
```
* Now in your extensive calculation functions:
```python
from django_cassiopeia import cassiopeia as cass
def imhuge(args):
cass.configuration.settings.expire_sinks()
# flush at start
# big calculation going here
# flush at end
cass.configuration.settings.expire_sinks()
```
Above is to make sure you don't run into memory issues and also get your performance good temporary
# Definitive Fix is scheduled on version 1.2.0
A fix to this is scheduled on next release, __expected release time around August with the next big boy release: Django 3.1__**. The idea of this fix is to give a temporary python object Cache for each long term extensive calculation (which is most likely done periodically), the mechanism used may be in a form similar to `@method_decorator` which you can pass the needed time to stay inside the object, and its memory usage is aborted after function execution.
*__Currently I am deciding between if this object is accessible globally during the execution of the main function, meaning that other functions that run in parallel will benefit from it too, or just restrict to functions that have the decorator (which needs more work for me to do lol) and possibly roll out a more perfect rate limiter.__*
|
non_process
|
an improvement is planned to fasten large calculations issue presented by maintainer after making some production tests the overall flow is slowing down when doing large and extensive calculations i am aware of this issue since today the reason it is slowing down is due to the fact that the django cache is not a python object so it needs to be pickled to store it and unpickled to pull it back each time it pickles or unpickles it takes around sec in my own pc but then doing extensive calculations e g when fetching all items runes spells champs everything in a match it normally has to pull some times then the sec can sum up pretty quickly to around entire seconds temporary solution for definitive solution read below a temporary solution in case you would need to do this kind of calculations would be to add the standard cache from the original cassiopeia no need to install it is a dependency right before the djangocache with a low amount of expiration to prevent memory issues then at the end of your calculations make sure to flush the cache a lot better if you do it at start too as follow change the cassiopeia pipeline in your settings py python cassiopeia pipeline include the cassiopeia core objects that you see is constantly pulling e g champions runes items summoner spells do not cache datas that are pull once e g matches in the scenario above cache expirations map td minutes djangocache ddragon riotapi now in your extensive calculation functions python from django cassiopeia import cassiopeia as cass def imhuge args cass configuration settings expire sinks flush at start big calculation going here flush at end cass configuration settings expire sinks above is to make sure you don t run into memory issues and also get your performance good temporary definitive fix is scheduled on version a fix to this is scheduled on next release expected release time around august with the next big boy release django the idea of this fix is to give a temporary python object cache for each long term extensive calculation which is most likely done periodically the mechanism used may be in a form similar to method decorator which you can pass the needed time to stay inside the object and its memory usage is aborted after function execution currently i am deciding between if this object is accessible globally during the execution of the main function meaning that other functions that run in parallel will benefit from it too or just restrict to functions that have the decorator which needs more work for me to do lol and possibly roll out a more perfect rate limiter
| 0
|
266,433
| 28,326,161,815
|
IssuesEvent
|
2023-04-11 07:22:05
|
UK-Export-Finance/nestjs-template
|
https://api.github.com/repos/UK-Export-Finance/nestjs-template
|
closed
|
typeorm-0.3.13.tgz: 1 vulnerabilities (highest severity is: 5.5)
|
Mend: dependency security vulnerability
|
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>typeorm-0.3.13.tgz</b></p></summary>
<p></p>
<p>Path to dependency file: /package.json</p>
<p>Path to vulnerable library: /node_modules/xml2js/package.json</p>
<p>
<p>Found in HEAD commit: <a href="https://github.com/UK-Export-Finance/nestjs-template/commit/6890bd27a868af409cbaea667c2899e8a45887f1">6890bd27a868af409cbaea667c2899e8a45887f1</a></p></details>
## Vulnerabilities
| CVE | Severity | <img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS | Dependency | Type | Fixed in (typeorm version) | Remediation Available |
| ------------- | ------------- | ----- | ----- | ----- | ------------- | --- |
| [CVE-2023-0842](https://www.mend.io/vulnerability-database/CVE-2023-0842) | <img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Medium | 5.5 | xml2js-0.4.23.tgz | Transitive | N/A* | ❌ |
<p>*For some transitive vulnerabilities, there is no version of direct dependency with a fix. Check the "Details" section below to see if there is a version of transitive dependency where vulnerability is fixed.</p>
## Details
<details>
<summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> CVE-2023-0842</summary>
### Vulnerable Library - <b>xml2js-0.4.23.tgz</b></p>
<p>Simple XML to JavaScript object converter.</p>
<p>Library home page: <a href="https://registry.npmjs.org/xml2js/-/xml2js-0.4.23.tgz">https://registry.npmjs.org/xml2js/-/xml2js-0.4.23.tgz</a></p>
<p>Path to dependency file: /package.json</p>
<p>Path to vulnerable library: /node_modules/xml2js/package.json</p>
<p>
Dependency Hierarchy:
- typeorm-0.3.13.tgz (Root Library)
- :x: **xml2js-0.4.23.tgz** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/UK-Export-Finance/nestjs-template/commit/6890bd27a868af409cbaea667c2899e8a45887f1">6890bd27a868af409cbaea667c2899e8a45887f1</a></p>
<p>Found in base branch: <b>main</b></p>
</p>
<p></p>
### Vulnerability Details
<p>
xml2js version 0.4.23 allows an external attacker to edit or add new properties to an object. This is possible because the application does not properly validate incoming JSON keys, thus allowing the __proto__ property to be edited.
<p>Publish Date: 2023-04-05
<p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2023-0842>CVE-2023-0842</a></p>
</p>
<p></p>
### CVSS 3 Score Details (<b>5.5</b>)
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Local
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: Required
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: None
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
<p></p>
Step up your Open Source Security Game with Mend [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
</details>
|
True
|
typeorm-0.3.13.tgz: 1 vulnerabilities (highest severity is: 5.5) - <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>typeorm-0.3.13.tgz</b></p></summary>
<p></p>
<p>Path to dependency file: /package.json</p>
<p>Path to vulnerable library: /node_modules/xml2js/package.json</p>
<p>
<p>Found in HEAD commit: <a href="https://github.com/UK-Export-Finance/nestjs-template/commit/6890bd27a868af409cbaea667c2899e8a45887f1">6890bd27a868af409cbaea667c2899e8a45887f1</a></p></details>
## Vulnerabilities
| CVE | Severity | <img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS | Dependency | Type | Fixed in (typeorm version) | Remediation Available |
| ------------- | ------------- | ----- | ----- | ----- | ------------- | --- |
| [CVE-2023-0842](https://www.mend.io/vulnerability-database/CVE-2023-0842) | <img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Medium | 5.5 | xml2js-0.4.23.tgz | Transitive | N/A* | ❌ |
<p>*For some transitive vulnerabilities, there is no version of direct dependency with a fix. Check the "Details" section below to see if there is a version of transitive dependency where vulnerability is fixed.</p>
## Details
<details>
<summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> CVE-2023-0842</summary>
### Vulnerable Library - <b>xml2js-0.4.23.tgz</b></p>
<p>Simple XML to JavaScript object converter.</p>
<p>Library home page: <a href="https://registry.npmjs.org/xml2js/-/xml2js-0.4.23.tgz">https://registry.npmjs.org/xml2js/-/xml2js-0.4.23.tgz</a></p>
<p>Path to dependency file: /package.json</p>
<p>Path to vulnerable library: /node_modules/xml2js/package.json</p>
<p>
Dependency Hierarchy:
- typeorm-0.3.13.tgz (Root Library)
- :x: **xml2js-0.4.23.tgz** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/UK-Export-Finance/nestjs-template/commit/6890bd27a868af409cbaea667c2899e8a45887f1">6890bd27a868af409cbaea667c2899e8a45887f1</a></p>
<p>Found in base branch: <b>main</b></p>
</p>
<p></p>
### Vulnerability Details
<p>
xml2js version 0.4.23 allows an external attacker to edit or add new properties to an object. This is possible because the application does not properly validate incoming JSON keys, thus allowing the __proto__ property to be edited.
<p>Publish Date: 2023-04-05
<p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2023-0842>CVE-2023-0842</a></p>
</p>
<p></p>
### CVSS 3 Score Details (<b>5.5</b>)
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Local
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: Required
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: None
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
<p></p>
Step up your Open Source Security Game with Mend [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
</details>
|
non_process
|
typeorm tgz vulnerabilities highest severity is vulnerable library typeorm tgz path to dependency file package json path to vulnerable library node modules package json found in head commit a href vulnerabilities cve severity cvss dependency type fixed in typeorm version remediation available medium tgz transitive n a for some transitive vulnerabilities there is no version of direct dependency with a fix check the details section below to see if there is a version of transitive dependency where vulnerability is fixed details cve vulnerable library tgz simple xml to javascript object converter library home page a href path to dependency file package json path to vulnerable library node modules package json dependency hierarchy typeorm tgz root library x tgz vulnerable library found in head commit a href found in base branch main vulnerability details version allows an external attacker to edit or add new properties to an object this is possible because the application does not properly validate incoming json keys thus allowing the proto property to be edited publish date url a href cvss score details base score metrics exploitability metrics attack vector local attack complexity low privileges required none user interaction required scope unchanged impact metrics confidentiality impact none integrity impact none availability impact high for more information on scores click a href step up your open source security game with mend
| 0
|
115,391
| 14,739,398,084
|
IssuesEvent
|
2021-01-07 07:07:31
|
discreetlogcontracts/dlcspecs
|
https://api.github.com/repos/discreetlogcontracts/dlcspecs
|
closed
|
(Tagged) Hashing and Signing by Oracle of Message
|
design oracle
|
As @LLFourn mentions [here](https://github.com/discreetlogcontracts/dlcspecs/pull/55#discussion_r515442902), we need to specify a signing algorithm which includes a tagged hash for oracles signing messages.
We also need to decide what goes into this signed message other than just the outcome (e.g. the event id as well?)
|
1.0
|
(Tagged) Hashing and Signing by Oracle of Message - As @LLFourn mentions [here](https://github.com/discreetlogcontracts/dlcspecs/pull/55#discussion_r515442902), we need to specify a signing algorithm which includes a tagged hash for oracles signing messages.
We also need to decide what goes into this signed message other than just the outcome (e.g. the event id as well?)
|
non_process
|
tagged hashing and signing by oracle of message as llfourn mentions we need to specify a signing algorithm which includes a tagged hash for oracles signing messages we also need to decide what goes into this signed message other than just the outcome e g the event id as well
| 0
|
51,614
| 13,635,148,591
|
IssuesEvent
|
2020-09-25 02:02:21
|
nasifimtiazohi/openmrs-module-metadatamapping-1.3.4
|
https://api.github.com/repos/nasifimtiazohi/openmrs-module-metadatamapping-1.3.4
|
opened
|
CVE-2016-10542 (High) detected in ws-0.8.0.tgz
|
security vulnerability
|
## CVE-2016-10542 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>ws-0.8.0.tgz</b></p></summary>
<p>simple to use, blazing fast and thoroughly tested websocket client, server and console for node.js, up-to-date against RFC-6455</p>
<p>Library home page: <a href="https://registry.npmjs.org/ws/-/ws-0.8.0.tgz">https://registry.npmjs.org/ws/-/ws-0.8.0.tgz</a></p>
<p>Path to dependency file: openmrs-module-metadatamapping-1.3.4/owa/package.json</p>
<p>Path to vulnerable library: openmrs-module-metadatamapping-1.3.4/owa/node_modules/ws/package.json</p>
<p>
Dependency Hierarchy:
- browser-sync-2.11.1.tgz (Root Library)
- socket.io-1.3.7.tgz
- engine.io-1.5.4.tgz
- :x: **ws-0.8.0.tgz** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/nasifimtiazohi/openmrs-module-metadatamapping-1.3.4/commit/dbf14247c8c0a7b64ae301a8ab42df19cc87107e">dbf14247c8c0a7b64ae301a8ab42df19cc87107e</a></p>
<p>Found in base branch: <b>master</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
ws is a "simple to use, blazing fast and thoroughly tested websocket client, server and console for node.js, up-to-date against RFC-6455". By sending an overly long websocket payload to a `ws` server, it is possible to crash the node process. This affects ws 1.1.0 and earlier.
<p>Publish Date: 2018-05-31
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2016-10542>CVE-2016-10542</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>7.5</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: None
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2015-8858">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2015-8858</a></p>
<p>Release Date: 2018-12-15</p>
<p>Fix Resolution: v2.4.24</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
|
True
|
CVE-2016-10542 (High) detected in ws-0.8.0.tgz - ## CVE-2016-10542 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>ws-0.8.0.tgz</b></p></summary>
<p>simple to use, blazing fast and thoroughly tested websocket client, server and console for node.js, up-to-date against RFC-6455</p>
<p>Library home page: <a href="https://registry.npmjs.org/ws/-/ws-0.8.0.tgz">https://registry.npmjs.org/ws/-/ws-0.8.0.tgz</a></p>
<p>Path to dependency file: openmrs-module-metadatamapping-1.3.4/owa/package.json</p>
<p>Path to vulnerable library: openmrs-module-metadatamapping-1.3.4/owa/node_modules/ws/package.json</p>
<p>
Dependency Hierarchy:
- browser-sync-2.11.1.tgz (Root Library)
- socket.io-1.3.7.tgz
- engine.io-1.5.4.tgz
- :x: **ws-0.8.0.tgz** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/nasifimtiazohi/openmrs-module-metadatamapping-1.3.4/commit/dbf14247c8c0a7b64ae301a8ab42df19cc87107e">dbf14247c8c0a7b64ae301a8ab42df19cc87107e</a></p>
<p>Found in base branch: <b>master</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
ws is a "simple to use, blazing fast and thoroughly tested websocket client, server and console for node.js, up-to-date against RFC-6455". By sending an overly long websocket payload to a `ws` server, it is possible to crash the node process. This affects ws 1.1.0 and earlier.
<p>Publish Date: 2018-05-31
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2016-10542>CVE-2016-10542</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>7.5</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: None
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2015-8858">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2015-8858</a></p>
<p>Release Date: 2018-12-15</p>
<p>Fix Resolution: v2.4.24</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
|
non_process
|
cve high detected in ws tgz cve high severity vulnerability vulnerable library ws tgz simple to use blazing fast and thoroughly tested websocket client server and console for node js up to date against rfc library home page a href path to dependency file openmrs module metadatamapping owa package json path to vulnerable library openmrs module metadatamapping owa node modules ws package json dependency hierarchy browser sync tgz root library socket io tgz engine io tgz x ws tgz vulnerable library found in head commit a href found in base branch master vulnerability details ws is a simple to use blazing fast and thoroughly tested websocket client server and console for node js up to date against rfc by sending an overly long websocket payload to a ws server it is possible to crash the node process this affects ws and earlier publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction none scope unchanged impact metrics confidentiality impact none integrity impact none availability impact high for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution step up your open source security game with whitesource
| 0
|
171,206
| 13,223,621,388
|
IssuesEvent
|
2020-08-17 17:34:01
|
phetsims/QA
|
https://api.github.com/repos/phetsims/QA
|
closed
|
Android App: Dev test 4 (1.1-dev.4)
|
QA:dev-test priority:2-high
|
QA, please conduct another dev test of the android app.
The internal test track version can be installed from this link: https://play.google.com/apps/internaltest/4697762600441754675.
NOTE: Please delete the existing "PhET" app before installing this one. It is OK if the previous production app ("PhET Simulations") is still installed, but previous dev versions of this app can cause problems if they remain.
Please check the following:
- [x] phetsims/phet-android-app#131 Exit bar overlaps search bar
- [x] phetsims/phet-android-app#128 Dark colored version of website within app
- [x] phetsims/phet-android-app#126 Orientation change not triggering column reflow for sim layout
- [x] phetsims/phet-android-app#125 Sim titles not properly wrapping in list view
- [x] phetsims/phet-android-app#123 Layout issues and lack of scrolling on info pages
- [x] phetsims/phet-android-app#120 Cursor does not change when it should
- [x] phetsims/phet-android-app#119 Clicking on bottom bar can enter sims
- [x] phetsims/phet-android-app#117 Empty filter list says simulations are loading
- [x] phetsims/phet-android-app#116 Can's see phet button/view change button if list is too short
- [x] phetsims/phet-android-app#115 Unable to scroll to bottom of list view with filters
- [x] phetsims/phet-android-app#114 Continue button too far down
- [x] phetsims/phet-android-app#113 In-sim favorite button isn't active for favorite sims
- [x] phetsims/phet-android-app#112 Hard to hit info button
- [x] phetsims/phet-android-app#111 Info cut off for sim info if tablet is in portrait mode
- [x] phetsims/phet-android-app#106 Reset Favorites and Update Simulations do nothing
- [x] phetsims/phet-android-app#100 Search bar doesn't deactivate
- [x] phetsims/phet-android-app#97 Some sims don't open
Additionally, please test the app in the spanish locale. All strings should literally be "xxxx", except for sim titles and descriptions (which should be in spanish if they are translated on phet.colorado.edu). The is for phetsims/phet-android-app#109
<hr>
There are a few issues from last time that aren't fixed yet, including keyboard navigation issues and a couple that have open questions.
@ariel-phet - What is the priority of this test?
|
1.0
|
Android App: Dev test 4 (1.1-dev.4) - QA, please conduct another dev test of the android app.
The internal test track version can be installed from this link: https://play.google.com/apps/internaltest/4697762600441754675.
NOTE: Please delete the existing "PhET" app before installing this one. It is OK if the previous production app ("PhET Simulations") is still installed, but previous dev versions of this app can cause problems if they remain.
Please check the following:
- [x] phetsims/phet-android-app#131 Exit bar overlaps search bar
- [x] phetsims/phet-android-app#128 Dark colored version of website within app
- [x] phetsims/phet-android-app#126 Orientation change not triggering column reflow for sim layout
- [x] phetsims/phet-android-app#125 Sim titles not properly wrapping in list view
- [x] phetsims/phet-android-app#123 Layout issues and lack of scrolling on info pages
- [x] phetsims/phet-android-app#120 Cursor does not change when it should
- [x] phetsims/phet-android-app#119 Clicking on bottom bar can enter sims
- [x] phetsims/phet-android-app#117 Empty filter list says simulations are loading
- [x] phetsims/phet-android-app#116 Can's see phet button/view change button if list is too short
- [x] phetsims/phet-android-app#115 Unable to scroll to bottom of list view with filters
- [x] phetsims/phet-android-app#114 Continue button too far down
- [x] phetsims/phet-android-app#113 In-sim favorite button isn't active for favorite sims
- [x] phetsims/phet-android-app#112 Hard to hit info button
- [x] phetsims/phet-android-app#111 Info cut off for sim info if tablet is in portrait mode
- [x] phetsims/phet-android-app#106 Reset Favorites and Update Simulations do nothing
- [x] phetsims/phet-android-app#100 Search bar doesn't deactivate
- [x] phetsims/phet-android-app#97 Some sims don't open
Additionally, please test the app in the spanish locale. All strings should literally be "xxxx", except for sim titles and descriptions (which should be in spanish if they are translated on phet.colorado.edu). The is for phetsims/phet-android-app#109
<hr>
There are a few issues from last time that aren't fixed yet, including keyboard navigation issues and a couple that have open questions.
@ariel-phet - What is the priority of this test?
|
non_process
|
android app dev test dev qa please conduct another dev test of the android app the internal test track version can be installed from this link note please delete the existing phet app before installing this one it is ok if the previous production app phet simulations is still installed but previous dev versions of this app can cause problems if they remain please check the following phetsims phet android app exit bar overlaps search bar phetsims phet android app dark colored version of website within app phetsims phet android app orientation change not triggering column reflow for sim layout phetsims phet android app sim titles not properly wrapping in list view phetsims phet android app layout issues and lack of scrolling on info pages phetsims phet android app cursor does not change when it should phetsims phet android app clicking on bottom bar can enter sims phetsims phet android app empty filter list says simulations are loading phetsims phet android app can s see phet button view change button if list is too short phetsims phet android app unable to scroll to bottom of list view with filters phetsims phet android app continue button too far down phetsims phet android app in sim favorite button isn t active for favorite sims phetsims phet android app hard to hit info button phetsims phet android app info cut off for sim info if tablet is in portrait mode phetsims phet android app reset favorites and update simulations do nothing phetsims phet android app search bar doesn t deactivate phetsims phet android app some sims don t open additionally please test the app in the spanish locale all strings should literally be xxxx except for sim titles and descriptions which should be in spanish if they are translated on phet colorado edu the is for phetsims phet android app there are a few issues from last time that aren t fixed yet including keyboard navigation issues and a couple that have open questions ariel phet what is the priority of this test
| 0
|
143,506
| 22,060,235,516
|
IssuesEvent
|
2022-05-30 16:45:54
|
iotaledger/explorer
|
https://api.github.com/repos/iotaledger/explorer
|
closed
|
[Task]: Design for alias address metadata
|
priority:2 type:feature type:ux:design network:shimmer
|
### Task description
As part of the design changes we have to do for explorer, we need to display the information associated with an _alias_ address when a user searches for it.
[Link](https://hackmd.io/diWWFAVjSOaqQYJXDQaAAg) to the user stories document.
[Ticket](https://github.com/iotaledger/explorer/issues/232) for address search design.
### Requirements
When a user searches for an alias address, it should not only display the balance of the Alias (as described in [this ticket](https://github.com/iotaledger/explorer/issues/232)), but should also have other (foundry, state index, state metadata) specific information.
### Acceptance criteria
When the alias address is searched for, it should display:
- [Foundries controller](https://hackmd.io/diWWFAVjSOaqQYJXDQaAAg?both#Foundry-output-JSON-example) by the alias;
- [State index](https://hackmd.io/diWWFAVjSOaqQYJXDQaAAg?both#alias-state-index) of the alias;
- State metadata for the alias
### Creation checklist
- [ ] I have assigned this task to the correct people
- [X] I have added the most appropriate labels
- [x] I have linked the correct milestone and/or project
|
1.0
|
[Task]: Design for alias address metadata - ### Task description
As part of the design changes we have to do for explorer, we need to display the information associated with an _alias_ address when a user searches for it.
[Link](https://hackmd.io/diWWFAVjSOaqQYJXDQaAAg) to the user stories document.
[Ticket](https://github.com/iotaledger/explorer/issues/232) for address search design.
### Requirements
When a user searches for an alias address, it should not only display the balance of the Alias (as described in [this ticket](https://github.com/iotaledger/explorer/issues/232)), but should also have other (foundry, state index, state metadata) specific information.
### Acceptance criteria
When the alias address is searched for, it should display:
- [Foundries controller](https://hackmd.io/diWWFAVjSOaqQYJXDQaAAg?both#Foundry-output-JSON-example) by the alias;
- [State index](https://hackmd.io/diWWFAVjSOaqQYJXDQaAAg?both#alias-state-index) of the alias;
- State metadata for the alias
### Creation checklist
- [ ] I have assigned this task to the correct people
- [X] I have added the most appropriate labels
- [x] I have linked the correct milestone and/or project
|
non_process
|
design for alias address metadata task description as part of the design changes we have to do for explorer we need to display the information associated with an alias address when a user searches for it to the user stories document for address search design requirements when a user searches for an alias address it should not only display the balance of the alias as described in but should also have other foundry state index state metadata specific information acceptance criteria when the alias address is searched for it should display by the alias of the alias state metadata for the alias creation checklist i have assigned this task to the correct people i have added the most appropriate labels i have linked the correct milestone and or project
| 0
|
54,975
| 13,943,438,942
|
IssuesEvent
|
2020-10-22 23:07:29
|
GooseWSS/node-express-realworld-example-app
|
https://api.github.com/repos/GooseWSS/node-express-realworld-example-app
|
opened
|
CVE-2018-3728 (High) detected in hoek-2.16.3.tgz
|
security vulnerability
|
## CVE-2018-3728 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>hoek-2.16.3.tgz</b></p></summary>
<p>General purpose node utilities</p>
<p>Library home page: <a href="https://registry.npmjs.org/hoek/-/hoek-2.16.3.tgz">https://registry.npmjs.org/hoek/-/hoek-2.16.3.tgz</a></p>
<p>Path to dependency file: node-express-realworld-example-app/package.json</p>
<p>Path to vulnerable library: node-express-realworld-example-app/node_modules/hoek/package.json</p>
<p>
Dependency Hierarchy:
- jsonwebtoken-7.1.9.tgz (Root Library)
- joi-6.10.1.tgz
- :x: **hoek-2.16.3.tgz** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/GooseWSS/node-express-realworld-example-app/commit/532d31bcb4a1ee884633f7e684aab7c5bb204ca5">532d31bcb4a1ee884633f7e684aab7c5bb204ca5</a></p>
<p>Found in base branch: <b>master</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
hoek node module before 4.2.0 and 5.0.x before 5.0.3 suffers from a Modification of Assumed-Immutable Data (MAID) vulnerability via 'merge' and 'applyToDefaults' functions, which allows a malicious user to modify the prototype of "Object" via __proto__, causing the addition or modification of an existing property that will exist on all objects.
<p>Publish Date: 2018-03-30
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2018-3728>CVE-2018-3728</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>8.8</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: Low
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: High
- Integrity Impact: High
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://nvd.nist.gov/vuln/detail/CVE-2018-3728">https://nvd.nist.gov/vuln/detail/CVE-2018-3728</a></p>
<p>Release Date: 2018-03-30</p>
<p>Fix Resolution: 4.2.1,5.0.3</p>
</p>
</details>
<p></p>
<!-- <REMEDIATE>{"isOpenPROnVulnerability":true,"isPackageBased":true,"isDefaultBranch":true,"packages":[{"packageType":"javascript/Node.js","packageName":"hoek","packageVersion":"2.16.3","isTransitiveDependency":true,"dependencyTree":"jsonwebtoken:7.1.9;joi:6.10.1;hoek:2.16.3","isMinimumFixVersionAvailable":true,"minimumFixVersion":"4.2.1,5.0.3"}],"vulnerabilityIdentifier":"CVE-2018-3728","vulnerabilityDetails":"hoek node module before 4.2.0 and 5.0.x before 5.0.3 suffers from a Modification of Assumed-Immutable Data (MAID) vulnerability via \u0027merge\u0027 and \u0027applyToDefaults\u0027 functions, which allows a malicious user to modify the prototype of \"Object\" via __proto__, causing the addition or modification of an existing property that will exist on all objects.","vulnerabilityUrl":"https://vuln.whitesourcesoftware.com/vulnerability/CVE-2018-3728","cvss3Severity":"high","cvss3Score":"8.8","cvss3Metrics":{"A":"High","AC":"Low","PR":"Low","S":"Unchanged","C":"High","UI":"None","AV":"Network","I":"High"},"extraData":{}}</REMEDIATE> -->
|
True
|
CVE-2018-3728 (High) detected in hoek-2.16.3.tgz - ## CVE-2018-3728 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>hoek-2.16.3.tgz</b></p></summary>
<p>General purpose node utilities</p>
<p>Library home page: <a href="https://registry.npmjs.org/hoek/-/hoek-2.16.3.tgz">https://registry.npmjs.org/hoek/-/hoek-2.16.3.tgz</a></p>
<p>Path to dependency file: node-express-realworld-example-app/package.json</p>
<p>Path to vulnerable library: node-express-realworld-example-app/node_modules/hoek/package.json</p>
<p>
Dependency Hierarchy:
- jsonwebtoken-7.1.9.tgz (Root Library)
- joi-6.10.1.tgz
- :x: **hoek-2.16.3.tgz** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/GooseWSS/node-express-realworld-example-app/commit/532d31bcb4a1ee884633f7e684aab7c5bb204ca5">532d31bcb4a1ee884633f7e684aab7c5bb204ca5</a></p>
<p>Found in base branch: <b>master</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
hoek node module before 4.2.0 and 5.0.x before 5.0.3 suffers from a Modification of Assumed-Immutable Data (MAID) vulnerability via 'merge' and 'applyToDefaults' functions, which allows a malicious user to modify the prototype of "Object" via __proto__, causing the addition or modification of an existing property that will exist on all objects.
<p>Publish Date: 2018-03-30
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2018-3728>CVE-2018-3728</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>8.8</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: Low
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: High
- Integrity Impact: High
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://nvd.nist.gov/vuln/detail/CVE-2018-3728">https://nvd.nist.gov/vuln/detail/CVE-2018-3728</a></p>
<p>Release Date: 2018-03-30</p>
<p>Fix Resolution: 4.2.1,5.0.3</p>
</p>
</details>
<p></p>
<!-- <REMEDIATE>{"isOpenPROnVulnerability":true,"isPackageBased":true,"isDefaultBranch":true,"packages":[{"packageType":"javascript/Node.js","packageName":"hoek","packageVersion":"2.16.3","isTransitiveDependency":true,"dependencyTree":"jsonwebtoken:7.1.9;joi:6.10.1;hoek:2.16.3","isMinimumFixVersionAvailable":true,"minimumFixVersion":"4.2.1,5.0.3"}],"vulnerabilityIdentifier":"CVE-2018-3728","vulnerabilityDetails":"hoek node module before 4.2.0 and 5.0.x before 5.0.3 suffers from a Modification of Assumed-Immutable Data (MAID) vulnerability via \u0027merge\u0027 and \u0027applyToDefaults\u0027 functions, which allows a malicious user to modify the prototype of \"Object\" via __proto__, causing the addition or modification of an existing property that will exist on all objects.","vulnerabilityUrl":"https://vuln.whitesourcesoftware.com/vulnerability/CVE-2018-3728","cvss3Severity":"high","cvss3Score":"8.8","cvss3Metrics":{"A":"High","AC":"Low","PR":"Low","S":"Unchanged","C":"High","UI":"None","AV":"Network","I":"High"},"extraData":{}}</REMEDIATE> -->
|
non_process
|
cve high detected in hoek tgz cve high severity vulnerability vulnerable library hoek tgz general purpose node utilities library home page a href path to dependency file node express realworld example app package json path to vulnerable library node express realworld example app node modules hoek package json dependency hierarchy jsonwebtoken tgz root library joi tgz x hoek tgz vulnerable library found in head commit a href found in base branch master vulnerability details hoek node module before and x before suffers from a modification of assumed immutable data maid vulnerability via merge and applytodefaults functions which allows a malicious user to modify the prototype of object via proto causing the addition or modification of an existing property that will exist on all objects publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required low user interaction none scope unchanged impact metrics confidentiality impact high integrity impact high availability impact high for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution isopenpronvulnerability true ispackagebased true isdefaultbranch true packages vulnerabilityidentifier cve vulnerabilitydetails hoek node module before and x before suffers from a modification of assumed immutable data maid vulnerability via and functions which allows a malicious user to modify the prototype of object via proto causing the addition or modification of an existing property that will exist on all objects vulnerabilityurl
| 0
|
22,681
| 4,826,989,552
|
IssuesEvent
|
2016-11-07 12:04:00
|
jiscdev/analytics-udd
|
https://api.github.com/repos/jiscdev/analytics-udd
|
closed
|
UDD 1.2.6+ Identification of Primary Keys for all UDD Entities (can be done for 1.2.7 & onwards?)
|
Documentation change
|
Hi Wilbert,
Having spoken to Huw this morning, and had a quick look at the UDD field-level documentation, can we build in (to 1.2.7 please, prior to any release) a primary key indicator for all entities please?
This would be picked-up by Huw’s initialisation for the UDD validation tool currently under construction.
Could this be done with a similar approach for mandatory/ optional denotation done previously.
Many Thanks,
Rob
|
1.0
|
UDD 1.2.6+ Identification of Primary Keys for all UDD Entities (can be done for 1.2.7 & onwards?) - Hi Wilbert,
Having spoken to Huw this morning, and had a quick look at the UDD field-level documentation, can we build in (to 1.2.7 please, prior to any release) a primary key indicator for all entities please?
This would be picked-up by Huw’s initialisation for the UDD validation tool currently under construction.
Could this be done with a similar approach for mandatory/ optional denotation done previously.
Many Thanks,
Rob
|
non_process
|
udd identification of primary keys for all udd entities can be done for onwards hi wilbert having spoken to huw this morning and had a quick look at the udd field level documentation can we build in to please prior to any release a primary key indicator for all entities please this would be picked up by huw’s initialisation for the udd validation tool currently under construction could this be done with a similar approach for mandatory optional denotation done previously many thanks rob
| 0
|
5,628
| 8,482,018,536
|
IssuesEvent
|
2018-10-25 17:19:42
|
easy-software-ufal/annotations_repos
|
https://api.github.com/repos/easy-software-ufal/annotations_repos
|
opened
|
dotnet/BenchmarkDotNet IterationSetup is not running before each benchmark invocation
|
ADA C# test wrong processing
|
Issue:`https://github.com/dotnet/BenchmarkDotNet/issues/730`
PR:`https://github.com/alinasmirnova/BenchmarkDotNet/commit/2c0947463612ef59d5fb0862bc952d84c930c54e`
|
1.0
|
dotnet/BenchmarkDotNet IterationSetup is not running before each benchmark invocation - Issue:`https://github.com/dotnet/BenchmarkDotNet/issues/730`
PR:`https://github.com/alinasmirnova/BenchmarkDotNet/commit/2c0947463612ef59d5fb0862bc952d84c930c54e`
|
process
|
dotnet benchmarkdotnet iterationsetup is not running before each benchmark invocation issue pr
| 1
|
138,720
| 20,674,469,689
|
IssuesEvent
|
2022-03-10 07:46:50
|
Wayodeni/ec
|
https://api.github.com/repos/Wayodeni/ec
|
closed
|
REDESIGN OF EVERYTHING
|
design
|
- [x] Get rid of the rounded corners
- [x] Change the product category selection menu
- [x] Make all fonts smaller
- [x] Change the product catalog
**Product menu**

**Main page**

**Catalog variant 1**

**Catalog variant 2**

.
|
1.0
|
REDESIGN OF EVERYTHING - - [x] Get rid of the rounded corners
- [x] Change the product category selection menu
- [x] Make all fonts smaller
- [x] Change the product catalog
**Product menu**

**Main page**

**Catalog variant 1**

**Catalog variant 2**

.
|
non_process
|
redesign of everything get rid of the rounded corners change the product category selection menu make all fonts smaller change the product catalog product menu main page catalog variant catalog variant
| 0
|
19,391
| 25,530,474,192
|
IssuesEvent
|
2022-11-29 07:58:55
|
geneontology/go-ontology
|
https://api.github.com/repos/geneontology/go-ontology
|
closed
|
NTR: endothelial cell-matrix adhesion via collagen
|
New term request cellular processes
|
Please provide as much information as you can:
* **Suggested term label:**
1. endothelial cell-matrix adhesion via collagen
2. endothelial cell-matrix adhesion via laminin
3. endothelial cell-matrix adhesion via vitronectin
* **Definition (free text)**
1. The binding of an endothelial cell to the extracellular matrix via collagen.
2. The binding of an endothelial cell to the extracellular matrix via laminin.
3. The binding of an endothelial cell to the extracellular matrix via vitronectin.
* **Reference, in format PMID:#######**
PMID:8442917
* **Gene product name and ID to be annotated to this term**
Dicer1, RGD:1309381
* **Parent term(s)**
endothelial cell-matrix adhesion (GO:0090673)
* **Children terms (if applicable)** Should any existing terms that should be moved underneath this new proposed term?
1., 2., 3. The corresponding regulation, positive, and negative regulation terms are also requested.
* **Synonyms (please specify, EXACT, BROAD, NARROW or RELATED)**
* **Cross-references**
* For enzymes, please provide RHEA and/or EC numbers.
* Can also provide MetaCyc, KEGG, Wikipedia, and other links.
* **Any other information**
|
1.0
|
NTR: endothelial cell-matrix adhesion via collagen - Please provide as much information as you can:
* **Suggested term label:**
1. endothelial cell-matrix adhesion via collagen
2. endothelial cell-matrix adhesion via laminin
3. endothelial cell-matrix adhesion via vitronectin
* **Definition (free text)**
1. The binding of an endothelial cell to the extracellular matrix via collagen.
2. The binding of an endothelial cell to the extracellular matrix via laminin.
3. The binding of an endothelial cell to the extracellular matrix via vitronectin.
* **Reference, in format PMID:#######**
PMID:8442917
* **Gene product name and ID to be annotated to this term**
Dicer1, RGD:1309381
* **Parent term(s)**
endothelial cell-matrix adhesion (GO:0090673)
* **Children terms (if applicable)** Should any existing terms that should be moved underneath this new proposed term?
1., 2., 3. The corresponding regulation, positive, and negative regulation terms are also requested.
* **Synonyms (please specify, EXACT, BROAD, NARROW or RELATED)**
* **Cross-references**
* For enzymes, please provide RHEA and/or EC numbers.
* Can also provide MetaCyc, KEGG, Wikipedia, and other links.
* **Any other information**
|
process
|
ntr endothelial cell matrix adhesion via collagen please provide as much information as you can suggested term label endothelial cell matrix adhesion via collagen endothelial cell matrix adhesion via laminin endothelial cell matrix adhesion via vitronectin definition free text the binding of an endothelial cell to the extracellular matrix via collagen the binding of an endothelial cell to the extracellular matrix via laminin the binding of an endothelial cell to the extracellular matrix via vitronectin reference in format pmid pmid gene product name and id to be annotated to this term rgd parent term s endothelial cell matrix adhesion go children terms if applicable should any existing terms that should be moved underneath this new proposed term the corresponding regulation positive and negative regulation terms are also requested synonyms please specify exact broad narrow or related cross references for enzymes please provide rhea and or ec numbers can also provide metacyc kegg wikipedia and other links any other information
| 1
|
11,630
| 14,489,497,658
|
IssuesEvent
|
2020-12-11 00:03:39
|
kubernetes/minikube
|
https://api.github.com/repos/kubernetes/minikube
|
closed
|
fix update kubernetes version script
|
kind/process priority/important-longterm
|
1- it won't run if you run it from the root folder (have to cd into it)
2- it didn't update to the newest version
```
DefaultKubernetesVersion = "v1.20.0"
NewestKubernetesVersion = "v1.20.0"
```
I expected v1.20.1-rc.0 for newest
3- it didn't run "make generate-docs"
|
1.0
|
fix update kubernetes version script - 1- it won't run if you run it from the root folder (have to cd into it)
2- it didn't update to the newest version
```
DefaultKubernetesVersion = "v1.20.0"
NewestKubernetesVersion = "v1.20.0"
```
I expected v1.20.1-rc.0 for newest
3- it didn't run "make generate-docs"
|
process
|
fix update kubernetes version script it won t run if you run it from the root folder have to cd into it it didn t update to the newest version defaultkubernetesversion newestkubernetesversion i expected rc for newest it didn t run make generate docs
| 1
|
6,803
| 15,357,282,738
|
IssuesEvent
|
2021-03-01 13:30:32
|
dfds/backstage
|
https://api.github.com/repos/dfds/backstage
|
opened
|
Create Crossplane XRD for easily deploying the ServiceProxy operator across our clusters
|
Architecture Refinement
|
- [ ] Create Crossplane provider
- [ ] Create XRD
.. Flesh out this task a bit more when we get to it and or have more knowledge of XRD and Crossplane providers.
|
1.0
|
Create Crossplane XRD for easily deploying the ServiceProxy operator across our clusters - - [ ] Create Crossplane provider
- [ ] Create XRD
.. Flesh out this task a bit more when we get to it and or have more knowledge of XRD and Crossplane providers.
|
non_process
|
create crossplane xrd for easily deploying the serviceproxy operator across our clusters create crossplane provider create xrd flesh out this task a bit more when we get to it and or have more knowledge of xrd and crossplane providers
| 0
|
372,971
| 11,031,224,954
|
IssuesEvent
|
2019-12-06 17:16:56
|
kubernetes-sigs/cluster-api
|
https://api.github.com/repos/kubernetes-sigs/cluster-api
|
closed
|
machine-controller problem self-detection
|
area/machine kind/feature priority/important-longterm
|
/kind feature
**Describe the solution you'd like**
In v1alpha1, as well as v1alpha2, the machine-controller can get 'stuck' in a variety of states. For instance, the state of a machine might never progress past "creating" (or as in v1alpha1, each reconcile loop always results in calling the Create() function of the actuator interface) state. The reasons for this can vary, but we don't currently have any way to track things that are 'stuck'.
We should implement functionality in the various controllers that can smartly detect when a state has not progressed after a certain configurable interval of time (or other specific condition) has passed. We should implement a method of notifying users of this condition.
Since kubernetes does not have any built-in alarming functionality, one way we could notify users is with labels. We could add specific labels to machine-objects (or other cluster-api objects) as a primary indication of a problem. The reason we might use labels instead of a status field or annotation is because they are easy to filter for the end-user (someone can write a script to filter for just problematic objects, or collect metrics and set an alarm for # of objects with the label). Unfortunately, CRDs do not support filtering as deeply as native kubernetes objects: https://github.com/kubernetes/kubernetes/issues/53459
**Anything else you would like to add:**
This logic probably best belongs inside the machine-controller (or other respective controller) because that controller knows the most about any given machine-object at any given time. We already have a reconcile loop that periodically updates machines, so there's not a great reason to re-invent this inspection elsewhere. On especially busy clusters that frequently scale up and down machines, users need a quick way to determine if there are any problems and where those problems are.
|
1.0
|
machine-controller problem self-detection - /kind feature
**Describe the solution you'd like**
In v1alpha1, as well as v1alpha2, the machine-controller can get 'stuck' in a variety of states. For instance, the state of a machine might never progress past "creating" (or as in v1alpha1, each reconcile loop always results in calling the Create() function of the actuator interface) state. The reasons for this can vary, but we don't currently have any way to track things that are 'stuck'.
We should implement functionality in the various controllers that can smartly detect when a state has not progressed after a certain configurable interval of time (or other specific condition) has passed. We should implement a method of notifying users of this condition.
Since kubernetes does not have any built-in alarming functionality, one way we could notify users is with labels. We could add specific labels to machine-objects (or other cluster-api objects) as a primary indication of a problem. The reason we might use labels instead of a status field or annotation is because they are easy to filter for the end-user (someone can write a script to filter for just problematic objects, or collect metrics and set an alarm for # of objects with the label). Unfortunately, CRDs do not support filtering as deeply as native kubernetes objects: https://github.com/kubernetes/kubernetes/issues/53459
**Anything else you would like to add:**
This logic probably best belongs inside the machine-controller (or other respective controller) because that controller knows the most about any given machine-object at any given time. We already have a reconcile loop that periodically updates machines, so there's not a great reason to re-invent this inspection elsewhere. On especially busy clusters that frequently scale up and down machines, users need a quick way to determine if there are any problems and where those problems are.
|
non_process
|
machine controller problem self detection kind feature describe the solution you d like in as well as the machine controller can get stuck in a variety of states for instance the state of a machine might never progress past creating or as in each reconcile loop always results in calling the create function of the actuator interface state the reasons for this can vary but we don t currently have any way to track things that are stuck we should implement functionality in the various controllers that can smartly detect when a state has not progressed after a certain configurable interval of time or other specific condition has passed we should implement a method of notifying users of this condition since kubernetes does not have any built in alarming functionality one way we could notify users is with labels we could add specific labels to machine objects or other cluster api objects as a primary indication of a problem the reason we might use labels instead of a status field or annotation is because they are easy to filter for the end user someone can write a script to filter for just problematic objects or collect metrics and set an alarm for of objects with the label unfortunately crds do not support filtering as deeply as native kubernetes objects anything else you would like to add this logic probably best belongs inside the machine controller or other respective controller because that controller knows the most about any given machine object at any given time we already have a reconcile loop that periodically updates machines so there s not a great reason to re invent this inspection elsewhere on especially busy clusters that frequently scale up and down machines users need a quick way to determine if there are any problems and where those problems are
| 0
|
12,964
| 15,341,717,151
|
IssuesEvent
|
2021-02-27 13:14:42
|
amor71/LiuAlgoTrader
|
https://api.github.com/repos/amor71/LiuAlgoTrader
|
closed
|
enhance backtest to make it zipline-like
|
enhancement in-process
|
**Is your feature request related to a problem? Please describe.**
Unlike zipline back-testing capabilities, the LiuAlgoTrader back-tester re-plays a past trading session. However, when developing new strategies, specifically for swing-trading over a period of time, current LiuAlgoTrader back-testing capabilities fall short.
**Describe the solution you'd like**
Extend the `backtester` app to receive a time-frame and run strategies (similar to how zipline works)
|
1.0
|
enhance backtest to make it zipline-like - **Is your feature request related to a problem? Please describe.**
Unlike zipline back-testing capabilities, the LiuAlgoTrader back-tester re-plays a past trading session. However, when developing new strategies, specifically for swing-trading over a period of time, current LiuAlgoTrader back-testing capabilities fall short.
**Describe the solution you'd like**
Extend the `backtester` app to receive a time-frame and run strategies (similar to how zipline works)
|
process
|
enhance backtest to make it zipline like is your feature request related to a problem please describe unlike zipline back testing capabilities the liualgotrader back tester re plays a past trading session however when developing new strategies specifically for swing trading over a period of time current liualgotrader back testing capabilities fall short describe the solution you d like extend the backtester app to receive a time frame and run strategies similar to how zipline works
| 1
|
324
| 2,772,018,539
|
IssuesEvent
|
2015-05-02 07:42:51
|
dineshkummarc/DoctorX2.0
|
https://api.github.com/repos/dineshkummarc/DoctorX2.0
|
opened
|
New process and schema for locking bugs into github.
|
Process-Management
|
h5. Bug Summary:
h5. Environment:
*
h5.Prerequisites:
*
h5. Steps to reproduce:
h5. Expected result:
h5. Actual result:
h5. Comments:
1. Please find attached screenshots '' for more detail info.
|
1.0
|
New process and schema for locking bugs into github. - h5. Bug Summary:
h5. Environment:
*
h5.Prerequisites:
*
h5. Steps to reproduce:
h5. Expected result:
h5. Actual result:
h5. Comments:
1. Please find attached screenshots '' for more detail info.
|
process
|
new process and schema for locking bugs into github bug summary environment prerequisites steps to reproduce expected result actual result comments please find attached screenshots for more detail info
| 1
|
184,942
| 14,290,909,225
|
IssuesEvent
|
2020-11-23 21:42:25
|
NVIDIA/spark-rapids
|
https://api.github.com/repos/NVIDIA/spark-rapids
|
closed
|
[BUG] Unchecked type warning in SparkQueryCompareTestSuite
|
P2 bug test
|
**Describe the bug**
Compiling the code results in this warning in SparkQueryCompareTestSuite:
```
[WARNING] /home/jlowe/src/rapids-plugin-4-spark/tests/src/test/scala/com/nvidia/spark/rapids/SparkQueryCompareTestSuite.scala:821: abstract type T is unchecked since it is eliminated by erasure
[WARNING] case Failure(e) if e.isInstanceOf[T] => {
```
If I'm reading that error message correctly, the condition on that `case` is not being treated properly.
|
1.0
|
[BUG] Unchecked type warning in SparkQueryCompareTestSuite - **Describe the bug**
Compiling the code results in this warning in SparkQueryCompareTestSuite:
```
[WARNING] /home/jlowe/src/rapids-plugin-4-spark/tests/src/test/scala/com/nvidia/spark/rapids/SparkQueryCompareTestSuite.scala:821: abstract type T is unchecked since it is eliminated by erasure
[WARNING] case Failure(e) if e.isInstanceOf[T] => {
```
If I'm reading that error message correctly, the condition on that `case` is not being treated properly.
|
non_process
|
unchecked type warning in sparkquerycomparetestsuite describe the bug compiling the code results in this warning in sparkquerycomparetestsuite home jlowe src rapids plugin spark tests src test scala com nvidia spark rapids sparkquerycomparetestsuite scala abstract type t is unchecked since it is eliminated by erasure case failure e if e isinstanceof if i m reading that error message correctly the condition on that case is not being treated properly
| 0
|
181,704
| 6,663,458,362
|
IssuesEvent
|
2017-10-02 16:27:54
|
mozilla/addons-frontend
|
https://api.github.com/repos/mozilla/addons-frontend
|
closed
|
Add additional data to the homepage
|
component: homepage priority: mvp project: 2017 Q3 project: desktop pages state: pull request ready triaged
|
We want to have other data on the homepage aside from the carousel and extension and theme categories.
We need to decide on what that data should be - which is a question for editorial. It should be data we already have as much as possible. So most popular, featured etc.
Here's the mock which doesn't have much on what the data should be
<img width="709" alt="amo_desktop_-_master_sketch" src="https://user-images.githubusercontent.com/1514/30451222-e371bc72-998a-11e7-9e0c-ba2a6cc97fc8.png">
|
1.0
|
Add additional data to the homepage - We want to have other data on the homepage aside from the carousel and extension and theme categories.
We need to decide on what that data should be - which is a question for editorial. It should be data we already have as much as possible. So most popular, featured etc.
Here's the mock which doesn't have much on what the data should be
<img width="709" alt="amo_desktop_-_master_sketch" src="https://user-images.githubusercontent.com/1514/30451222-e371bc72-998a-11e7-9e0c-ba2a6cc97fc8.png">
|
non_process
|
add additional data to the homepage we want to have other data on the homepage aside from the carousel and extension and theme categories we need to decide on what that data should be which is a question for editorial it should be data we already have as much as possible so most popular featured etc here s the mock which doesn t have much on what the data should be img width alt amo desktop master sketch src
| 0
|
249,884
| 18,858,246,426
|
IssuesEvent
|
2021-11-12 09:32:55
|
bnjmnt4n/pe
|
https://api.github.com/repos/bnjmnt4n/pe
|
opened
|
User guide: Parameters section is overwhelming
|
severity.Low type.DocumentationBug
|
Immediately after describing the command formats, a long list of parameters are displayed. This can be very overwhelming for first-time users, as there is a huge list of information about the fields to parse even before they are fully aware of what commands are available and what fields to use.
There is also a column called "Application field", which is confusing since it has not been defined, and the user does not know yet that there are different kinds of commands, which do not deal with editing application data.
There should have been a brief description of what an application contains, perhaps along with the GUI, so the user is at least aware what application fields are available. Instead of listing the parameters in alphabetical order, they could also perhaps be split into application fields and non-application fields to add a clearer distinction.
<!--session: 1636703947846-5b6d2c9e-cf95-4bb5-aa0f-3e8d70e1f2d7-->
<!--Version: Web v3.4.1-->
|
1.0
|
User guide: Parameters section is overwhelming - Immediately after describing the command formats, a long list of parameters are displayed. This can be very overwhelming for first-time users, as there is a huge list of information about the fields to parse even before they are fully aware of what commands are available and what fields to use.
There is also a column called "Application field", which is confusing since it has not been defined, and the user does not know yet that there are different kinds of commands, which do not deal with editing application data.
There should have been a brief description of what an application contains, perhaps along with the GUI, so the user is at least aware what application fields are available. Instead of listing the parameters in alphabetical order, they could also perhaps be split into application fields and non-application fields to add a clearer distinction.
<!--session: 1636703947846-5b6d2c9e-cf95-4bb5-aa0f-3e8d70e1f2d7-->
<!--Version: Web v3.4.1-->
|
non_process
|
user guide parameters section is overwhelming immediately after describing the command formats a long list of parameters are displayed this can be very overwhelming for first time users as there is a huge list of information about the fields to parse even before they are fully aware of what commands are available and what fields to use there is also a column called application field which is confusing since it has not been defined and the user does not know yet that there are different kinds of commands which do not deal with editing application data there should have been a brief description of what an application contains perhaps along with the gui so the user is at least aware what application fields are available instead of listing the parameters in alphabetical order they could also perhaps be split into application fields and non application fields to add a clearer distinction
| 0
|
1,442
| 4,007,794,450
|
IssuesEvent
|
2016-05-12 19:22:32
|
nodejs/node
|
https://api.github.com/repos/nodejs/node
|
closed
|
child_process.spawnSync() seems to be truncating arguments containing '&'
|
child_process windows
|
* **Version**: 5.10.1, 6.1.0
* **Platform**: Win 10 x64
* **Subsystem**:
Using the child_process.spawnSync() function call, the args I pass are being cut if they contain an & in them. Here's my snippet:
```js
var child=require('child_process');
var args =[];
args [0]="a";
args [1]="b";
args [2]="\"a&b\"";
var r = child.spawnSync("c:/temp/test.cmd", args);
r.stdout.toString();
```
Here's the output:
```
> var child=require('child_process');
undefined
> var args =[];
undefined
> args [0]="a";
'a'
> args [1]="b";
'b'
> args [2]="\"a&b\"";
'"a&b"'
> var r = child.spawnSync("c:/temp/test.cmd", args);
undefined
> r.stdout.toString();
'\r\nC:\\Program Files\\nodejs6.1.0>echo args=a b "\\"a \r\nargs=a b "\\"a\r\n'
```
Notice that the 3rd arg, "a&b" is passed only as "a
here's the contents of test.cmd:
```
echo args=%*
```
And here's the same output from commandline:
```
C:\temp>test a b "a&b"
C:\temp>echo args=a b "a&b"
args=a b "a&b"
```
[edit: fixed formatting - bnoordhuis]
|
1.0
|
child_process.spawnSync() seems to be truncating arguments containing '&' - * **Version**: 5.10.1, 6.1.0
* **Platform**: Win 10 x64
* **Subsystem**:
Using the child_process.spawnSync() function call, the args I pass are being cut if they contain an & in them. Here's my snippet:
```js
var child=require('child_process');
var args =[];
args [0]="a";
args [1]="b";
args [2]="\"a&b\"";
var r = child.spawnSync("c:/temp/test.cmd", args);
r.stdout.toString();
```
Here's the output:
```
> var child=require('child_process');
undefined
> var args =[];
undefined
> args [0]="a";
'a'
> args [1]="b";
'b'
> args [2]="\"a&b\"";
'"a&b"'
> var r = child.spawnSync("c:/temp/test.cmd", args);
undefined
> r.stdout.toString();
'\r\nC:\\Program Files\\nodejs6.1.0>echo args=a b "\\"a \r\nargs=a b "\\"a\r\n'
```
Notice that the 3rd arg, "a&b" is passed only as "a
here's the contents of test.cmd:
```
echo args=%*
```
And here's the same output from commandline:
```
C:\temp>test a b "a&b"
C:\temp>echo args=a b "a&b"
args=a b "a&b"
```
[edit: fixed formatting - bnoordhuis]
|
process
|
child process spawnsync seems to be truncating arguments containing version platform win subsystem using the child process spawnsync function call the args i pass are being cut if they contain an in them here s my snippet js var child require child process var args args a args b args a b var r child spawnsync c temp test cmd args r stdout tostring here s the output var child require child process undefined var args undefined args a a args b b args a b a b var r child spawnsync c temp test cmd args undefined r stdout tostring r nc program files echo args a b a r nargs a b a r n notice that the arg a b is passed only as a here s the contents of test cmd echo args and here s the same output from commandline c temp test a b a b c temp echo args a b a b args a b a b
| 1
|
117,908
| 4,728,898,906
|
IssuesEvent
|
2016-10-18 17:11:29
|
MRN-Code/penny-collector
|
https://api.github.com/repos/MRN-Code/penny-collector
|
closed
|
Investigate configuration passing to PennyCollector#collect
|
high priority question
|
This seems odd: https://github.com/MRN-Code/penny-collector/blob/5ec7c0c0ee03f7c07f8d5863200f86cb5eac183d/src/cli.js#L53. Why aren’t we passing configuration during instantiation?
|
1.0
|
Investigate configuration passing to PennyCollector#collect - This seems odd: https://github.com/MRN-Code/penny-collector/blob/5ec7c0c0ee03f7c07f8d5863200f86cb5eac183d/src/cli.js#L53. Why aren’t we passing configuration during instantiation?
|
non_process
|
investigate configuration passing to pennycollector collect this seems odd why aren’t we passing configuration during instantiation
| 0
|
8,836
| 11,947,139,037
|
IssuesEvent
|
2020-04-03 09:24:01
|
JuliaRobotics/IncrementalInference.jl
|
https://api.github.com/repos/JuliaRobotics/IncrementalInference.jl
|
closed
|
temporarily removed addprocs from runtests.jl
|
multiprocess needs testing regression
|
this suddenly seems to fail for some reason:
```julia
using IncrementalInference
...
addprocs(3)
...
using IncrementalInference
```
|
1.0
|
temporarily removed addprocs from runtests.jl - this suddenly seems to fail for some reason:
```julia
using IncrementalInference
...
addprocs(3)
...
using IncrementalInference
```
|
process
|
temporarily removed addprocs from runtests jl this suddenly seems to fail for some reason julia using incrementalinference addprocs using incrementalinference
| 1
|
10,449
| 12,402,980,714
|
IssuesEvent
|
2020-05-21 13:06:26
|
ProgVal/Limnoria
|
https://api.github.com/repos/ProgVal/Limnoria
|
closed
|
IRC connection dying due to uncaught exception
|
Bug Python 2 compatibility
|
My bot dies every once in a while with this message:
ERROR 2019-03-26T15:43:15 supybot Uncaught exception in in drivers.run:
Traceback (most recent call last):
File "/usr/lib/python2.7/site-packages/supybot/drivers/__init__.py", line 104, in run
driver.run()
File "/usr/lib/python2.7/site-packages/supybot/drivers/Socket.py", line 195, in run
self._select()
File "/usr/lib/python2.7/site-packages/supybot/drivers/Socket.py", line 171, in _select
instance._read()
File "/usr/lib/python2.7/site-packages/supybot/drivers/Socket.py", line 222, in _read
self._sendIfMsgs()
File "/usr/lib/python2.7/site-packages/supybot/drivers/Socket.py", line 135, in _sendIfMsgs
self.outbuffer += ''.join(map(str, msgs))
UnicodeDecodeError: 'ascii' codec can't decode byte 0xe2 in position 130: ordinal not in range(128)
ERROR 2019-03-26T15:43:15 supybot Exception id: 0xd0be9
INFO 2019-03-26T15:43:15 supybot Removing driver SocketDriver(Irc object for IrcServer).
INFO 2019-03-26T16:14:00 supybot Flushers flushed and garbage collected.
Apparently there has been a different bug related to mishandling of UTF-8 characters.
https://github.com/ProgVal/Limnoria/issues/1230
I'm wondering if this is the same type of problem, just in a different situation.
|
True
|
IRC connection dying due to uncaught exception - My bot dies every once in a while with this message:
ERROR 2019-03-26T15:43:15 supybot Uncaught exception in in drivers.run:
Traceback (most recent call last):
File "/usr/lib/python2.7/site-packages/supybot/drivers/__init__.py", line 104, in run
driver.run()
File "/usr/lib/python2.7/site-packages/supybot/drivers/Socket.py", line 195, in run
self._select()
File "/usr/lib/python2.7/site-packages/supybot/drivers/Socket.py", line 171, in _select
instance._read()
File "/usr/lib/python2.7/site-packages/supybot/drivers/Socket.py", line 222, in _read
self._sendIfMsgs()
File "/usr/lib/python2.7/site-packages/supybot/drivers/Socket.py", line 135, in _sendIfMsgs
self.outbuffer += ''.join(map(str, msgs))
UnicodeDecodeError: 'ascii' codec can't decode byte 0xe2 in position 130: ordinal not in range(128)
ERROR 2019-03-26T15:43:15 supybot Exception id: 0xd0be9
INFO 2019-03-26T15:43:15 supybot Removing driver SocketDriver(Irc object for IrcServer).
INFO 2019-03-26T16:14:00 supybot Flushers flushed and garbage collected.
Apparently there has been a different bug related to mishandling of UTF-8 characters.
https://github.com/ProgVal/Limnoria/issues/1230
I'm wondering if this is the same type of problem, just in a different situation.
|
non_process
|
irc connection dying due to uncaught exception my bot dies every once in a while with this message error supybot uncaught exception in in drivers run traceback most recent call last file usr lib site packages supybot drivers init py line in run driver run file usr lib site packages supybot drivers socket py line in run self select file usr lib site packages supybot drivers socket py line in select instance read file usr lib site packages supybot drivers socket py line in read self sendifmsgs file usr lib site packages supybot drivers socket py line in sendifmsgs self outbuffer join map str msgs unicodedecodeerror ascii codec can t decode byte in position ordinal not in range error supybot exception id info supybot removing driver socketdriver irc object for ircserver info supybot flushers flushed and garbage collected apparently there has been a different bug related to mishandling of utf characters i m wondering if this is the same type of problem just in a different situation
| 0
|
409,168
| 11,957,972,186
|
IssuesEvent
|
2020-04-04 16:15:51
|
wevote/WebApp
|
https://api.github.com/repos/wevote/WebApp
|
opened
|
Limit the number of issue icons to show in popover
|
Difficulty: Easy Priority: 2
|
1. Switch to US Midterm 2018 election
2. Find Dianne Feinstein
3. Click on the issue "Social Security & Medicare" under her
We need to limit the number of issues that show up next to a group (I think 4 should work, but it may need to be 3), and show a "..." after that.

|
1.0
|
Limit the number of issue icons to show in popover - 1. Switch to US Midterm 2018 election
2. Find Dianne Feinstein
3. Click on the issue "Social Security & Medicare" under her
We need to limit the number of issues that show up next to a group (I think 4 should work, but it may need to be 3), and show a "..." after that.

|
non_process
|
limit the number of issue icons to show in popover switch to us midterm election find dianne feinstein click on the issue social security medicare under her we need to limit the number of issues that show up next to a group i think should work but it may need to be and show a after that
| 0
|
88,085
| 15,800,728,590
|
IssuesEvent
|
2021-04-03 01:01:16
|
jgeraigery/gradle
|
https://api.github.com/repos/jgeraigery/gradle
|
opened
|
CVE-2021-28165 (High) detected in jetty-io-9.4.5.v20170502.jar, jetty-io-9.2.28.v20190418.jar
|
security vulnerability
|
## CVE-2021-28165 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Libraries - <b>jetty-io-9.4.5.v20170502.jar</b>, <b>jetty-io-9.2.28.v20190418.jar</b></p></summary>
<p>
<details><summary><b>jetty-io-9.4.5.v20170502.jar</b></p></summary>
<p>The Eclipse Jetty Project</p>
<p>Library home page: <a href="http://www.eclipse.org/jetty">http://www.eclipse.org/jetty</a></p>
<p>Path to dependency file: gradle/subprojects/docs/src/snippets/play/multiproject/groovy/modules/user/build.gradle</p>
<p>Path to vulnerable library: gradle/caches/modules-2/files-2.1/org.eclipse.jetty/jetty-io/9.4.5.v20170502/76086f955d4e943396b8f340fd5bae3ce4da19d9/jetty-io-9.4.5.v20170502.jar</p>
<p>
Dependency Hierarchy:
- play-test_2.12-2.6.15.jar (Root Library)
- htmlunit-driver-2.27.jar
- htmlunit-2.27.jar
- websocket-client-9.4.5.v20170502.jar
- :x: **jetty-io-9.4.5.v20170502.jar** (Vulnerable Library)
</details>
<details><summary><b>jetty-io-9.2.28.v20190418.jar</b></p></summary>
<p>Administrative parent pom for Jetty modules</p>
<p>Library home page: <a href="http://www.eclipse.org/jetty">http://www.eclipse.org/jetty</a></p>
<p>Path to dependency file: gradle/subprojects/docs/src/samples/credentials-handling/publishing-credentials/groovy/maven-repository-stub/build.gradle</p>
<p>Path to vulnerable library: gradle/caches/modules-2/files-2.1/org.eclipse.jetty/jetty-io/9.2.28.v20190418/b9219469ee48d03436684890acac89a1053afbcf/jetty-io-9.2.28.v20190418.jar</p>
<p>
Dependency Hierarchy:
- wiremock-2.26.3.jar (Root Library)
- jetty-server-9.2.28.v20190418.jar
- :x: **jetty-io-9.2.28.v20190418.jar** (Vulnerable Library)
</details>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
In Eclipse Jetty 7.2.2 to 9.4.38, 10.0.0.alpha0 to 10.0.1, and 11.0.0.alpha0 to 11.0.1, CPU usage can reach 100% upon receiving a large invalid TLS frame.
<p>Publish Date: 2021-04-01
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2021-28165>CVE-2021-28165</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>7.5</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: None
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://github.com/eclipse/jetty.project/security/advisories/GHSA-26vr-8j45-3r4w">https://github.com/eclipse/jetty.project/security/advisories/GHSA-26vr-8j45-3r4w</a></p>
<p>Release Date: 2021-04-01</p>
<p>Fix Resolution: org.eclipse.jetty:jetty-io:9.4.39, org.eclipse.jetty:jetty-io:10.0.2, org.eclipse.jetty:jetty-io:11.0.2</p>
</p>
</details>
<p></p>
<!-- <REMEDIATE>{"isOpenPROnVulnerability":true,"isPackageBased":true,"isDefaultBranch":true,"packages":[{"packageType":"Java","groupId":"org.eclipse.jetty","packageName":"jetty-io","packageVersion":"9.4.5.v20170502","packageFilePaths":["/subprojects/docs/src/snippets/play/multiproject/groovy/modules/user/build.gradle"],"isTransitiveDependency":true,"dependencyTree":"com.typesafe.play:play-test_2.12:2.6.15;org.seleniumhq.selenium:htmlunit-driver:2.27;net.sourceforge.htmlunit:htmlunit:2.27;org.eclipse.jetty.websocket:websocket-client:9.4.5.v20170502;org.eclipse.jetty:jetty-io:9.4.5.v20170502","isMinimumFixVersionAvailable":true,"minimumFixVersion":"org.eclipse.jetty:jetty-io:9.4.39, org.eclipse.jetty:jetty-io:10.0.2, org.eclipse.jetty:jetty-io:11.0.2"},{"packageType":"Java","groupId":"org.eclipse.jetty","packageName":"jetty-io","packageVersion":"9.2.28.v20190418","packageFilePaths":["/subprojects/docs/src/samples/credentials-handling/publishing-credentials/groovy/maven-repository-stub/build.gradle"],"isTransitiveDependency":true,"dependencyTree":"com.github.tomakehurst:wiremock:2.26.3;org.eclipse.jetty:jetty-server:9.2.28.v20190418;org.eclipse.jetty:jetty-io:9.2.28.v20190418","isMinimumFixVersionAvailable":true,"minimumFixVersion":"org.eclipse.jetty:jetty-io:9.4.39, org.eclipse.jetty:jetty-io:10.0.2, org.eclipse.jetty:jetty-io:11.0.2"}],"baseBranches":[],"vulnerabilityIdentifier":"CVE-2021-28165","vulnerabilityDetails":"In Eclipse Jetty 7.2.2 to 9.4.38, 10.0.0.alpha0 to 10.0.1, and 11.0.0.alpha0 to 11.0.1, CPU usage can reach 100% upon receiving a large invalid TLS frame.","vulnerabilityUrl":"https://vuln.whitesourcesoftware.com/vulnerability/CVE-2021-28165","cvss3Severity":"high","cvss3Score":"7.5","cvss3Metrics":{"A":"High","AC":"Low","PR":"None","S":"Unchanged","C":"None","UI":"None","AV":"Network","I":"None"},"extraData":{}}</REMEDIATE> -->
|
True
|
CVE-2021-28165 (High) detected in jetty-io-9.4.5.v20170502.jar, jetty-io-9.2.28.v20190418.jar - ## CVE-2021-28165 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Libraries - <b>jetty-io-9.4.5.v20170502.jar</b>, <b>jetty-io-9.2.28.v20190418.jar</b></p></summary>
<p>
<details><summary><b>jetty-io-9.4.5.v20170502.jar</b></p></summary>
<p>The Eclipse Jetty Project</p>
<p>Library home page: <a href="http://www.eclipse.org/jetty">http://www.eclipse.org/jetty</a></p>
<p>Path to dependency file: gradle/subprojects/docs/src/snippets/play/multiproject/groovy/modules/user/build.gradle</p>
<p>Path to vulnerable library: gradle/caches/modules-2/files-2.1/org.eclipse.jetty/jetty-io/9.4.5.v20170502/76086f955d4e943396b8f340fd5bae3ce4da19d9/jetty-io-9.4.5.v20170502.jar</p>
<p>
Dependency Hierarchy:
- play-test_2.12-2.6.15.jar (Root Library)
- htmlunit-driver-2.27.jar
- htmlunit-2.27.jar
- websocket-client-9.4.5.v20170502.jar
- :x: **jetty-io-9.4.5.v20170502.jar** (Vulnerable Library)
</details>
<details><summary><b>jetty-io-9.2.28.v20190418.jar</b></p></summary>
<p>Administrative parent pom for Jetty modules</p>
<p>Library home page: <a href="http://www.eclipse.org/jetty">http://www.eclipse.org/jetty</a></p>
<p>Path to dependency file: gradle/subprojects/docs/src/samples/credentials-handling/publishing-credentials/groovy/maven-repository-stub/build.gradle</p>
<p>Path to vulnerable library: gradle/caches/modules-2/files-2.1/org.eclipse.jetty/jetty-io/9.2.28.v20190418/b9219469ee48d03436684890acac89a1053afbcf/jetty-io-9.2.28.v20190418.jar</p>
<p>
Dependency Hierarchy:
- wiremock-2.26.3.jar (Root Library)
- jetty-server-9.2.28.v20190418.jar
- :x: **jetty-io-9.2.28.v20190418.jar** (Vulnerable Library)
</details>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
In Eclipse Jetty 7.2.2 to 9.4.38, 10.0.0.alpha0 to 10.0.1, and 11.0.0.alpha0 to 11.0.1, CPU usage can reach 100% upon receiving a large invalid TLS frame.
<p>Publish Date: 2021-04-01
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2021-28165>CVE-2021-28165</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>7.5</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: None
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://github.com/eclipse/jetty.project/security/advisories/GHSA-26vr-8j45-3r4w">https://github.com/eclipse/jetty.project/security/advisories/GHSA-26vr-8j45-3r4w</a></p>
<p>Release Date: 2021-04-01</p>
<p>Fix Resolution: org.eclipse.jetty:jetty-io:9.4.39, org.eclipse.jetty:jetty-io:10.0.2, org.eclipse.jetty:jetty-io:11.0.2</p>
</p>
</details>
<p></p>
<!-- <REMEDIATE>{"isOpenPROnVulnerability":true,"isPackageBased":true,"isDefaultBranch":true,"packages":[{"packageType":"Java","groupId":"org.eclipse.jetty","packageName":"jetty-io","packageVersion":"9.4.5.v20170502","packageFilePaths":["/subprojects/docs/src/snippets/play/multiproject/groovy/modules/user/build.gradle"],"isTransitiveDependency":true,"dependencyTree":"com.typesafe.play:play-test_2.12:2.6.15;org.seleniumhq.selenium:htmlunit-driver:2.27;net.sourceforge.htmlunit:htmlunit:2.27;org.eclipse.jetty.websocket:websocket-client:9.4.5.v20170502;org.eclipse.jetty:jetty-io:9.4.5.v20170502","isMinimumFixVersionAvailable":true,"minimumFixVersion":"org.eclipse.jetty:jetty-io:9.4.39, org.eclipse.jetty:jetty-io:10.0.2, org.eclipse.jetty:jetty-io:11.0.2"},{"packageType":"Java","groupId":"org.eclipse.jetty","packageName":"jetty-io","packageVersion":"9.2.28.v20190418","packageFilePaths":["/subprojects/docs/src/samples/credentials-handling/publishing-credentials/groovy/maven-repository-stub/build.gradle"],"isTransitiveDependency":true,"dependencyTree":"com.github.tomakehurst:wiremock:2.26.3;org.eclipse.jetty:jetty-server:9.2.28.v20190418;org.eclipse.jetty:jetty-io:9.2.28.v20190418","isMinimumFixVersionAvailable":true,"minimumFixVersion":"org.eclipse.jetty:jetty-io:9.4.39, org.eclipse.jetty:jetty-io:10.0.2, org.eclipse.jetty:jetty-io:11.0.2"}],"baseBranches":[],"vulnerabilityIdentifier":"CVE-2021-28165","vulnerabilityDetails":"In Eclipse Jetty 7.2.2 to 9.4.38, 10.0.0.alpha0 to 10.0.1, and 11.0.0.alpha0 to 11.0.1, CPU usage can reach 100% upon receiving a large invalid TLS frame.","vulnerabilityUrl":"https://vuln.whitesourcesoftware.com/vulnerability/CVE-2021-28165","cvss3Severity":"high","cvss3Score":"7.5","cvss3Metrics":{"A":"High","AC":"Low","PR":"None","S":"Unchanged","C":"None","UI":"None","AV":"Network","I":"None"},"extraData":{}}</REMEDIATE> -->
|
non_process
|
cve high detected in jetty io jar jetty io jar cve high severity vulnerability vulnerable libraries jetty io jar jetty io jar jetty io jar the eclipse jetty project library home page a href path to dependency file gradle subprojects docs src snippets play multiproject groovy modules user build gradle path to vulnerable library gradle caches modules files org eclipse jetty jetty io jetty io jar dependency hierarchy play test jar root library htmlunit driver jar htmlunit jar websocket client jar x jetty io jar vulnerable library jetty io jar administrative parent pom for jetty modules library home page a href path to dependency file gradle subprojects docs src samples credentials handling publishing credentials groovy maven repository stub build gradle path to vulnerable library gradle caches modules files org eclipse jetty jetty io jetty io jar dependency hierarchy wiremock jar root library jetty server jar x jetty io jar vulnerable library vulnerability details in eclipse jetty to to and to cpu usage can reach upon receiving a large invalid tls frame publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction none scope unchanged impact metrics confidentiality impact none integrity impact none availability impact high for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution org eclipse jetty jetty io org eclipse jetty jetty io org eclipse jetty jetty io isopenpronvulnerability true ispackagebased true isdefaultbranch true packages istransitivedependency true dependencytree com typesafe play play test org seleniumhq selenium htmlunit driver net sourceforge htmlunit htmlunit org eclipse jetty websocket websocket client org eclipse jetty jetty io isminimumfixversionavailable true minimumfixversion org eclipse jetty jetty io org eclipse jetty jetty io org eclipse jetty jetty io packagetype java groupid org eclipse jetty packagename jetty io packageversion packagefilepaths istransitivedependency true dependencytree com github tomakehurst wiremock org eclipse jetty jetty server org eclipse jetty jetty io isminimumfixversionavailable true minimumfixversion org eclipse jetty jetty io org eclipse jetty jetty io org eclipse jetty jetty io basebranches vulnerabilityidentifier cve vulnerabilitydetails in eclipse jetty to to and to cpu usage can reach upon receiving a large invalid tls frame vulnerabilityurl
| 0
|
239,240
| 19,831,535,067
|
IssuesEvent
|
2022-01-20 12:31:54
|
elastic/kibana
|
https://api.github.com/repos/elastic/kibana
|
opened
|
Failing test: Chrome UI Functional Tests.test/functional/apps/console/_console·ts - console app console app should add comma after previous non empty line on autocomplete
|
failed-test
|
A test failed on a tracked branch
```
Error: retry.try timeout: Error: expected '{' to equal ','
at Assertion.assert (node_modules/@kbn/expect/expect.js:100:11)
at Assertion.be.Assertion.equal (node_modules/@kbn/expect/expect.js:227:8)
at Function.equal (node_modules/@kbn/expect/expect.js:531:15)
at /opt/local-ssd/buildkite/builds/kb-cigroup-4d-e738864bd76398a6/elastic/kibana-hourly/kibana/test/functional/apps/console/_console.ts:110:32
at runMicrotasks (<anonymous>)
at processTicksAndRejections (node:internal/process/task_queues:96:5)
at runAttempt (test/common/services/retry/retry_for_success.ts:29:15)
at retryForSuccess (test/common/services/retry/retry_for_success.ts:68:21)
at RetryService.try (test/common/services/retry/retry.ts:31:12)
at Context.<anonymous> (test/functional/apps/console/_console.ts:106:7)
at onFailure (test/common/services/retry/retry_for_success.ts:17:9)
at retryForSuccess (test/common/services/retry/retry_for_success.ts:59:13)
at RetryService.try (test/common/services/retry/retry.ts:31:12)
at Context.<anonymous> (test/functional/apps/console/_console.ts:106:7)
at Object.apply (node_modules/@kbn/test/target_node/functional_test_runner/lib/mocha/wrap_function.js:87:16)
```
First failure: [CI Build - main](https://buildkite.com/elastic/kibana-hourly/builds/8135#1807a1a1-9f0c-4fdd-b03b-f6882105c10c)
<!-- kibanaCiData = {"failed-test":{"test.class":"Chrome UI Functional Tests.test/functional/apps/console/_console·ts","test.name":"console app console app should add comma after previous non empty line on autocomplete","test.failCount":1}} -->
|
1.0
|
Failing test: Chrome UI Functional Tests.test/functional/apps/console/_console·ts - console app console app should add comma after previous non empty line on autocomplete - A test failed on a tracked branch
```
Error: retry.try timeout: Error: expected '{' to equal ','
at Assertion.assert (node_modules/@kbn/expect/expect.js:100:11)
at Assertion.be.Assertion.equal (node_modules/@kbn/expect/expect.js:227:8)
at Function.equal (node_modules/@kbn/expect/expect.js:531:15)
at /opt/local-ssd/buildkite/builds/kb-cigroup-4d-e738864bd76398a6/elastic/kibana-hourly/kibana/test/functional/apps/console/_console.ts:110:32
at runMicrotasks (<anonymous>)
at processTicksAndRejections (node:internal/process/task_queues:96:5)
at runAttempt (test/common/services/retry/retry_for_success.ts:29:15)
at retryForSuccess (test/common/services/retry/retry_for_success.ts:68:21)
at RetryService.try (test/common/services/retry/retry.ts:31:12)
at Context.<anonymous> (test/functional/apps/console/_console.ts:106:7)
at onFailure (test/common/services/retry/retry_for_success.ts:17:9)
at retryForSuccess (test/common/services/retry/retry_for_success.ts:59:13)
at RetryService.try (test/common/services/retry/retry.ts:31:12)
at Context.<anonymous> (test/functional/apps/console/_console.ts:106:7)
at Object.apply (node_modules/@kbn/test/target_node/functional_test_runner/lib/mocha/wrap_function.js:87:16)
```
First failure: [CI Build - main](https://buildkite.com/elastic/kibana-hourly/builds/8135#1807a1a1-9f0c-4fdd-b03b-f6882105c10c)
<!-- kibanaCiData = {"failed-test":{"test.class":"Chrome UI Functional Tests.test/functional/apps/console/_console·ts","test.name":"console app console app should add comma after previous non empty line on autocomplete","test.failCount":1}} -->
|
non_process
|
failing test chrome ui functional tests test functional apps console console·ts console app console app should add comma after previous non empty line on autocomplete a test failed on a tracked branch error retry try timeout error expected to equal at assertion assert node modules kbn expect expect js at assertion be assertion equal node modules kbn expect expect js at function equal node modules kbn expect expect js at opt local ssd buildkite builds kb cigroup elastic kibana hourly kibana test functional apps console console ts at runmicrotasks at processticksandrejections node internal process task queues at runattempt test common services retry retry for success ts at retryforsuccess test common services retry retry for success ts at retryservice try test common services retry retry ts at context test functional apps console console ts at onfailure test common services retry retry for success ts at retryforsuccess test common services retry retry for success ts at retryservice try test common services retry retry ts at context test functional apps console console ts at object apply node modules kbn test target node functional test runner lib mocha wrap function js first failure
| 0
|
156,217
| 5,965,424,868
|
IssuesEvent
|
2017-05-30 11:34:20
|
Shubham2301/phase2
|
https://api.github.com/repos/Shubham2301/phase2
|
closed
|
Additional Event attributes
|
priority : medium type : enhancement
|
1. Date of event
2. Last date of registration
3. Venue(s)
4. Event banner
|
1.0
|
Additional Event attributes - 1. Date of event
2. Last date of registration
3. Venue(s)
4. Event banner
|
non_process
|
additional event attributes date of event last date of registration venue s event banner
| 0
|
242,646
| 18,669,579,231
|
IssuesEvent
|
2021-10-30 12:56:15
|
OptiSchmopti/CsvProc9000
|
https://api.github.com/repos/OptiSchmopti/CsvProc9000
|
closed
|
Installation Guide is not clear
|
bug documentation
|
starting the bat as Admin is not the only thing you have to do, because the path is wrong then.
you have to start an admin cmd/ps, navigate there, and call the bat
|
1.0
|
Installation Guide is not clear - starting the bat as Admin is not the only thing you have to do, because the path is wrong then.
you have to start an admin cmd/ps, navigate there, and call the bat
|
non_process
|
installation guide is not clear starting the bat as admin is not the only thing you have to do because the path is wrong then you have to start an admin cmd ps navigate there and call the bat
| 0
|
181,420
| 21,658,662,050
|
IssuesEvent
|
2022-05-06 16:37:49
|
TIBCOSoftware/MqttStreamUsingNanoscale
|
https://api.github.com/repos/TIBCOSoftware/MqttStreamUsingNanoscale
|
closed
|
WS-2018-0232 (Medium) detected in multiple libraries - autoclosed
|
security vulnerability
|
## WS-2018-0232 - Medium Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Libraries - <b>underscore.string-2.3.3.tgz</b>, <b>underscore.string-2.4.0.tgz</b>, <b>underscore.string-3.0.3.tgz</b>, <b>underscore.string-2.2.1.tgz</b></p></summary>
<p>
<details><summary><b>underscore.string-2.3.3.tgz</b></p></summary>
<p>String manipulation extensions for Underscore.js javascript library.</p>
<p>Library home page: <a href="https://registry.npmjs.org/underscore.string/-/underscore.string-2.3.3.tgz">https://registry.npmjs.org/underscore.string/-/underscore.string-2.3.3.tgz</a></p>
<p>Path to dependency file: /MqttStreamUsingNanoscale/bower_components/promise-polyfill/package.json</p>
<p>Path to vulnerable library: MqttStreamUsingNanoscale/bower_components/promise-polyfill/node_modules/grunt-legacy-log-utils/node_modules/underscore.string/package.json</p>
<p>
Dependency Hierarchy:
- grunt-0.4.5.tgz (Root Library)
- grunt-legacy-log-0.1.3.tgz
- :x: **underscore.string-2.3.3.tgz** (Vulnerable Library)
</details>
<details><summary><b>underscore.string-2.4.0.tgz</b></p></summary>
<p>String manipulation extensions for Underscore.js javascript library.</p>
<p>Library home page: <a href="https://registry.npmjs.org/underscore.string/-/underscore.string-2.4.0.tgz">https://registry.npmjs.org/underscore.string/-/underscore.string-2.4.0.tgz</a></p>
<p>Path to dependency file: /MqttStreamUsingNanoscale/bower_components/promise-polyfill/package.json</p>
<p>Path to vulnerable library: MqttStreamUsingNanoscale/bower_components/promise-polyfill/node_modules/argparse/node_modules/underscore.string/package.json</p>
<p>
Dependency Hierarchy:
- grunt-0.4.5.tgz (Root Library)
- js-yaml-2.0.5.tgz
- argparse-0.1.16.tgz
- :x: **underscore.string-2.4.0.tgz** (Vulnerable Library)
</details>
<details><summary><b>underscore.string-3.0.3.tgz</b></p></summary>
<p>String manipulation extensions for Underscore.js javascript library.</p>
<p>Library home page: <a href="https://registry.npmjs.org/underscore.string/-/underscore.string-3.0.3.tgz">https://registry.npmjs.org/underscore.string/-/underscore.string-3.0.3.tgz</a></p>
<p>Path to dependency file: /MqttStreamUsingNanoscale/bower_components/webcomponentsjs/package.json</p>
<p>Path to vulnerable library: MqttStreamUsingNanoscale/bower_components/webcomponentsjs/node_modules/underscore.string/package.json</p>
<p>
Dependency Hierarchy:
- web-component-tester-4.3.7.tgz (Root Library)
- wd-0.3.12.tgz
- :x: **underscore.string-3.0.3.tgz** (Vulnerable Library)
</details>
<details><summary><b>underscore.string-2.2.1.tgz</b></p></summary>
<p>String manipulation extensions for Underscore.js javascript library.</p>
<p>Library home page: <a href="https://registry.npmjs.org/underscore.string/-/underscore.string-2.2.1.tgz">https://registry.npmjs.org/underscore.string/-/underscore.string-2.2.1.tgz</a></p>
<p>Path to dependency file: /MqttStreamUsingNanoscale/bower_components/promise-polyfill/package.json</p>
<p>Path to vulnerable library: MqttStreamUsingNanoscale/bower_components/promise-polyfill/node_modules/underscore.string/package.json</p>
<p>
Dependency Hierarchy:
- grunt-0.4.5.tgz (Root Library)
- :x: **underscore.string-2.2.1.tgz** (Vulnerable Library)
</details>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
Underscore.string, before 3.3.5, is vulnerable to Regular Expression Denial of Service (ReDoS).
<p>Publish Date: 2018-10-03
<p>URL: <a href=https://github.com/epeli/underscore.string/commit/f486cd684c94c12db48b45d52b1472a1b9661029>WS-2018-0232</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 2 Score Details (<b>5.0</b>)</summary>
<p>
Base Score Metrics not available</p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://www.npmjs.com/advisories/745">https://www.npmjs.com/advisories/745</a></p>
<p>Release Date: 2018-12-30</p>
<p>Fix Resolution: 3.3.5</p>
</p>
</details>
<p></p>
<!-- <REMEDIATE>{"isOpenPROnVulnerability":false,"isPackageBased":true,"isDefaultBranch":true,"packages":[{"packageType":"javascript/Node.js","packageName":"underscore.string","packageVersion":"2.3.3","packageFilePaths":["/MqttStreamUsingNanoscale/bower_components/promise-polyfill/package.json"],"isTransitiveDependency":true,"dependencyTree":"grunt:0.4.5;grunt-legacy-log:0.1.3;underscore.string:2.3.3","isMinimumFixVersionAvailable":true,"minimumFixVersion":"3.3.5"},{"packageType":"javascript/Node.js","packageName":"underscore.string","packageVersion":"2.4.0","packageFilePaths":["/MqttStreamUsingNanoscale/bower_components/promise-polyfill/package.json"],"isTransitiveDependency":true,"dependencyTree":"grunt:0.4.5;js-yaml:2.0.5;argparse:0.1.16;underscore.string:2.4.0","isMinimumFixVersionAvailable":true,"minimumFixVersion":"3.3.5"},{"packageType":"javascript/Node.js","packageName":"underscore.string","packageVersion":"3.0.3","packageFilePaths":["/MqttStreamUsingNanoscale/bower_components/webcomponentsjs/package.json"],"isTransitiveDependency":true,"dependencyTree":"web-component-tester:4.3.7;wd:0.3.12;underscore.string:3.0.3","isMinimumFixVersionAvailable":true,"minimumFixVersion":"3.3.5"},{"packageType":"javascript/Node.js","packageName":"underscore.string","packageVersion":"2.2.1","packageFilePaths":["/MqttStreamUsingNanoscale/bower_components/promise-polyfill/package.json"],"isTransitiveDependency":true,"dependencyTree":"grunt:0.4.5;underscore.string:2.2.1","isMinimumFixVersionAvailable":true,"minimumFixVersion":"3.3.5"}],"baseBranches":[],"vulnerabilityIdentifier":"WS-2018-0232","vulnerabilityDetails":"Underscore.string, before 3.3.5, is vulnerable to Regular Expression Denial of Service (ReDoS).","vulnerabilityUrl":"https://github.com/epeli/underscore.string/commit/f486cd684c94c12db48b45d52b1472a1b9661029","cvss2Severity":"medium","cvss2Score":"5.0","extraData":{}}</REMEDIATE> -->
|
True
|
WS-2018-0232 (Medium) detected in multiple libraries - autoclosed - ## WS-2018-0232 - Medium Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Libraries - <b>underscore.string-2.3.3.tgz</b>, <b>underscore.string-2.4.0.tgz</b>, <b>underscore.string-3.0.3.tgz</b>, <b>underscore.string-2.2.1.tgz</b></p></summary>
<p>
<details><summary><b>underscore.string-2.3.3.tgz</b></p></summary>
<p>String manipulation extensions for Underscore.js javascript library.</p>
<p>Library home page: <a href="https://registry.npmjs.org/underscore.string/-/underscore.string-2.3.3.tgz">https://registry.npmjs.org/underscore.string/-/underscore.string-2.3.3.tgz</a></p>
<p>Path to dependency file: /MqttStreamUsingNanoscale/bower_components/promise-polyfill/package.json</p>
<p>Path to vulnerable library: MqttStreamUsingNanoscale/bower_components/promise-polyfill/node_modules/grunt-legacy-log-utils/node_modules/underscore.string/package.json</p>
<p>
Dependency Hierarchy:
- grunt-0.4.5.tgz (Root Library)
- grunt-legacy-log-0.1.3.tgz
- :x: **underscore.string-2.3.3.tgz** (Vulnerable Library)
</details>
<details><summary><b>underscore.string-2.4.0.tgz</b></p></summary>
<p>String manipulation extensions for Underscore.js javascript library.</p>
<p>Library home page: <a href="https://registry.npmjs.org/underscore.string/-/underscore.string-2.4.0.tgz">https://registry.npmjs.org/underscore.string/-/underscore.string-2.4.0.tgz</a></p>
<p>Path to dependency file: /MqttStreamUsingNanoscale/bower_components/promise-polyfill/package.json</p>
<p>Path to vulnerable library: MqttStreamUsingNanoscale/bower_components/promise-polyfill/node_modules/argparse/node_modules/underscore.string/package.json</p>
<p>
Dependency Hierarchy:
- grunt-0.4.5.tgz (Root Library)
- js-yaml-2.0.5.tgz
- argparse-0.1.16.tgz
- :x: **underscore.string-2.4.0.tgz** (Vulnerable Library)
</details>
<details><summary><b>underscore.string-3.0.3.tgz</b></p></summary>
<p>String manipulation extensions for Underscore.js javascript library.</p>
<p>Library home page: <a href="https://registry.npmjs.org/underscore.string/-/underscore.string-3.0.3.tgz">https://registry.npmjs.org/underscore.string/-/underscore.string-3.0.3.tgz</a></p>
<p>Path to dependency file: /MqttStreamUsingNanoscale/bower_components/webcomponentsjs/package.json</p>
<p>Path to vulnerable library: MqttStreamUsingNanoscale/bower_components/webcomponentsjs/node_modules/underscore.string/package.json</p>
<p>
Dependency Hierarchy:
- web-component-tester-4.3.7.tgz (Root Library)
- wd-0.3.12.tgz
- :x: **underscore.string-3.0.3.tgz** (Vulnerable Library)
</details>
<details><summary><b>underscore.string-2.2.1.tgz</b></p></summary>
<p>String manipulation extensions for Underscore.js javascript library.</p>
<p>Library home page: <a href="https://registry.npmjs.org/underscore.string/-/underscore.string-2.2.1.tgz">https://registry.npmjs.org/underscore.string/-/underscore.string-2.2.1.tgz</a></p>
<p>Path to dependency file: /MqttStreamUsingNanoscale/bower_components/promise-polyfill/package.json</p>
<p>Path to vulnerable library: MqttStreamUsingNanoscale/bower_components/promise-polyfill/node_modules/underscore.string/package.json</p>
<p>
Dependency Hierarchy:
- grunt-0.4.5.tgz (Root Library)
- :x: **underscore.string-2.2.1.tgz** (Vulnerable Library)
</details>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
Underscore.string, before 3.3.5, is vulnerable to Regular Expression Denial of Service (ReDoS).
<p>Publish Date: 2018-10-03
<p>URL: <a href=https://github.com/epeli/underscore.string/commit/f486cd684c94c12db48b45d52b1472a1b9661029>WS-2018-0232</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 2 Score Details (<b>5.0</b>)</summary>
<p>
Base Score Metrics not available</p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://www.npmjs.com/advisories/745">https://www.npmjs.com/advisories/745</a></p>
<p>Release Date: 2018-12-30</p>
<p>Fix Resolution: 3.3.5</p>
</p>
</details>
<p></p>
<!-- <REMEDIATE>{"isOpenPROnVulnerability":false,"isPackageBased":true,"isDefaultBranch":true,"packages":[{"packageType":"javascript/Node.js","packageName":"underscore.string","packageVersion":"2.3.3","packageFilePaths":["/MqttStreamUsingNanoscale/bower_components/promise-polyfill/package.json"],"isTransitiveDependency":true,"dependencyTree":"grunt:0.4.5;grunt-legacy-log:0.1.3;underscore.string:2.3.3","isMinimumFixVersionAvailable":true,"minimumFixVersion":"3.3.5"},{"packageType":"javascript/Node.js","packageName":"underscore.string","packageVersion":"2.4.0","packageFilePaths":["/MqttStreamUsingNanoscale/bower_components/promise-polyfill/package.json"],"isTransitiveDependency":true,"dependencyTree":"grunt:0.4.5;js-yaml:2.0.5;argparse:0.1.16;underscore.string:2.4.0","isMinimumFixVersionAvailable":true,"minimumFixVersion":"3.3.5"},{"packageType":"javascript/Node.js","packageName":"underscore.string","packageVersion":"3.0.3","packageFilePaths":["/MqttStreamUsingNanoscale/bower_components/webcomponentsjs/package.json"],"isTransitiveDependency":true,"dependencyTree":"web-component-tester:4.3.7;wd:0.3.12;underscore.string:3.0.3","isMinimumFixVersionAvailable":true,"minimumFixVersion":"3.3.5"},{"packageType":"javascript/Node.js","packageName":"underscore.string","packageVersion":"2.2.1","packageFilePaths":["/MqttStreamUsingNanoscale/bower_components/promise-polyfill/package.json"],"isTransitiveDependency":true,"dependencyTree":"grunt:0.4.5;underscore.string:2.2.1","isMinimumFixVersionAvailable":true,"minimumFixVersion":"3.3.5"}],"baseBranches":[],"vulnerabilityIdentifier":"WS-2018-0232","vulnerabilityDetails":"Underscore.string, before 3.3.5, is vulnerable to Regular Expression Denial of Service (ReDoS).","vulnerabilityUrl":"https://github.com/epeli/underscore.string/commit/f486cd684c94c12db48b45d52b1472a1b9661029","cvss2Severity":"medium","cvss2Score":"5.0","extraData":{}}</REMEDIATE> -->
|
non_process
|
ws medium detected in multiple libraries autoclosed ws medium severity vulnerability vulnerable libraries underscore string tgz underscore string tgz underscore string tgz underscore string tgz underscore string tgz string manipulation extensions for underscore js javascript library library home page a href path to dependency file mqttstreamusingnanoscale bower components promise polyfill package json path to vulnerable library mqttstreamusingnanoscale bower components promise polyfill node modules grunt legacy log utils node modules underscore string package json dependency hierarchy grunt tgz root library grunt legacy log tgz x underscore string tgz vulnerable library underscore string tgz string manipulation extensions for underscore js javascript library library home page a href path to dependency file mqttstreamusingnanoscale bower components promise polyfill package json path to vulnerable library mqttstreamusingnanoscale bower components promise polyfill node modules argparse node modules underscore string package json dependency hierarchy grunt tgz root library js yaml tgz argparse tgz x underscore string tgz vulnerable library underscore string tgz string manipulation extensions for underscore js javascript library library home page a href path to dependency file mqttstreamusingnanoscale bower components webcomponentsjs package json path to vulnerable library mqttstreamusingnanoscale bower components webcomponentsjs node modules underscore string package json dependency hierarchy web component tester tgz root library wd tgz x underscore string tgz vulnerable library underscore string tgz string manipulation extensions for underscore js javascript library library home page a href path to dependency file mqttstreamusingnanoscale bower components promise polyfill package json path to vulnerable library mqttstreamusingnanoscale bower components promise polyfill node modules underscore string package json dependency hierarchy grunt tgz root library x underscore string tgz vulnerable library vulnerability details underscore string before is vulnerable to regular expression denial of service redos publish date url a href cvss score details base score metrics not available suggested fix type upgrade version origin a href release date fix resolution isopenpronvulnerability false ispackagebased true isdefaultbranch true packages istransitivedependency true dependencytree grunt grunt legacy log underscore string isminimumfixversionavailable true minimumfixversion packagetype javascript node js packagename underscore string packageversion packagefilepaths istransitivedependency true dependencytree grunt js yaml argparse underscore string isminimumfixversionavailable true minimumfixversion packagetype javascript node js packagename underscore string packageversion packagefilepaths istransitivedependency true dependencytree web component tester wd underscore string isminimumfixversionavailable true minimumfixversion packagetype javascript node js packagename underscore string packageversion packagefilepaths istransitivedependency true dependencytree grunt underscore string isminimumfixversionavailable true minimumfixversion basebranches vulnerabilityidentifier ws vulnerabilitydetails underscore string before is vulnerable to regular expression denial of service redos vulnerabilityurl
| 0
|
16,623
| 21,678,126,921
|
IssuesEvent
|
2022-05-09 01:23:14
|
lynnandtonic/nestflix.fun
|
https://api.github.com/repos/lynnandtonic/nestflix.fun
|
closed
|
Add Terminal Beauty 3
|
suggested title in process
|
Please add as much of the following info as you can:
Title: Terminal Beauty 3
Type (film/tv show): film - action/detective
Film or show in which it appears: The Boys
Is the parent film/show streaming anywhere? Yes - Amazon Prime
About when in the parent film/show does it appear? Ep. 1x05 - "Good for the Soul." You can see the scenes from 24:10 - 24:33, 24:37 - 24:39 (though seen through a TV screen).
Actual footage of the film/show can be seen (yes/no)? Yes
Cast: Popclaw, Billy Zane
DVD back cover description: 'Popclaw shines as a razor-sharp private detective, unraveling a web of lies in the biggest case of her life. Billy Zane plays a smooth-talking con man with too many secrets and a dark, mysterious past. When these two find themselves caught up in a dangerous game of cat-and-mouse, all rules go out the window...'
|
1.0
|
Add Terminal Beauty 3 - Please add as much of the following info as you can:
Title: Terminal Beauty 3
Type (film/tv show): film - action/detective
Film or show in which it appears: The Boys
Is the parent film/show streaming anywhere? Yes - Amazon Prime
About when in the parent film/show does it appear? Ep. 1x05 - "Good for the Soul." You can see the scenes from 24:10 - 24:33, 24:37 - 24:39 (though seen through a TV screen).
Actual footage of the film/show can be seen (yes/no)? Yes
Cast: Popclaw, Billy Zane
DVD back cover description: 'Popclaw shines as a razor-sharp private detective, unraveling a web of lies in the biggest case of her life. Billy Zane plays a smooth-talking con man with too many secrets and a dark, mysterious past. When these two find themselves caught up in a dangerous game of cat-and-mouse, all rules go out the window...'
|
process
|
add terminal beauty please add as much of the following info as you can title terminal beauty type film tv show film action detective film or show in which it appears the boys is the parent film show streaming anywhere yes amazon prime about when in the parent film show does it appear ep good for the soul you can see the scenes from though seen through a tv screen actual footage of the film show can be seen yes no yes cast popclaw billy zane dvd back cover description popclaw shines as a razor sharp private detective unraveling a web of lies in the biggest case of her life billy zane plays a smooth talking con man with too many secrets and a dark mysterious past when these two find themselves caught up in a dangerous game of cat and mouse all rules go out the window
| 1
|
20,635
| 27,314,717,505
|
IssuesEvent
|
2023-02-24 14:52:38
|
geneontology/go-ontology
|
https://api.github.com/repos/geneontology/go-ontology
|
closed
|
NTR co-transcriptional polyadenylation dependent mRNA 3'-end processing (or relabel GO:0006378 mRNA polyadenylation)
|
New term request RNA processes ready
|
Starting to address
https://github.com/geneontology/go-ontology/issues/24161
39+ terms !
We have LOTs of terms to represent cleavage/ or polyadenylation scattered around the ontology, but we do not appear to have a single term to represent this pathway.
~We need an explicit term
co-transcriptional polyadenylation dependent mRNA 3'-end processing
parent GO:0031124 mRNA 3'-end processing
Def
The transcription termination-coupled cotranscriptional 3’ processing of RNA polymerase II transcripts, involves 3′ end cleavage of nascent mRNAs and addition of the poly(A) tail.
This pathway is required for the maturation of primary protein-encoding transcripts into functional mRNAs that can be exported from the nucleus and translated in the cytoplasm, but is also required for polyadenylation-dependent decay.
synonym "cotranscriptional 3’ processing of RNA polymerase II transcripts" broad~
see below for final term
Note that we do have already equivalent terms for
GO:0071051 polyadenylation-dependent snoRNA 3'-end processing
( I have an objection to this axis of classification because it seems to be the same pathway, but we can park this for now)
At least if we have this term we can remove a bunch of the "polyadenylation" and"cleavage" terms.
~We could relabel
GO:0006378 mRNA polyadenylation
The enzymatic addition of a sequence of 40-200 adenylyl residues at the 3' end of a eukaryotic mRNA primary transcript.
which means pretty much the same thing (the renaming/revised definition would just make the purpose clearer because it includes the cleavage step)~
see below
This (GO:0006378) would otherwise need to be obsoeted in the clean up.
@pgaudet what do you think? I prefer to rename and it doesn't really change the meaning. It gives people a clear place to house the cleavage step in process.
|
1.0
|
NTR co-transcriptional polyadenylation dependent mRNA 3'-end processing (or relabel GO:0006378 mRNA polyadenylation) - Starting to address
https://github.com/geneontology/go-ontology/issues/24161
39+ terms !
We have LOTs of terms to represent cleavage/ or polyadenylation scattered around the ontology, but we do not appear to have a single term to represent this pathway.
~We need an explicit term
co-transcriptional polyadenylation dependent mRNA 3'-end processing
parent GO:0031124 mRNA 3'-end processing
Def
The transcription termination-coupled cotranscriptional 3’ processing of RNA polymerase II transcripts, involves 3′ end cleavage of nascent mRNAs and addition of the poly(A) tail.
This pathway is required for the maturation of primary protein-encoding transcripts into functional mRNAs that can be exported from the nucleus and translated in the cytoplasm, but is also required for polyadenylation-dependent decay.
synonym "cotranscriptional 3’ processing of RNA polymerase II transcripts" broad~
see below for final term
Note that we do have already equivalent terms for
GO:0071051 polyadenylation-dependent snoRNA 3'-end processing
( I have an objection to this axis of classification because it seems to be the same pathway, but we can park this for now)
At least if we have this term we can remove a bunch of the "polyadenylation" and"cleavage" terms.
~We could relabel
GO:0006378 mRNA polyadenylation
The enzymatic addition of a sequence of 40-200 adenylyl residues at the 3' end of a eukaryotic mRNA primary transcript.
which means pretty much the same thing (the renaming/revised definition would just make the purpose clearer because it includes the cleavage step)~
see below
This (GO:0006378) would otherwise need to be obsoeted in the clean up.
@pgaudet what do you think? I prefer to rename and it doesn't really change the meaning. It gives people a clear place to house the cleavage step in process.
|
process
|
ntr co transcriptional polyadenylation dependent mrna end processing or relabel go mrna polyadenylation starting to address terms we have lots of terms to represent cleavage or polyadenylation scattered around the ontology but we do not appear to have a single term to represent this pathway we need an explicit term co transcriptional polyadenylation dependent mrna end processing parent go mrna end processing def the transcription termination coupled cotranscriptional ’ processing of rna polymerase ii transcripts involves ′ end cleavage of nascent mrnas and addition of the poly a tail this pathway is required for the maturation of primary protein encoding transcripts into functional mrnas that can be exported from the nucleus and translated in the cytoplasm but is also required for polyadenylation dependent decay synonym cotranscriptional ’ processing of rna polymerase ii transcripts broad see below for final term note that we do have already equivalent terms for go polyadenylation dependent snorna end processing i have an objection to this axis of classification because it seems to be the same pathway but we can park this for now at least if we have this term we can remove a bunch of the polyadenylation and cleavage terms we could relabel go mrna polyadenylation the enzymatic addition of a sequence of adenylyl residues at the end of a eukaryotic mrna primary transcript which means pretty much the same thing the renaming revised definition would just make the purpose clearer because it includes the cleavage step see below this go would otherwise need to be obsoeted in the clean up pgaudet what do you think i prefer to rename and it doesn t really change the meaning it gives people a clear place to house the cleavage step in process
| 1
|
3,139
| 6,192,553,363
|
IssuesEvent
|
2017-07-05 02:29:17
|
rubberduck-vba/Rubberduck
|
https://api.github.com/repos/rubberduck-vba/Rubberduck
|
closed
|
Parser & Resolver errors with VBA project using vbWatchDog
|
bug parse-tree-preprocessing
|
If a VBA project contains modules from vbWatchDog ( http://www.everythingaccess.com/vbwatchdog.asp ), parser will fail to parse ErrEx_Helper which contains empty stubs for internal uses.
Even if I hand-edit the module to use line breaks instead of colon for breaking statements, the parse phase will succeed but it will fail with resolver error (and won't go into details why it failed.
|
1.0
|
Parser & Resolver errors with VBA project using vbWatchDog - If a VBA project contains modules from vbWatchDog ( http://www.everythingaccess.com/vbwatchdog.asp ), parser will fail to parse ErrEx_Helper which contains empty stubs for internal uses.
Even if I hand-edit the module to use line breaks instead of colon for breaking statements, the parse phase will succeed but it will fail with resolver error (and won't go into details why it failed.
|
process
|
parser resolver errors with vba project using vbwatchdog if a vba project contains modules from vbwatchdog parser will fail to parse errex helper which contains empty stubs for internal uses even if i hand edit the module to use line breaks instead of colon for breaking statements the parse phase will succeed but it will fail with resolver error and won t go into details why it failed
| 1
|
80,805
| 15,585,966,726
|
IssuesEvent
|
2021-03-18 00:52:03
|
anonymous-032021/nfl-rushing
|
https://api.github.com/repos/anonymous-032021/nfl-rushing
|
opened
|
Force type of records on ES
|
code enhancement
|
Currently the stupid simple script coming out of #2 doesn't do any type forcing on the records.
This means that if records have different types, but can be coerced ( long vs float for example ) a mapping issue could occur.
This issue should be closed when each record is coerced in the `.map` of that waterfall to enforce the exact type that is valid.
|
1.0
|
Force type of records on ES - Currently the stupid simple script coming out of #2 doesn't do any type forcing on the records.
This means that if records have different types, but can be coerced ( long vs float for example ) a mapping issue could occur.
This issue should be closed when each record is coerced in the `.map` of that waterfall to enforce the exact type that is valid.
|
non_process
|
force type of records on es currently the stupid simple script coming out of doesn t do any type forcing on the records this means that if records have different types but can be coerced long vs float for example a mapping issue could occur this issue should be closed when each record is coerced in the map of that waterfall to enforce the exact type that is valid
| 0
|
6,102
| 8,961,283,020
|
IssuesEvent
|
2019-01-28 09:13:52
|
ec-europa/europa-component-library
|
https://api.github.com/repos/ec-europa/europa-component-library
|
opened
|
[RFC] Radio button
|
RFC process: WIP
|
# Component: Radio button
## Design
Design Specs: https://webgate.ec.europa.eu/CITnet/confluence/pages/viewpage.action?spaceKey=NEXTEUROPA&title=Radio+buttons+-+Design+specifications
The specs only show 1 radio button with its associated label. IMHO the radio button only makes sense if it is part of a group, and this group has a label/heading itself.
## ECL Initial Implementation Proposal (EIIP)
...
## Sources of inspiration
- https://atlaskit.atlassian.com/packages/core/radio
- https://www.carbondesignsystem.com/components/radio-button/code
- https://getbootstrap.com/docs/4.2/components/forms/#checkboxes-and-radios
- https://www.lightningdesignsystem.com/components/radio-group/
- https://materializecss.com/radio-buttons.html
- https://material-components.github.io/material-components-web-catalog/#/component/radio
- https://styleguide.github.com/primer/components/forms/#checkboxes-and-radios
- https://baseui.design/components/radio/
- https://polaris.shopify.com/components/forms/radio-button
- https://ant.design/components/radio/
- https://developer.microsoft.com/en-us/fabric#/components/choicegroup
- https://clarity.design/documentation/radio
|
1.0
|
[RFC] Radio button - # Component: Radio button
## Design
Design Specs: https://webgate.ec.europa.eu/CITnet/confluence/pages/viewpage.action?spaceKey=NEXTEUROPA&title=Radio+buttons+-+Design+specifications
The specs only show 1 radio button with its associated label. IMHO the radio button only makes sense if it is part of a group, and this group has a label/heading itself.
## ECL Initial Implementation Proposal (EIIP)
...
## Sources of inspiration
- https://atlaskit.atlassian.com/packages/core/radio
- https://www.carbondesignsystem.com/components/radio-button/code
- https://getbootstrap.com/docs/4.2/components/forms/#checkboxes-and-radios
- https://www.lightningdesignsystem.com/components/radio-group/
- https://materializecss.com/radio-buttons.html
- https://material-components.github.io/material-components-web-catalog/#/component/radio
- https://styleguide.github.com/primer/components/forms/#checkboxes-and-radios
- https://baseui.design/components/radio/
- https://polaris.shopify.com/components/forms/radio-button
- https://ant.design/components/radio/
- https://developer.microsoft.com/en-us/fabric#/components/choicegroup
- https://clarity.design/documentation/radio
|
process
|
radio button component radio button design design specs the specs only show radio button with its associated label imho the radio button only makes sense if it is part of a group and this group has a label heading itself ecl initial implementation proposal eiip sources of inspiration
| 1
|
3,846
| 6,808,538,584
|
IssuesEvent
|
2017-11-04 04:16:42
|
Great-Hill-Corporation/quickBlocks
|
https://api.github.com/repos/Great-Hill-Corporation/quickBlocks
|
reopened
|
Bad params are ignored with --file:cmd_file command line
|
status-inprocess tools-all type-bug
|
this test case should fail but does not because if one has a --file: option, the rest of the options (other than the built-ins) are ignored
73 #run_test("getTrans_by_file" "--bad --file:cmd_file")
|
1.0
|
Bad params are ignored with --file:cmd_file command line - this test case should fail but does not because if one has a --file: option, the rest of the options (other than the built-ins) are ignored
73 #run_test("getTrans_by_file" "--bad --file:cmd_file")
|
process
|
bad params are ignored with file cmd file command line this test case should fail but does not because if one has a file option the rest of the options other than the built ins are ignored run test gettrans by file bad file cmd file
| 1
|
19,455
| 10,432,890,140
|
IssuesEvent
|
2019-09-17 12:22:29
|
hyrise-mp/hyrise
|
https://api.github.com/repos/hyrise-mp/hyrise
|
closed
|
join predicate ordering rule
|
:zap: Performance
|
- [x] join predicate ordering rule
- check performance improvements for
- [x] tpch
- [x] join ordering benchmark
- [x] tpcds
- [x] tests
|
True
|
join predicate ordering rule - - [x] join predicate ordering rule
- check performance improvements for
- [x] tpch
- [x] join ordering benchmark
- [x] tpcds
- [x] tests
|
non_process
|
join predicate ordering rule join predicate ordering rule check performance improvements for tpch join ordering benchmark tpcds tests
| 0
|
2,906
| 5,889,722,880
|
IssuesEvent
|
2017-05-17 13:33:28
|
AllenFang/react-bootstrap-table
|
https://api.github.com/repos/AllenFang/react-bootstrap-table
|
closed
|
selectRow className prop. should support function types
|
help wanted inprocess
|
bgColor property of selectRow accepts a function type as well as string type.
If our requirements are much complex, it's more likely we need different classes than different background colors.
In my case, I needed selectRows to have different borders under different conditions.
So I'd request that className property also accepts a function type.
```
const selectRow = {
mode: 'radio',
className: function(row, isSelect){ ... },
}
```
[bgColor docs for reference](http://allenfang.github.io/react-bootstrap-table/docs.html#bgColor)
|
1.0
|
selectRow className prop. should support function types - bgColor property of selectRow accepts a function type as well as string type.
If our requirements are much complex, it's more likely we need different classes than different background colors.
In my case, I needed selectRows to have different borders under different conditions.
So I'd request that className property also accepts a function type.
```
const selectRow = {
mode: 'radio',
className: function(row, isSelect){ ... },
}
```
[bgColor docs for reference](http://allenfang.github.io/react-bootstrap-table/docs.html#bgColor)
|
process
|
selectrow classname prop should support function types bgcolor property of selectrow accepts a function type as well as string type if our requirements are much complex it s more likely we need different classes than different background colors in my case i needed selectrows to have different borders under different conditions so i d request that classname property also accepts a function type const selectrow mode radio classname function row isselect
| 1
|
18,908
| 24,847,005,116
|
IssuesEvent
|
2022-10-26 16:39:27
|
rladstaetter/LogoRRR
|
https://api.github.com/repos/rladstaetter/LogoRRR
|
opened
|
Update upstream dependencies and compilers
|
release process
|
LogoRRR uses a rich ecosystem which changes very quickly. In each release there are updates to various 3rd party libraries or build infrastructure which should be reflected in this issue.
|
1.0
|
Update upstream dependencies and compilers - LogoRRR uses a rich ecosystem which changes very quickly. In each release there are updates to various 3rd party libraries or build infrastructure which should be reflected in this issue.
|
process
|
update upstream dependencies and compilers logorrr uses a rich ecosystem which changes very quickly in each release there are updates to various party libraries or build infrastructure which should be reflected in this issue
| 1
|
7,109
| 10,264,293,782
|
IssuesEvent
|
2019-08-22 16:02:02
|
zooniverse/theia
|
https://api.github.com/repos/zooniverse/theia
|
opened
|
reject images that contain all water or no water
|
image processing panoptes integration
|
floating forests only cares about coastline images. detecting coastlines is hard, but maybe the answer lies in a) pixel_qa data that estimates water vapor or b) a simple neural net to look at the rgb histograms from #16
images with no water or images that are only water shouldn't be uploaded
|
1.0
|
reject images that contain all water or no water - floating forests only cares about coastline images. detecting coastlines is hard, but maybe the answer lies in a) pixel_qa data that estimates water vapor or b) a simple neural net to look at the rgb histograms from #16
images with no water or images that are only water shouldn't be uploaded
|
process
|
reject images that contain all water or no water floating forests only cares about coastline images detecting coastlines is hard but maybe the answer lies in a pixel qa data that estimates water vapor or b a simple neural net to look at the rgb histograms from images with no water or images that are only water shouldn t be uploaded
| 1
|
14,406
| 9,302,598,726
|
IssuesEvent
|
2019-03-24 11:06:46
|
daostack/arc
|
https://api.github.com/repos/daostack/arc
|
closed
|
VestingScheme : potential vulnerability
|
security
|
Avatar and Controller contracts are incorrectly expected to follow native behaviour when nativeToken() and mintTokens() is called.
|
True
|
VestingScheme : potential vulnerability - Avatar and Controller contracts are incorrectly expected to follow native behaviour when nativeToken() and mintTokens() is called.
|
non_process
|
vestingscheme potential vulnerability avatar and controller contracts are incorrectly expected to follow native behaviour when nativetoken and minttokens is called
| 0
|
32,559
| 4,776,333,978
|
IssuesEvent
|
2016-10-27 13:29:36
|
scieloorg/search-journals
|
https://api.github.com/repos/scieloorg/search-journals
|
closed
|
Quadro informativo para título do periódico
|
Discussão Melhorias OK para Testes
|
Nos resultados de busca, ao passar o mouse em cima (?) do título, exibir dados resumidos do periódico:
título, ISSNs, Publicador, Google Metrics (H5,M5), Scopus (IPP, SNIP, JCR), JCR (IF-2,IF-3)
Definir melhor esse comportamento.
|
1.0
|
Quadro informativo para título do periódico - Nos resultados de busca, ao passar o mouse em cima (?) do título, exibir dados resumidos do periódico:
título, ISSNs, Publicador, Google Metrics (H5,M5), Scopus (IPP, SNIP, JCR), JCR (IF-2,IF-3)
Definir melhor esse comportamento.
|
non_process
|
quadro informativo para título do periódico nos resultados de busca ao passar o mouse em cima do título exibir dados resumidos do periódico título issns publicador google metrics scopus ipp snip jcr jcr if if definir melhor esse comportamento
| 0
|
179,711
| 6,627,954,526
|
IssuesEvent
|
2017-09-23 11:28:38
|
openshift/origin
|
https://api.github.com/repos/openshift/origin
|
closed
|
`oc adm policy` subcommands don't have `-o` or `--dry-run`
|
area/usability component/cli kind/bug priority/P2
|
Same motivation as #14807 -- we should let people use the client to create their YAML.
/cc @deads2k @juanvallejo @fabianofranz
|
1.0
|
`oc adm policy` subcommands don't have `-o` or `--dry-run` - Same motivation as #14807 -- we should let people use the client to create their YAML.
/cc @deads2k @juanvallejo @fabianofranz
|
non_process
|
oc adm policy subcommands don t have o or dry run same motivation as we should let people use the client to create their yaml cc juanvallejo fabianofranz
| 0
|
11,723
| 14,563,269,588
|
IssuesEvent
|
2020-12-17 02:05:50
|
allinurl/goaccess
|
https://api.github.com/repos/allinurl/goaccess
|
closed
|
ELB logs problem with empty status code -
|
log-processing log/date/time format
|
Hi.
I have a problem with ELB log.
Until today, I could read ELB log, today log I have this error:
_Token '-' doesn't match specifier '%s'_
This is my logs:
```
2017-11-23T01:24:08.529348Z ELB-DOMAIN 66.249.66.206:34140 192.168.1.48:443 0.000744 0.000009 0.000012 - - 710 31845 "- - - " "-" - -
2017-11-23T01:25:15.198480Z ELB-DOMAIN 66.249.66.202:57383 192.168.1.48:80 0.00004 0.000869 0.000021 301 301 0 0 "GET http://www.DOMAIN.it:80/es/ropa/pants/ela038059712247007.html?Country=ES HTTP/1.1" "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)" - -
2017-11-23T01:22:14.697402Z ELB-DOMAIN 37.10.149.133:23315 192.168.1.48:443 0.000783 0.000009 0.000014 - - 22764 169121 "- - - " "-" - -
```
I use:
Log Format: %dT%t.%^ %^ %h:%^ %^ %T %^ %^ %^ %s %^ %b "%r" "%u"
Date Format %Y-%m-%d
Time Format %H:%M:%S
Thanks.
Regards.
|
1.0
|
ELB logs problem with empty status code - - Hi.
I have a problem with ELB log.
Until today, I could read ELB log, today log I have this error:
_Token '-' doesn't match specifier '%s'_
This is my logs:
```
2017-11-23T01:24:08.529348Z ELB-DOMAIN 66.249.66.206:34140 192.168.1.48:443 0.000744 0.000009 0.000012 - - 710 31845 "- - - " "-" - -
2017-11-23T01:25:15.198480Z ELB-DOMAIN 66.249.66.202:57383 192.168.1.48:80 0.00004 0.000869 0.000021 301 301 0 0 "GET http://www.DOMAIN.it:80/es/ropa/pants/ela038059712247007.html?Country=ES HTTP/1.1" "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)" - -
2017-11-23T01:22:14.697402Z ELB-DOMAIN 37.10.149.133:23315 192.168.1.48:443 0.000783 0.000009 0.000014 - - 22764 169121 "- - - " "-" - -
```
I use:
Log Format: %dT%t.%^ %^ %h:%^ %^ %T %^ %^ %^ %s %^ %b "%r" "%u"
Date Format %Y-%m-%d
Time Format %H:%M:%S
Thanks.
Regards.
|
process
|
elb logs problem with empty status code hi i have a problem with elb log until today i could read elb log today log i have this error token doesn t match specifier s this is my logs elb domain elb domain get http mozilla compatible googlebot elb domain i use log format dt t h t s b r u date format y m d time format h m s thanks regards
| 1
|
21,989
| 30,484,295,139
|
IssuesEvent
|
2023-07-17 23:48:35
|
h4sh5/pypi-auto-scanner
|
https://api.github.com/repos/h4sh5/pypi-auto-scanner
|
opened
|
skypilot 0.3.3 has 2 GuardDog issues
|
guarddog exec-base64 silent-process-execution
|
https://pypi.org/project/skypilot
https://inspector.pypi.io/project/skypilot
```{
"dependency": "skypilot",
"version": "0.3.3",
"result": {
"issues": 2,
"errors": {},
"results": {
"silent-process-execution": [
{
"location": "skypilot-0.3.3/sky/skylet/log_lib.py:219",
"code": " subprocess.Popen(\n daemon_cmd,\n start_new_session=True,\n # Suppress output\n stdout=subprocess.DEVNULL,\n stderr=subprocess.DEVNULL,\n # Disa... )",
"message": "This package is silently executing an external binary, redirecting stdout, stderr and stdin to /dev/null"
}
],
"exec-base64": [
{
"location": "skypilot-0.3.3/sky/cloud_stores.py:110",
"code": " p = subprocess.run(command,\n stdout=subprocess.PIPE,\n shell=True,\n check=True,\n executable='/bin/bash')",
"message": "This package contains a call to the `eval` function with a `base64` encoded string as argument.\nThis is a common method used to hide a malicious payload in a module as static analysis will not decode the\nstring.\n"
}
]
},
"path": "/tmp/tmposz4dsa8/skypilot"
}
}```
|
1.0
|
skypilot 0.3.3 has 2 GuardDog issues - https://pypi.org/project/skypilot
https://inspector.pypi.io/project/skypilot
```{
"dependency": "skypilot",
"version": "0.3.3",
"result": {
"issues": 2,
"errors": {},
"results": {
"silent-process-execution": [
{
"location": "skypilot-0.3.3/sky/skylet/log_lib.py:219",
"code": " subprocess.Popen(\n daemon_cmd,\n start_new_session=True,\n # Suppress output\n stdout=subprocess.DEVNULL,\n stderr=subprocess.DEVNULL,\n # Disa... )",
"message": "This package is silently executing an external binary, redirecting stdout, stderr and stdin to /dev/null"
}
],
"exec-base64": [
{
"location": "skypilot-0.3.3/sky/cloud_stores.py:110",
"code": " p = subprocess.run(command,\n stdout=subprocess.PIPE,\n shell=True,\n check=True,\n executable='/bin/bash')",
"message": "This package contains a call to the `eval` function with a `base64` encoded string as argument.\nThis is a common method used to hide a malicious payload in a module as static analysis will not decode the\nstring.\n"
}
]
},
"path": "/tmp/tmposz4dsa8/skypilot"
}
}```
|
process
|
skypilot has guarddog issues dependency skypilot version result issues errors results silent process execution location skypilot sky skylet log lib py code subprocess popen n daemon cmd n start new session true n suppress output n stdout subprocess devnull n stderr subprocess devnull n disa message this package is silently executing an external binary redirecting stdout stderr and stdin to dev null exec location skypilot sky cloud stores py code p subprocess run command n stdout subprocess pipe n shell true n check true n executable bin bash message this package contains a call to the eval function with a encoded string as argument nthis is a common method used to hide a malicious payload in a module as static analysis will not decode the nstring n path tmp skypilot
| 1
|
6,870
| 10,000,436,445
|
IssuesEvent
|
2019-07-12 13:24:06
|
DataDog/integrations-core
|
https://api.github.com/repos/DataDog/integrations-core
|
closed
|
process check thresholds require ceiling
|
integration/process kind/feature-request
|
as per the [example config](https://github.com/DataDog/integrations-core/blob/master/process/datadog_checks/process/data/conf.yaml.example), the process check allows configuring thresholds based on a min/max value:
```
thresholds: (optional) Two ranges: critical and warning
warning: (optional) List of two values: If the number of processes found is below the first value
or above the second one, the process check will return WARNING.
critical: (optional) List of two values: If the number of processes found is below the first value
or above the second one, the process check will return CRITICAL.
```
I do not wish to monitor on a ceiling of processes. e.g. I do not care if the process count reaches a higher number, I only care if it reaches a lower number (like 0).
|
1.0
|
process check thresholds require ceiling - as per the [example config](https://github.com/DataDog/integrations-core/blob/master/process/datadog_checks/process/data/conf.yaml.example), the process check allows configuring thresholds based on a min/max value:
```
thresholds: (optional) Two ranges: critical and warning
warning: (optional) List of two values: If the number of processes found is below the first value
or above the second one, the process check will return WARNING.
critical: (optional) List of two values: If the number of processes found is below the first value
or above the second one, the process check will return CRITICAL.
```
I do not wish to monitor on a ceiling of processes. e.g. I do not care if the process count reaches a higher number, I only care if it reaches a lower number (like 0).
|
process
|
process check thresholds require ceiling as per the the process check allows configuring thresholds based on a min max value thresholds optional two ranges critical and warning warning optional list of two values if the number of processes found is below the first value or above the second one the process check will return warning critical optional list of two values if the number of processes found is below the first value or above the second one the process check will return critical i do not wish to monitor on a ceiling of processes e g i do not care if the process count reaches a higher number i only care if it reaches a lower number like
| 1
|
3,193
| 6,260,618,707
|
IssuesEvent
|
2017-07-14 21:07:36
|
nodejs/node
|
https://api.github.com/repos/nodejs/node
|
closed
|
doc: process.umask doc unclear
|
doc process question
|
<!--
Thank you for reporting an issue.
This issue tracker is for bugs and issues found within Node.js core.
If you require more general support please file an issue on our help
repo. https://github.com/nodejs/help
Please fill in as much of the template below as you're able.
Version: output of `node -v`
Platform: output of `uname -a` (UNIX), or version and 32 or 64-bit (Windows)
Subsystem: if known, please specify affected core module name
If possible, please provide code that demonstrates the problem, keeping it as
simple and free of external dependencies as you are able.
-->
* **Subsystem**: doc
<!-- Enter your issue details below this comment. -->
https://github.com/nodejs/node/blob/199ad1d73f81c1d568232df418090e9ce3c4a7fb/doc/api/process.md#processumaskmask
The documentation for `process.umask` reads:
> The `process.umask()` method sets or returns the Node.js process's file mode creation mask. Child processes inherit the mask from the parent process. **The old mask is return if the mask argument is given, otherwise returns the current mask.**
I have emphasized the last sentence since that is the one that doesn't make any sense. Is it saying `var oldmask = process.umask()` or `var oldmask = process.umask(0)`?
|
1.0
|
doc: process.umask doc unclear - <!--
Thank you for reporting an issue.
This issue tracker is for bugs and issues found within Node.js core.
If you require more general support please file an issue on our help
repo. https://github.com/nodejs/help
Please fill in as much of the template below as you're able.
Version: output of `node -v`
Platform: output of `uname -a` (UNIX), or version and 32 or 64-bit (Windows)
Subsystem: if known, please specify affected core module name
If possible, please provide code that demonstrates the problem, keeping it as
simple and free of external dependencies as you are able.
-->
* **Subsystem**: doc
<!-- Enter your issue details below this comment. -->
https://github.com/nodejs/node/blob/199ad1d73f81c1d568232df418090e9ce3c4a7fb/doc/api/process.md#processumaskmask
The documentation for `process.umask` reads:
> The `process.umask()` method sets or returns the Node.js process's file mode creation mask. Child processes inherit the mask from the parent process. **The old mask is return if the mask argument is given, otherwise returns the current mask.**
I have emphasized the last sentence since that is the one that doesn't make any sense. Is it saying `var oldmask = process.umask()` or `var oldmask = process.umask(0)`?
|
process
|
doc process umask doc unclear thank you for reporting an issue this issue tracker is for bugs and issues found within node js core if you require more general support please file an issue on our help repo please fill in as much of the template below as you re able version output of node v platform output of uname a unix or version and or bit windows subsystem if known please specify affected core module name if possible please provide code that demonstrates the problem keeping it as simple and free of external dependencies as you are able subsystem doc the documentation for process umask reads the process umask method sets or returns the node js process s file mode creation mask child processes inherit the mask from the parent process the old mask is return if the mask argument is given otherwise returns the current mask i have emphasized the last sentence since that is the one that doesn t make any sense is it saying var oldmask process umask or var oldmask process umask
| 1
|
16,646
| 21,710,184,265
|
IssuesEvent
|
2022-05-10 13:18:14
|
prisma/prisma
|
https://api.github.com/repos/prisma/prisma
|
opened
|
Migrate errors with `The underlying table for model `_prisma_migrations` does not exist.` when using a non default PostgreSQL schema
|
bug/1-unconfirmed kind/bug process/candidate topic: schema team/schema topic: postgresql
|
### Bug description
Here the project is using a non default PostgreSQL schema `error-handling-prod` (i.e. not `public`)
`npx prisma migrate dev` errors with
```
Environment variables loaded from prisma/.env
Prisma schema loaded from prisma/schema.prisma
Datasource "db": PostgreSQL database "mydb123", schema "error-handling-prod" at "localhost:5432"
PostgreSQL database mydb123 created at localhost:5432
Applying migration `20220510114355_baseline`
Error: P1014
The underlying table for model `_prisma_migrations` does not exist.
```
Screenshots from TablePlus, it seems the `_prisma_migrations` was created in the `error-handling-prod` schema but Prisma Migrate tries to find it in the `public` schema where it's missing maybe?
<img width="760" alt="Screen Shot 2022-05-10 at 15 15 51" src="https://user-images.githubusercontent.com/1328733/167637261-f3575bcb-8e33-4268-a29c-f94684066e25.png">
<img width="759" alt="Screen Shot 2022-05-10 at 15 15 44" src="https://user-images.githubusercontent.com/1328733/167637275-d90798f7-e5e0-4fc7-b452-131109f40a0a.png">
### How to reproduce
Clone
`npx prisma migrate dev`
### Expected behavior
_No response_
### Prisma information
See reproduction:
### Environment & setup
- OS: macOS
- Database: PostgreSQL
- Node.js version: 16.10
### Prisma Version
```
3.14.0-dev.60
```
|
1.0
|
Migrate errors with `The underlying table for model `_prisma_migrations` does not exist.` when using a non default PostgreSQL schema - ### Bug description
Here the project is using a non default PostgreSQL schema `error-handling-prod` (i.e. not `public`)
`npx prisma migrate dev` errors with
```
Environment variables loaded from prisma/.env
Prisma schema loaded from prisma/schema.prisma
Datasource "db": PostgreSQL database "mydb123", schema "error-handling-prod" at "localhost:5432"
PostgreSQL database mydb123 created at localhost:5432
Applying migration `20220510114355_baseline`
Error: P1014
The underlying table for model `_prisma_migrations` does not exist.
```
Screenshots from TablePlus, it seems the `_prisma_migrations` was created in the `error-handling-prod` schema but Prisma Migrate tries to find it in the `public` schema where it's missing maybe?
<img width="760" alt="Screen Shot 2022-05-10 at 15 15 51" src="https://user-images.githubusercontent.com/1328733/167637261-f3575bcb-8e33-4268-a29c-f94684066e25.png">
<img width="759" alt="Screen Shot 2022-05-10 at 15 15 44" src="https://user-images.githubusercontent.com/1328733/167637275-d90798f7-e5e0-4fc7-b452-131109f40a0a.png">
### How to reproduce
Clone
`npx prisma migrate dev`
### Expected behavior
_No response_
### Prisma information
See reproduction:
### Environment & setup
- OS: macOS
- Database: PostgreSQL
- Node.js version: 16.10
### Prisma Version
```
3.14.0-dev.60
```
|
process
|
migrate errors with the underlying table for model prisma migrations does not exist when using a non default postgresql schema bug description here the project is using a non default postgresql schema error handling prod i e not public npx prisma migrate dev errors with environment variables loaded from prisma env prisma schema loaded from prisma schema prisma datasource db postgresql database schema error handling prod at localhost postgresql database created at localhost applying migration baseline error the underlying table for model prisma migrations does not exist screenshots from tableplus it seems the prisma migrations was created in the error handling prod schema but prisma migrate tries to find it in the public schema where it s missing maybe img width alt screen shot at src img width alt screen shot at src how to reproduce clone npx prisma migrate dev expected behavior no response prisma information see reproduction environment setup os macos database postgresql node js version prisma version dev
| 1
|