Column summary (dtype and value range or class count for each column):

| column | dtype | range / classes |
|---|---|---|
| Unnamed: 0 | int64 | 0 – 832k |
| id | float64 | 2.49B – 32.1B |
| type | string | 1 class |
| created_at | string | length 19 |
| repo | string | length 7 – 112 |
| repo_url | string | length 36 – 141 |
| action | string | 3 classes |
| title | string | length 1 – 744 |
| labels | string | length 4 – 574 |
| body | string | length 9 – 211k |
| index | string | 10 classes |
| text_combine | string | length 96 – 211k |
| label | string | 2 classes |
| text | string | length 96 – 188k |
| binary_label | int64 | 0 – 1 |

Sample rows:

| Unnamed: 0 | id | type | created_at | repo | repo_url | action | title | labels | body | index | text_combine | label | text | binary_label |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
2,866
| 5,825,167,222
|
IssuesEvent
|
2017-05-07 19:11:47
|
brucemiller/LaTeXML
|
https://api.github.com/repos/brucemiller/LaTeXML
|
closed
|
latexmlc segmentation fault.
|
bug postprocessing
|
I want to use LaTeXML to generate publication pages. You can see the current state of my efforts at http://github.com/KWARC/bibs I have a couple of problems here, of which the segmentation fault is just the most worrying.
In the repository I have four large bibliography files and the plan for generating publication files is
1. generate XML versions of all of these files via LaTeXML
2. generate a LaTeX file with `\nocite`s for all the relevant publications by an XSLT stylesheet
3. run LaTeXML on this file, done.
So far so simple (in theory). But I get errors I would like to have help with. To reproduce them, just clone the repository http://github.com/KWARC/bibs (sorry, I could not really whittle down the problems).
When I do `make kwarcpubs.bib.xml` (corresponding to step 1. in the plan above), I get 14 errors of the form
```
Error:unexpected:_ Script _ can only appear in math mode
Literal String \begin{bib#textrange(from=392;0,to=392;34)
In Core::Definition::Primitive[Subscript] from TeX.pool.ltxml line 3412
<= Core::Stomach[@0x7fcd366e4690] <= Core::Definition::Primitive[Begin] <= Core::Stomach[@0x7fcd366e4690] <= ...
```
only that the lines referenced in the `textrange`s do not have underscores in the BibTeX source. I am not sure what happens there, but the result looks OK visually. That does not really help me, though, since
when I do `make extpubs.bib.xml` I get a fatal error due to too many errors.
When I do `make mkohlhase-article.html`, I get
```
make mkohlhase-article.html
latexmlc --bibliography=kwarcpubs.bib.xml --bibliography=kwarccrossrefs.bib.xml --bibliography=extcrossrefs.bib.xml --format=html5 --destination=mkohlhase-article.html --log=mkohlhase-article.tex.ltxlog --css=bib.css mkohlhase-article.tex
No obvious problems
Wrote mkohlhase-article.html
make: *** [mkohlhase-article.html] Segmentation fault: 11
make: *** Deleting file `mkohlhase-article.html'
```
So there seems to be a problem in the postprocessing, but I cannot really track it down.
I would be grateful for any kind of fix.
|
1.0
|
latexmlc segmentation fault. - I want to use LaTeXML to generate publication pages. You can see the current state of my efforts at http://github.com/KWARC/bibs I have a couple of problems here, of which the segmentation fault is just the most worrying.
In the repository I have four large bibliography files and the plan for generating publication files is
1. generate XML versions of all of these files via LaTeXML
2. generate a LaTeX file with `\nocite`s for all the relevant publications by an XSLT stylesheet
3. run LaTeXML on this file, done.
So far so simple (in theory). But I get errors I would like to have help with. To reproduce them, just clone the repository http://github.com/KWARC/bibs (sorry, I could not really whittle down the problems).
When I do `make kwarcpubs.bib.xml` (corresponding to step 1. in the plan above), I get 14 errors of the form
```
Error:unexpected:_ Script _ can only appear in math mode
Literal String \begin{bib#textrange(from=392;0,to=392;34)
In Core::Definition::Primitive[Subscript] from TeX.pool.ltxml line 3412
<= Core::Stomach[@0x7fcd366e4690] <= Core::Definition::Primitive[Begin] <= Core::Stomach[@0x7fcd366e4690] <= ...
```
only that the lines referenced in the `textrange`s do not have underscores in the BibTeX source. I am not sure what happens there, but the result looks OK visually. That does not really help me, though, since
when I do `make extpubs.bib.xml` I get a fatal error due to too many errors.
When I do `make mkohlhase-article.html`, I get
```
make mkohlhase-article.html
latexmlc --bibliography=kwarcpubs.bib.xml --bibliography=kwarccrossrefs.bib.xml --bibliography=extcrossrefs.bib.xml --format=html5 --destination=mkohlhase-article.html --log=mkohlhase-article.tex.ltxlog --css=bib.css mkohlhase-article.tex
No obvious problems
Wrote mkohlhase-article.html
make: *** [mkohlhase-article.html] Segmentation fault: 11
make: *** Deleting file `mkohlhase-article.html'
```
So there seems to be a problem in the postprocessing, but I cannot really track it down.
I would be grateful for any kind of fix.
|
process
|
latexmlc segmentation fault i want to use latexml to generate publication pages you can see the current state of my efforts at i have a couple of problems here of which the segmentation fault is just the most worrying in the repository i have four large bibliography files and the plan for generating publication files is generate xml versions of all of these files via latexml generate a latex file with nocite s for all the relevant publications by an xslt stylesheet run latexml on this file done so far so simple in theory but i get errors i would like to have help with to reproduce them just clone the repository sorry could not really whittle down the problems when i do make kwarcpubs bib xml corresponding to step in the plan above i get errors of the form error unexpected script can only appear in math mode literal string begin bib textrange from to in core definition primitive from tex pool ltxml line core stomach core definition primitive core stomach only that the line in the textrange s do not have underscores in the bibtex source i am not sure what happens there but the result looks ok visually but that does not really help me since when i do make extpubs bib xml i get a fatal due to too many errors when i do make mkohlhase article html i get make mkohlhase article html latexmlc bibliography kwarcpubs bib xml bibliography kwarccrossrefs bib xml bibliography extcrossrefs bib xml format destination mkohlhase article html log mkohlhase article tex ltxlog css bib css mkohlhase article tex no obvious problems wrote mkohlhase article html make segmentation fault make deleting file mkohlhase article html so there seems to be a problem in the postprocessing but i cannot really track it down i would be grateful for any kind of fix
| 1
|
10,808
| 13,609,288,464
|
IssuesEvent
|
2020-09-23 04:50:15
|
googleapis/java-containeranalysis
|
https://api.github.com/repos/googleapis/java-containeranalysis
|
closed
|
Dependency Dashboard
|
api: containeranalysis type: process
|
This issue contains a list of Renovate updates and their statuses.
## Open
These updates have all been created already. Click a checkbox below to force a retry/rebase of any.
- [ ] <!-- rebase-branch=renovate/com.google.cloud-google-cloud-containeranalysis-1.x -->chore(deps): update dependency com.google.cloud:google-cloud-containeranalysis to v1.1.2
- [ ] <!-- rebase-branch=renovate/com.google.cloud-libraries-bom-10.x -->chore(deps): update dependency com.google.cloud:libraries-bom to v10
---
- [ ] <!-- manual job -->Check this box to trigger a request for Renovate to run again on this repository
|
1.0
|
Dependency Dashboard - This issue contains a list of Renovate updates and their statuses.
## Open
These updates have all been created already. Click a checkbox below to force a retry/rebase of any.
- [ ] <!-- rebase-branch=renovate/com.google.cloud-google-cloud-containeranalysis-1.x -->chore(deps): update dependency com.google.cloud:google-cloud-containeranalysis to v1.1.2
- [ ] <!-- rebase-branch=renovate/com.google.cloud-libraries-bom-10.x -->chore(deps): update dependency com.google.cloud:libraries-bom to v10
---
- [ ] <!-- manual job -->Check this box to trigger a request for Renovate to run again on this repository
|
process
|
dependency dashboard this issue contains a list of renovate updates and their statuses open these updates have all been created already click a checkbox below to force a retry rebase of any chore deps update dependency com google cloud google cloud containeranalysis to chore deps update dependency com google cloud libraries bom to check this box to trigger a request for renovate to run again on this repository
| 1
|
13,292
| 15,767,296,615
|
IssuesEvent
|
2021-03-31 15:56:09
|
googleapis/google-cloud-go
|
https://api.github.com/repos/googleapis/google-cloud-go
|
closed
|
Update release process around version.go
|
type: process
|
We need to come up with a new strategy for updating the version. Currently this makes sense for the root module but not the sub-modules. This is because the sub-modules depend on an older release of `cloud.google.com/go`.
|
1.0
|
Update release process around version.go - We need to come up with a new strategy for updating the version. Currently this makes sense for the root module but not the sub-modules. This is because the sub-modules depend on an older release of `cloud.google.com/go`.
|
process
|
update release process around version go we need to come up with a new strategy for updating the version currently this make sense for the root module but not the sub modules this is because the sub modules depend an older release of cloud google com go
| 1
|
85,594
| 24,632,829,599
|
IssuesEvent
|
2022-10-17 04:44:31
|
yyWeng/fa22-cse110-lab3
|
https://api.github.com/repos/yyWeng/fa22-cse110-lab3
|
closed
|
[C]Style meeting minutes
|
Building
|
# What is it: Use CSS to style meeting minutes
# Estimated time needed: 1h
|
1.0
|
[C]Style meeting minutes - # What is it: Use CSS to style meeting minutes
# Estimated time needed: 1h
|
non_process
|
style meeting minutes what is it use css to style meeting minutes estimated time need
| 0
|
103,840
| 8,951,942,273
|
IssuesEvent
|
2019-01-25 15:17:55
|
club-soda/club-soda-guide
|
https://api.github.com/repos/club-soda/club-soda-guide
|
closed
|
Validation required: Venues shouldn't be able to add handle social link
|
please-test priority-3
|
As a venue adding my social links
I'd like to see an error message if I try to add only my social handle
so that I know the link will only work with a full URL
## Acceptance Criteria
- [x] Social media links aren't saved unless they contain a full URL
- [x] Adding only the handle (@claretrem) publishes an error message telling me I need to submit the full url e.g. "http://instagram.com/claretrem"
## Screenshots

|
1.0
|
Validation required: Venues shouldn't be able to add handle social link - As a venue adding my social links
I'd like to see an error message if I try to add only my social handle
so that I know the link will only work with a full URL
## Acceptance Criteria
- [x] Social media links aren't saved unless they contain a full URL
- [x] Adding only the handle (@claretrem) publishes an error message telling me I need to submit the full url e.g. "http://instagram.com/claretrem"
## Screenshots

|
non_process
|
validation required venues shouldn t be able to add handle social link as a venue adding my social links i d like to see an error message if i try to add only my social handle so that i know the link will only work with a full url acceptance criteria social media links aren t saved unless they contain a full url adding only the handle claretrem publishes an error message telling me i need to submit the full url e g screenshots
| 0
|
85,091
| 10,587,081,611
|
IssuesEvent
|
2019-10-08 21:11:54
|
crossplaneio/crossplane
|
https://api.github.com/repos/crossplaneio/crossplane
|
closed
|
Inter-resource attribute references
|
api design enhancement feature proposal real-world applications services workload
|
<!--
Thank you for helping to improve Crossplane!
Please be sure to search for open issues before raising a new one. We use issues
for bug reports and feature requests. Please find us at https://slack.crossplane.io
for questions, support, and discussion.
-->
### What problem are you facing?
<!--
Please tell us a little about your use case - it's okay if it's hypothetical!
Leading with this context helps frame the feature request so we can ensure we
implement it sensibly.
--->
Crossplane creates and monitors external resources in third party cloud providers by using managed resource types. Users can provision an external resource by submitting a claim type which results in provisioning the corresponding managed resource, which in turn provisions the corresponding external resource.
In a lot of cases for an external resource to operate properly, there needs to be some configuration performed in the related provider. For instance in order to have an `eks` cluster created in AWS, one needs to provision and configure the required network resources like `VPC` and `Subnet` (see #616 for more information).
Although each of these interconnected resources can have its own managed resource type and be created independently, there still needs to be a representation of the entire configuration as a whole. For instance, if a network configuration needs one `VPC` which has one `Subnet`, not only do two managed resources need to be created, but the inter-resource relationships also need to be captured to clearly represent the configuration. Let's call a set of these sorts of inter-resource relationships a *resource configuration* for brevity.
Currently, resource configurations are represented by shell scripts in Crossplane (see these sample [aws] and [gcp] configurations). The general approach in these scripts is to:
1. Provision external resources individually
2. Honor resource dependency, by provisioning in the right order
3. Extract the required attributes, and use those attributes to provision downstream resources
There are several issues with this approach, most notably:
1. Lack of readability
1. Heuristic approach using arbitrary third-party tools instead of Kubernetes
1. Flaky behavior as code evolves and interrelationships change
### How could Crossplane help solve your problem?
<!--
Let us know how you think Crossplane could help with your use case.
-->
To represent the resource configurations in a more readable and robust way, as a first step all external resources need to be supported to be managed by Crossplane. Then a given resource configuration can be represented in a single YAML file:
- Each external resource is represented by the corresponding managed resource type's api object
- The relationships between resources are represented by *implicit references*. Here an implicit reference means that the consuming resource refers to the required resource using its external resource attributes (rather than managed resource attributes); the relationship itself is not explicitly spelled out in the YAML.
For instance, resource configuration for the `VPC` example mentioned earlier will look like:
```yaml
---
apiVersion: network.aws.crossplane.io/v1alpha1
kind: VPC
metadata:
namespace: crossplane-system
name: my-vpc
spec:
...
---
apiVersion: network.aws.crossplane.io/v1alpha1
kind: Subnet
metadata:
namespace: crossplane-system
name: my-subnet
spec:
# this is the id of the external vpc, represented by my-vpc
# this is an implicit reference to my-vpc resource
vpcId: anArbitraryString0123
...
---
```
This approach covers tasks 1 and 3 of the scripting approach mentioned above. Resource dependency for simple configurations can be automatically achieved by continuous reconciliation. For more advanced resource dependency see #708.
There are a few things to keep in mind about implicit references:
1. The readability and cohesiveness in a resource configuration YAML is still very low. This especially becomes a problem if there were multiple managed resources of the same kind.
1. The identifying attributes of many external resources are nondeterministic, in that they are not known and cannot be specified before provisioning. Therefore a resource configuration for such resources cannot be represented by implicit references.
1. It might be desired for an administrator to make a resource configuration use an existing external resource instead of creating a new one. For instance in the above example one could imagine that `my-vpc` already exists and could be removed from the resource configuration. In such cases an implicit reference will be necessary.
To address these contradicting scenarios, I propose the notion of *reference type* as the
following:
```yaml
...
spec:
...
someParameter:
# one of {external, apiObject} values
referenceType: apiObject
# a string of attributeName.namespacedName format (apiObject case),
# or arbitrary (external case)
value: attributeName.sampleObject.sampleNamespace
...
...
```
This indicates that the value for `someParameter` will be retrieved from the given attribute of the referred api object. If the `referenceType` were `external`, then the `value` would be interpreted as the actual value of the external resource's `someParameter` attribute. Going back to our `VPC` example, we could now re-write the resource configuration as:
```yaml
---
apiVersion: network.aws.crossplane.io/v1alpha1
kind: VPC
metadata:
namespace: crossplane-system
name: my-vpc
spec:
...
---
apiVersion: network.aws.crossplane.io/v1alpha1
kind: Subnet
metadata:
namespace: crossplane-system
name: my-subnet
spec:
# this is an explicit reference to a resource which exists in this YAML
vpcId:
referenceType: apiObject
value: id.my-vpc.crossplane-system
...
---
```
If we wanted to reuse an existing `VPC` in the above configuration, we could have:
```yaml
---
apiVersion: network.aws.crossplane.io/v1alpha1
kind: Subnet
metadata:
namespace: crossplane-system
name: my-subnet
spec:
# this is an implicit reference to a resource which does not exist in this YAML
vpcId:
referenceType: external
value: anArbitraryString0123
...
---
```
Advantages of using reference types:
1. High readability and reproducibility
1. Possibility of defining deterministic resource configurations, without depending on nondeterministic external resource attributes
1. Possibility of using external resource attributes
[aws]: https://github.com/crossplaneio/crossplane/blob/master/cluster/examples/aws-credentials.sh
[gcp]: https://github.com/crossplaneio/crossplane/blob/master/cluster/examples/gcp-credentials.sh
|
1.0
|
Inter-resource attribute references - <!--
Thank you for helping to improve Crossplane!
Please be sure to search for open issues before raising a new one. We use issues
for bug reports and feature requests. Please find us at https://slack.crossplane.io
for questions, support, and discussion.
-->
### What problem are you facing?
<!--
Please tell us a little about your use case - it's okay if it's hypothetical!
Leading with this context helps frame the feature request so we can ensure we
implement it sensibly.
--->
Crossplane creates and monitors external resources in third party cloud providers by using managed resource types. Users can provision an external resource by submitting a claim type which results in provisioning the corresponding managed resource, which in turn provisions the corresponding external resource.
In a lot of cases for an external resource to operate properly, there needs to be some configuration performed in the related provider. For instance in order to have an `eks` cluster created in AWS, one needs to provision and configure the required network resources like `VPC` and `Subnet` (see #616 for more information).
Although each of these interconnected resources can have its own managed resource type and be created independently, there still needs to be a representation of the entire configuration as a whole. For instance, if a network configuration needs one `VPC` which has one `Subnet`, not only do two managed resources need to be created, but the inter-resource relationships also need to be captured to clearly represent the configuration. Let's call a set of these sorts of inter-resource relationships a *resource configuration* for brevity.
Currently, resource configurations are represented by shell scripts in Crossplane (see these sample [aws] and [gcp] configurations). The general approach in these scripts is to:
1. Provision external resources individually
2. Honor resource dependency, by provisioning in the right order
3. Extract the required attributes, and use those attributes to provision downstream resources
There are several issues with this approach, most notably:
1. Lack of readability
1. Heuristic approach using arbitrary third-party tools instead of Kubernetes
1. Flaky behavior as code evolves and interrelationships change
### How could Crossplane help solve your problem?
<!--
Let us know how you think Crossplane could help with your use case.
-->
To represent the resource configurations in a more readable and robust way, as a first step all external resources need to be supported to be managed by Crossplane. Then a given resource configuration can be represented in a single YAML file:
- Each external resource is represented by the corresponding managed resource type's api object
- The relationships between resources are represented by *implicit references*. Here an implicit reference means that the consuming resource refers to the required resource using its external resource attributes (rather than managed resource attributes); the relationship itself is not explicitly spelled out in the YAML.
For instance, resource configuration for the `VPC` example mentioned earlier will look like:
```yaml
---
apiVersion: network.aws.crossplane.io/v1alpha1
kind: VPC
metadata:
namespace: crossplane-system
name: my-vpc
spec:
...
---
apiVersion: network.aws.crossplane.io/v1alpha1
kind: Subnet
metadata:
namespace: crossplane-system
name: my-subnet
spec:
# this is the id of the external vpc, represented by my-vpc
# this is an implicit reference to my-vpc resource
vpcId: anArbitraryString0123
...
---
```
This approach covers tasks 1 and 3 of the scripting approach mentioned above. Resource dependency for simple configurations can be automatically achieved by continuous reconciliation. For more advanced resource dependency see #708.
There are a few things to keep in mind about implicit references:
1. The readability and cohesiveness in a resource configuration YAML is still very low. This especially becomes a problem if there were multiple managed resources of the same kind.
1. The identifying attributes of many external resources are nondeterministic, in that they are not known and cannot be specified before provisioning. Therefore a resource configuration for such resources cannot be represented by implicit references.
1. It might be desired for an administrator to make a resource configuration use an existing external resource instead of creating a new one. For instance in the above example one could imagine that `my-vpc` already exists and could be removed from the resource configuration. In such cases an implicit reference will be necessary.
To address these contradicting scenarios, I propose the notion of *reference type* as the
following:
```yaml
...
spec:
...
someParameter:
# one of {external, apiObject} values
referenceType: apiObject
# a string of attributeName.namespacedName format (apiObject case),
# or arbitrary (external case)
value: attributeName.sampleObject.sampleNamespace
...
...
```
This indicates that the value for `someParameter` will be retrieved from the given attribute of the referred api object. If the `referenceType` were `external`, then the `value` would be interpreted as the actual value of the external resource's `someParameter` attribute. Going back to our `VPC` example, we could now re-write the resource configuration as:
```yaml
---
apiVersion: network.aws.crossplane.io/v1alpha1
kind: VPC
metadata:
namespace: crossplane-system
name: my-vpc
spec:
...
---
apiVersion: network.aws.crossplane.io/v1alpha1
kind: Subnet
metadata:
namespace: crossplane-system
name: my-subnet
spec:
# this is an explicit reference to a resource which exists in this YAML
vpcId:
referenceType: apiObject
value: id.my-vpc.crossplane-system
...
---
```
If we wanted to reuse an existing `VPC` in the above configuration, we could have:
```yaml
---
apiVersion: network.aws.crossplane.io/v1alpha1
kind: Subnet
metadata:
namespace: crossplane-system
name: my-subnet
spec:
# this is an implicit reference to a resource which does not exist in this YAML
vpcId:
referenceType: external
value: anArbitraryString0123
...
---
```
Advantages of using reference types:
1. High readability and reproducibility
1. Possibility of defining deterministic resource configurations, without depending on nondeterministic external resource attributes
1. Possibility of using external resource attributes
[aws]: https://github.com/crossplaneio/crossplane/blob/master/cluster/examples/aws-credentials.sh
[gcp]: https://github.com/crossplaneio/crossplane/blob/master/cluster/examples/gcp-credentials.sh
|
non_process
|
inter resource attribute references thank you for helping to improve crossplane please be sure to search for open issues before raising a new one we use issues for bug reports and feature requests please find us at for questions support and discussion what problem are you facing please tell us a little about your use case it s okay if it s hypothetical leading with this context helps frame the feature request so we can ensure we implement it sensibly crossplane creates and monitors external resources in third party cloud providers by using managed resource types users can provision an external resource by submitting a claim type which results in provisioning the corresponding managed resource which in turn provisions the corresponding external resource in a lot of cases for an external resource to operate properly there needs to be some configuration performed in the related provider for instance in order to have an eks cluster created in aws one needs to provision and configure the required network resources like vpc and subnet see for more information although each of these interconnected resources can have their own managed resource type and be created independently there still needs to be a represention of the entire configuration as a whole for instance if a network configuration needs one vpc which has one subnet not only two managed resources need to be created but also the inter resource relationships need to be captured to clearly represent the configuration let s call a set of these sort of inter resource relationships a resource configuration for brevity currently resource configurations are represented by shell scripts in crossplane see these sample and configurations the general approach in these scripts is to provision external resources individually honor resource dependency by provisioning in the right order extract the required attributes and use those attributes to provision downstream resources there are several issues with this approach most 
notably lack of readability heuristic approach using arbitrary third party tools instead of kubernetes flaky behavior as code evolves and interrelationships change how could crossplane help solve your problem let us know how you think crossplane could help with your use case to represent the resource configurations in a more readable and robust way as a first step all external resources need to be supported to be managed by crossplane then a given resource configuration can be represented in a single yaml file each external resource is represented by the corresponding managed resource type s api object the relationship between resources are represented by implicit references here an implicit reference means that the consuming resource refers to the required resource using its external resource attributes rather than managed resource attributes which hasn t been explicitly mentioned in the yaml for instance resource configuration for the vpc example mentioned earlier will look like yaml apiversion network aws crossplane io kind vpc metadata namespace crossplane system name my vpc spec apiversion network aws crossplane io kind subnet metadata namespace crossplane system name my subnet spec this is the id of the external vpc represented by my vpc this is an implicit reference to my vpc resource vpcid this approach covers tasks and of the scripting approach mentioned above resource dependency for simple configurations can be automatically achieved by continuous reconciliation for more advanced resource dependency see there are a few things to keep in mind about implicit references the readability and cohesiveness in a resource configuration yaml is still very low this especially becomes a problem if there were multiple managed resources of the same kind the identifying attributes of a lot of external resources are indeterministic in that they are not known or can be specified before provisioning therefore a resource configuration for such resources cannot be 
represented by implicit references it might be desired for an administrator to make a resource configuration use an existing external resource instead of creating a new one for instance in the above example one could imagine that my vpc already exists and could be removed from the resource configuration in such cases an implicit reference will be necessary to address these contradicting scenarios i propose the notion of reference type as the following yaml spec someparameter one of external apiobject values referencetype apiobject a string of attributename namespacedname format apiobject case or arbitrary external case value attributename sampleobject samplenamespace this indicates that the value for someparameter will be retrieved from the given attribute of the referred api object if the referencetype was external then the value would be interpreted as the actual value of the external resources s sampleparameter attribute going back to our vpc example now we could re write the resource configuration as yaml apiversion network aws crossplane io kind vpc metadata namespace crossplane system name my vpc spec apiversion network aws crossplane io kind subnet metadata namespace crossplane system name my subnet spec this is an explicit reference to a resource which exists in this yaml vpcid referencetype apiobject value id my vpc crossplane system if we would want to re use an existing vpc in the above configuration we could have yaml apiversion network aws crossplane io kind subnet metadata namespace crossplane system name my subnet spec this is an implicit reference to a resource which does not exists in this yaml vpcid referencetype external value advantages of using reference types high readability and reproducibility possibility of defining deterministic resource configurations without depending on indeterministic external resources attributes possibility of using external resource attributes
| 0
|
62,747
| 14,656,597,685
|
IssuesEvent
|
2020-12-28 13:46:49
|
fu1771695yongxie/vue-cli
|
https://api.github.com/repos/fu1771695yongxie/vue-cli
|
opened
|
WS-2019-0332 (Medium) detected in handlebars-4.0.11.js
|
security vulnerability
|
## WS-2019-0332 - Medium Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>handlebars-4.0.11.js</b></summary>
<p>Handlebars provides the power necessary to let you build semantic templates effectively with no frustration</p>
<p>Library home page: <a href="https://cdnjs.cloudflare.com/ajax/libs/handlebars.js/4.0.11/handlebars.js">https://cdnjs.cloudflare.com/ajax/libs/handlebars.js/4.0.11/handlebars.js</a></p>
<p>Path to dependency file: vue-cli/packages/@vue/cli/node_modules/yaml-front-matter/docs/index.html</p>
<p>Path to vulnerable library: vue-cli/packages/@vue/cli/node_modules/yaml-front-matter/docs/js/handlebars.js,vue-cli/packages/vue-cli-version-marker/node_modules/yaml-front-matter/docs/js/handlebars.js</p>
<p>
Dependency Hierarchy:
- :x: **handlebars-4.0.11.js** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/fu1771695yongxie/vue-cli/commit/a3fffcb65f1fa89c7d439cf9944f8bfa7d616ffd">a3fffcb65f1fa89c7d439cf9944f8bfa7d616ffd</a></p>
<p>Found in base branch: <b>dev</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
Arbitrary Code Execution vulnerability found in handlebars before 4.5.3. Lookup helper fails to validate templates. Attack may submit templates that execute arbitrary JavaScript in the system.It is due to an incomplete fix for a WS-2019-0331.
<p>Publish Date: 2019-11-17
<p>URL: <a href=https://github.com/wycats/handlebars.js/commit/198887808780bbef9dba67a8af68ece091d5baa7>WS-2019-0332</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 2 Score Details (<b>5.0</b>)</summary>
<p>
Base Score Metrics not available</p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://www.npmjs.com/advisories/1324">https://www.npmjs.com/advisories/1324</a></p>
<p>Release Date: 2019-12-05</p>
<p>Fix Resolution: handlebars - 4.5.3</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
|
True
|
WS-2019-0332 (Medium) detected in handlebars-4.0.11.js - ## WS-2019-0332 - Medium Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>handlebars-4.0.11.js</b></p></summary>
<p>Handlebars provides the power necessary to let you build semantic templates effectively with no frustration</p>
<p>Library home page: <a href="https://cdnjs.cloudflare.com/ajax/libs/handlebars.js/4.0.11/handlebars.js">https://cdnjs.cloudflare.com/ajax/libs/handlebars.js/4.0.11/handlebars.js</a></p>
<p>Path to dependency file: vue-cli/packages/@vue/cli/node_modules/yaml-front-matter/docs/index.html</p>
<p>Path to vulnerable library: vue-cli/packages/@vue/cli/node_modules/yaml-front-matter/docs/js/handlebars.js,vue-cli/packages/vue-cli-version-marker/node_modules/yaml-front-matter/docs/js/handlebars.js</p>
<p>
Dependency Hierarchy:
- :x: **handlebars-4.0.11.js** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/fu1771695yongxie/vue-cli/commit/a3fffcb65f1fa89c7d439cf9944f8bfa7d616ffd">a3fffcb65f1fa89c7d439cf9944f8bfa7d616ffd</a></p>
<p>Found in base branch: <b>dev</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
Arbitrary Code Execution vulnerability found in handlebars before 4.5.3. Lookup helper fails to validate templates. Attack may submit templates that execute arbitrary JavaScript in the system.It is due to an incomplete fix for a WS-2019-0331.
<p>Publish Date: 2019-11-17
<p>URL: <a href=https://github.com/wycats/handlebars.js/commit/198887808780bbef9dba67a8af68ece091d5baa7>WS-2019-0332</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 2 Score Details (<b>5.0</b>)</summary>
<p>
Base Score Metrics not available</p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://www.npmjs.com/advisories/1324">https://www.npmjs.com/advisories/1324</a></p>
<p>Release Date: 2019-12-05</p>
<p>Fix Resolution: handlebars - 4.5.3</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
|
non_process
|
ws medium detected in handlebars js ws medium severity vulnerability vulnerable library handlebars js handlebars provides the power necessary to let you build semantic templates effectively with no frustration library home page a href path to dependency file vue cli packages vue cli node modules yaml front matter docs index html path to vulnerable library vue cli packages vue cli node modules yaml front matter docs js handlebars js vue cli packages vue cli version marker node modules yaml front matter docs js handlebars js dependency hierarchy x handlebars js vulnerable library found in head commit a href found in base branch dev vulnerability details arbitrary code execution vulnerability found in handlebars before lookup helper fails to validate templates attack may submit templates that execute arbitrary javascript in the system it is due to an incomplete fix for a ws publish date url a href cvss score details base score metrics not available suggested fix type upgrade version origin a href release date fix resolution handlebars step up your open source security game with whitesource
| 0
|
6,500
| 9,574,718,702
|
IssuesEvent
|
2019-05-07 03:04:39
|
ncbo/bioportal-project
|
https://api.github.com/repos/ncbo/bioportal-project
|
closed
|
ADHER_INTCARE_SP: failed to parse
|
ontology processing problem ready
|
End user reports that [ADHER_INTCARE_SP](http://bioportal.bioontology.org/ontologies/ADHER_INTCARE_SP) ontology shows "Uploaded, Error Rdf Labels" status in BioPortal. Submission 5 parsing log files contains the following stack trace:
```
E, [2019-04-27T10:44:03.310277 #2180] ERROR -- : ["Exception: Rapper cannot parse turtle file at /tmp/data_triple_store20190427-2180-ny3om3: rapper: Parsing URI file:///tmp/data_triple_store20190427-2180-ny3om3 with parser turtle
rapper: Serializing with serializer ntriples
rapper: Error - URI file:///tmp/data_triple_store20190427-2180-ny3om3:5 - syntax error at '<'
rapper: Parsing returned 4 triples
/srv/ncbo/ncbo_cron/vendor/bundle/ruby/2.5.0/bundler/gems/goo-0bc8c933f3f8/lib/goo/sparql/client.rb:60:in `bnodes_filter_file'
/srv/ncbo/ncbo_cron/vendor/bundle/ruby/2.5.0/bundler/gems/goo-0bc8c933f3f8/lib/goo/sparql/client.rb:81:in `append_triples_no_bnodes'
/srv/ncbo/ncbo_cron/vendor/bundle/ruby/2.5.0/bundler/gems/goo-0bc8c933f3f8/lib/goo/sparql/client.rb:122:in `append_data_triples'
/srv/ncbo/ncbo_cron/vendor/bundle/ruby/2.5.0/bundler/gems/goo-0bc8c933f3f8/lib/goo/sparql/client.rb:148:in `append_triples'
/srv/ncbo/ncbo_cron/vendor/bundle/ruby/2.5.0/bundler/gems/ontologies_linked_data-548e7b1e4fb8/lib/ontologies_linked_data/models/ontology_submission.rb:632:in `generate_missing_labels_pre'
/srv/ncbo/ncbo_cron/vendor/bundle/ruby/2.5.0/bundler/gems/ontologies_linked_data-548e7b1e4fb8/lib/ontologies_linked_data/models/ontology_submission.rb:550:in `call'
/srv/ncbo/ncbo_cron/vendor/bundle/ruby/2.5.0/bundler/gems/ontologies_linked_data-548e7b1e4fb8/lib/ontologies_linked_data/models/ontology_submission.rb:550:in `block (2 levels) in loop_classes'
/srv/ncbo/ncbo_cron/vendor/bundle/ruby/2.5.0/bundler/gems/ontologies_linked_data-548e7b1e4fb8/lib/ontologies_linked_data/models/ontology_submission.rb:504:in `block in process_callbacks'
/srv/ncbo/ncbo_cron/vendor/bundle/ruby/2.5.0/bundler/gems/ontologies_linked_data-548e7b1e4fb8/lib/ontologies_linked_data/models/ontology_submission.rb:500:in `delete_if'
/srv/ncbo/ncbo_cron/vendor/bundle/ruby/2.5.0/bundler/gems/ontologies_linked_data-548e7b1e4fb8/lib/ontologies_linked_data/models/ontology_submission.rb:500:in `process_callbacks'
/srv/ncbo/ncbo_cron/vendor/bundle/ruby/2.5.0/bundler/gems/ontologies_linked_data-548e7b1e4fb8/lib/ontologies_linked_data/models/ontology_submission.rb:549:in `block in loop_classes'
/usr/local/rbenv/versions/2.5.3/lib/ruby/2.5.0/benchmark.rb:308:in `realtime'
/srv/ncbo/ncbo_cron/vendor/bundle/ruby/2.5.0/bundler/gems/ontologies_linked_data-548e7b1e4fb8/lib/ontologies_linked_data/models/ontology_submission.rb:531:in `loop_classes'
/srv/ncbo/ncbo_cron/vendor/bundle/ruby/2.5.0/bundler/gems/ontologies_linked_data-548e7b1e4fb8/lib/ontologies_linked_data/models/ontology_submission.rb:1002:in `process_submission'
/srv/ncbo/ncbo_cron/lib/ncbo_cron/ontology_submission_parser.rb:177:in `process_submission'
```
rapper utility reports no triples errors:
```
[ncbo-deployer@ncbo-prd-app-31 5]$ rapper -i rdfxml -o ntriples owlapi.xrdf > data.triples
rapper: Parsing URI file:///srv/ncbo/share/env/production/repository/ADHER_INTCARE_SP/5/owlapi.xrdf with parser rdfxml
rapper: Serializing with serializer ntriples
rapper: Parsing returned 2377 triples
[ncbo-deployer@ncbo-prd-app-31 5]$ rapper -i ntriples -c data.triples
rapper: Parsing URI file:///srv/ncbo/share/env/production/repository/ADHER_INTCARE_SP/5/data.triples with parser ntriples
rapper: Parsing returned 2377 triples
```
|
1.0
|
ADHER_INTCARE_SP: failed to parse - End user reports that [ADHER_INTCARE_SP](http://bioportal.bioontology.org/ontologies/ADHER_INTCARE_SP) ontology shows "Uploaded, Error Rdf Labels" status in BioPortal. Submission 5 parsing log files contains the following stack trace:
```
E, [2019-04-27T10:44:03.310277 #2180] ERROR -- : ["Exception: Rapper cannot parse turtle file at /tmp/data_triple_store20190427-2180-ny3om3: rapper: Parsing URI file:///tmp/data_triple_store20190427-2180-ny3om3 with parser turtle
rapper: Serializing with serializer ntriples
rapper: Error - URI file:///tmp/data_triple_store20190427-2180-ny3om3:5 - syntax error at '<'
rapper: Parsing returned 4 triples
/srv/ncbo/ncbo_cron/vendor/bundle/ruby/2.5.0/bundler/gems/goo-0bc8c933f3f8/lib/goo/sparql/client.rb:60:in `bnodes_filter_file'
/srv/ncbo/ncbo_cron/vendor/bundle/ruby/2.5.0/bundler/gems/goo-0bc8c933f3f8/lib/goo/sparql/client.rb:81:in `append_triples_no_bnodes'
/srv/ncbo/ncbo_cron/vendor/bundle/ruby/2.5.0/bundler/gems/goo-0bc8c933f3f8/lib/goo/sparql/client.rb:122:in `append_data_triples'
/srv/ncbo/ncbo_cron/vendor/bundle/ruby/2.5.0/bundler/gems/goo-0bc8c933f3f8/lib/goo/sparql/client.rb:148:in `append_triples'
/srv/ncbo/ncbo_cron/vendor/bundle/ruby/2.5.0/bundler/gems/ontologies_linked_data-548e7b1e4fb8/lib/ontologies_linked_data/models/ontology_submission.rb:632:in `generate_missing_labels_pre'
/srv/ncbo/ncbo_cron/vendor/bundle/ruby/2.5.0/bundler/gems/ontologies_linked_data-548e7b1e4fb8/lib/ontologies_linked_data/models/ontology_submission.rb:550:in `call'
/srv/ncbo/ncbo_cron/vendor/bundle/ruby/2.5.0/bundler/gems/ontologies_linked_data-548e7b1e4fb8/lib/ontologies_linked_data/models/ontology_submission.rb:550:in `block (2 levels) in loop_classes'
/srv/ncbo/ncbo_cron/vendor/bundle/ruby/2.5.0/bundler/gems/ontologies_linked_data-548e7b1e4fb8/lib/ontologies_linked_data/models/ontology_submission.rb:504:in `block in process_callbacks'
/srv/ncbo/ncbo_cron/vendor/bundle/ruby/2.5.0/bundler/gems/ontologies_linked_data-548e7b1e4fb8/lib/ontologies_linked_data/models/ontology_submission.rb:500:in `delete_if'
/srv/ncbo/ncbo_cron/vendor/bundle/ruby/2.5.0/bundler/gems/ontologies_linked_data-548e7b1e4fb8/lib/ontologies_linked_data/models/ontology_submission.rb:500:in `process_callbacks'
/srv/ncbo/ncbo_cron/vendor/bundle/ruby/2.5.0/bundler/gems/ontologies_linked_data-548e7b1e4fb8/lib/ontologies_linked_data/models/ontology_submission.rb:549:in `block in loop_classes'
/usr/local/rbenv/versions/2.5.3/lib/ruby/2.5.0/benchmark.rb:308:in `realtime'
/srv/ncbo/ncbo_cron/vendor/bundle/ruby/2.5.0/bundler/gems/ontologies_linked_data-548e7b1e4fb8/lib/ontologies_linked_data/models/ontology_submission.rb:531:in `loop_classes'
/srv/ncbo/ncbo_cron/vendor/bundle/ruby/2.5.0/bundler/gems/ontologies_linked_data-548e7b1e4fb8/lib/ontologies_linked_data/models/ontology_submission.rb:1002:in `process_submission'
/srv/ncbo/ncbo_cron/lib/ncbo_cron/ontology_submission_parser.rb:177:in `process_submission'
```
rapper utility reports no triples errors:
```
[ncbo-deployer@ncbo-prd-app-31 5]$ rapper -i rdfxml -o ntriples owlapi.xrdf > data.triples
rapper: Parsing URI file:///srv/ncbo/share/env/production/repository/ADHER_INTCARE_SP/5/owlapi.xrdf with parser rdfxml
rapper: Serializing with serializer ntriples
rapper: Parsing returned 2377 triples
[ncbo-deployer@ncbo-prd-app-31 5]$ rapper -i ntriples -c data.triples
rapper: Parsing URI file:///srv/ncbo/share/env/production/repository/ADHER_INTCARE_SP/5/data.triples with parser ntriples
rapper: Parsing returned 2377 triples
```
|
process
|
adher intcare sp failed to parse end user reports that ontology shows uploaded error rdf labels status in bioportal submission parsing log files contains the following stack trace e error exception rapper cannot parse turtle file at tmp data triple rapper parsing uri file tmp data triple with parser turtle rapper serializing with serializer ntriples rapper error uri file tmp data triple syntax error at rapper parsing returned triples srv ncbo ncbo cron vendor bundle ruby bundler gems goo lib goo sparql client rb in bnodes filter file srv ncbo ncbo cron vendor bundle ruby bundler gems goo lib goo sparql client rb in append triples no bnodes srv ncbo ncbo cron vendor bundle ruby bundler gems goo lib goo sparql client rb in append data triples srv ncbo ncbo cron vendor bundle ruby bundler gems goo lib goo sparql client rb in append triples srv ncbo ncbo cron vendor bundle ruby bundler gems ontologies linked data lib ontologies linked data models ontology submission rb in generate missing labels pre srv ncbo ncbo cron vendor bundle ruby bundler gems ontologies linked data lib ontologies linked data models ontology submission rb in call srv ncbo ncbo cron vendor bundle ruby bundler gems ontologies linked data lib ontologies linked data models ontology submission rb in block levels in loop classes srv ncbo ncbo cron vendor bundle ruby bundler gems ontologies linked data lib ontologies linked data models ontology submission rb in block in process callbacks srv ncbo ncbo cron vendor bundle ruby bundler gems ontologies linked data lib ontologies linked data models ontology submission rb in delete if srv ncbo ncbo cron vendor bundle ruby bundler gems ontologies linked data lib ontologies linked data models ontology submission rb in process callbacks srv ncbo ncbo cron vendor bundle ruby bundler gems ontologies linked data lib ontologies linked data models ontology submission rb in block in loop classes usr local rbenv versions lib ruby benchmark rb in realtime srv ncbo ncbo cron vendor bundle ruby bundler gems ontologies linked data lib ontologies linked data models ontology submission rb in loop classes srv ncbo ncbo cron vendor bundle ruby bundler gems ontologies linked data lib ontologies linked data models ontology submission rb in process submission srv ncbo ncbo cron lib ncbo cron ontology submission parser rb in process submission rapper utility reports no triples errors rapper i rdfxml o ntriples owlapi xrdf data triples rapper parsing uri file srv ncbo share env production repository adher intcare sp owlapi xrdf with parser rdfxml rapper serializing with serializer ntriples rapper parsing returned triples rapper i ntriples c data triples rapper parsing uri file srv ncbo share env production repository adher intcare sp data triples with parser ntriples rapper parsing returned triples
| 1
|
4,257
| 7,189,057,822
|
IssuesEvent
|
2018-02-02 12:35:57
|
Great-Hill-Corporation/quickBlocks
|
https://api.github.com/repos/Great-Hill-Corporation/quickBlocks
|
closed
|
Improvements to hierarchical, adaptive, enhanced blooms (HAEB)
|
apps-blockScrape status-inprocess type-enhancement
|
blockAcct stores adaptive enhanced bloom filters, but if it is started and does not "fill up" the filter and then quits, I think it still writes the lastBloom marker. This means that, the next time it starts, it will not go back and pick up any blooms it may have created but not written previously.
Also--the adaptive enhanced blooms still get over saturated because some blocks contain hundreds of transactions and over populate the max(twiddled) value in a single block (or even a half a block). This is solved with hierarchical adaptive enhanced blooms.
|
1.0
|
Improvements to hierarchical, adaptive, enhanced blooms (HAEB) - blockAcct stores adaptive enhanced bloom filters, but if it is started and does not "fill up" the filter and then quits, I think it still writes the lastBloom marker. This means that, the next time it starts, it will not go back and pick up any blooms it may have created but not written previously.
Also--the adaptive enhanced blooms still get over saturated because some blocks contain hundreds of transactions and over populate the max(twiddled) value in a single block (or even a half a block). This is solved with hierarchical adaptive enhanced blooms.
|
process
|
improvements to hierarchical adaptive enhanced blooms haeb blockacct stores adaptive enhanced bloom filters but if it is started and does not fill up the filter and then quits i think it still writes the lastbloom marker this means that the next time it starts it will not go back and pick up any blooms it may have created but not written previously also the adaptive enhanced blooms still get over saturated because some blocks contain hundreds of transactions and over populate the max twiddled value in a single block or even a half a block this is solved with hierarchical adaptive enhanced blooms
| 1
|
733,868
| 25,326,607,932
|
IssuesEvent
|
2022-11-18 09:48:51
|
webcompat/web-bugs
|
https://api.github.com/repos/webcompat/web-bugs
|
closed
|
puzzles.usatoday.com - site is not usable
|
browser-firefox priority-important engine-gecko
|
<!-- @browser: Firefox 107.0 -->
<!-- @ua_header: Mozilla/5.0 (Windows NT 10.0; Win64; x64; rv:107.0) Gecko/20100101 Firefox/107.0 -->
<!-- @reported_with: unknown -->
<!-- @public_url: https://github.com/webcompat/web-bugs/issues/114174 -->
**URL**: https://puzzles.usatoday.com/
**Browser / Version**: Firefox 107.0
**Operating System**: Windows 10
**Tested Another Browser**: Yes Edge
**Problem type**: Site is not usable
**Description**: Page not loading correctly
**Steps to Reproduce**:
I use this website to do the USA Today crossword puzzle. I have been doing this for months using Firefox. Lately though it won't display the crossword grid anymore. I get the following message:
"Application error: a client-side exception has occurred (see the browser console for more information)."
If I use Microsoft Edge the crossword puzzle displays just fine and I'm able to type in answers and solve it. How can I fix this? I'm not even sure what the error message means.
<details>
<summary>Browser Configuration</summary>
<ul>
<li>None</li>
</ul>
</details>
_From [webcompat.com](https://webcompat.com/) with ❤️_
|
1.0
|
puzzles.usatoday.com - site is not usable - <!-- @browser: Firefox 107.0 -->
<!-- @ua_header: Mozilla/5.0 (Windows NT 10.0; Win64; x64; rv:107.0) Gecko/20100101 Firefox/107.0 -->
<!-- @reported_with: unknown -->
<!-- @public_url: https://github.com/webcompat/web-bugs/issues/114174 -->
**URL**: https://puzzles.usatoday.com/
**Browser / Version**: Firefox 107.0
**Operating System**: Windows 10
**Tested Another Browser**: Yes Edge
**Problem type**: Site is not usable
**Description**: Page not loading correctly
**Steps to Reproduce**:
I use this website to do the USA Today crossword puzzle. I have been doing this for months using Firefox. Lately though it won't display the crossword grid anymore. I get the following message:
"Application error: a client-side exception has occurred (see the browser console for more information)."
If I use Microsoft Edge the crossword puzzle displays just fine and I'm able to type in answers and solve it. How can I fix this? I'm not even sure what the error message means.
<details>
<summary>Browser Configuration</summary>
<ul>
<li>None</li>
</ul>
</details>
_From [webcompat.com](https://webcompat.com/) with ❤️_
|
non_process
|
puzzles usatoday com site is not usable url browser version firefox operating system windows tested another browser yes edge problem type site is not usable description page not loading correctly steps to reproduce i use this website to do the usa today crossword puzzle i have been doing this for months using firefox lately though it won t display the crossword grid anymore i get the following message application error a client side exception has occurred see the browser console for more information if i use microsoft edge the crossword puzzle displays just fine and i m able to type in answers and solve it how can i fix this i m not even sure what the error message means browser configuration none from with ❤️
| 0
|
438,550
| 30,648,031,545
|
IssuesEvent
|
2023-07-25 06:52:38
|
hyperai/tvm-cn
|
https://api.github.com/repos/hyperai/tvm-cn
|
opened
|
TVM 中文文档更新_常见问题_Task 9
|
documentation
|
【重要】[中文文档](https://tvm.hyper.ai/docs)是基于 tvm 0.10.0,对齐[英文文档 0.12 stable 版本](https://github.com/apache/tvm/tree/v0.12.0/docs),非 v0.13 dev 版本!!!
**本任务包含以下文档:**
* - [ ] [Work With microTVM](https://tvm.apache.org/docs/v0.12.0/how_to/work_with_microtvm/index.html)
* - [ ] [Extend TVM](https://tvm.apache.org/docs/v0.12.0/how_to/extend_tvm/index.html)
* - [ ] [Profile Models](https://tvm.apache.org/docs/v0.12.0/how_to/profile/index.html)
* - [ ] [Handle TVM Errors](https://tvm.apache.org/docs/v0.12.0/errors.html)
* - [ ] [Frequently Asked Questions](https://tvm.apache.org/docs/v0.12.0/faq.html)
**温馨提示:**
* 为保证翻译的准确性,每个 Task 配备 1 名翻译者及 2 名校对者,两人配合完成本部分文档的更新
* 在翻译及校对过程中,请及时在[飞书文档](https://j1tjsmktqn.feishu.cn/sheets/TDmHsE7c8hUEg3twqCpcVdDnn5e)更新进度
* 推荐借助 git diff 定位文档更新内容
|
1.0
|
TVM 中文文档更新_常见问题_Task 9 - 【重要】[中文文档](https://tvm.hyper.ai/docs)是基于 tvm 0.10.0,对齐[英文文档 0.12 stable 版本](https://github.com/apache/tvm/tree/v0.12.0/docs),非 v0.13 dev 版本!!!
**本任务包含以下文档:**
* - [ ] [Work With microTVM](https://tvm.apache.org/docs/v0.12.0/how_to/work_with_microtvm/index.html)
* - [ ] [Extend TVM](https://tvm.apache.org/docs/v0.12.0/how_to/extend_tvm/index.html)
* - [ ] [Profile Models](https://tvm.apache.org/docs/v0.12.0/how_to/profile/index.html)
* - [ ] [Handle TVM Errors](https://tvm.apache.org/docs/v0.12.0/errors.html)
* - [ ] [Frequently Asked Questions](https://tvm.apache.org/docs/v0.12.0/faq.html)
**温馨提示:**
* 为保证翻译的准确性,每个 Task 配备 1 名翻译者及 2 名校对者,两人配合完成本部分文档的更新
* 在翻译及校对过程中,请及时在[飞书文档](https://j1tjsmktqn.feishu.cn/sheets/TDmHsE7c8hUEg3twqCpcVdDnn5e)更新进度
* 推荐借助 git diff 定位文档更新内容
|
non_process
|
tvm 中文文档更新 常见问题 task 【重要】 tvm ,对齐 dev 版本!!! 本任务包含以下文档: 温馨提示: 为保证翻译的准确性,每个 task 配备 名翻译者及 名校对者,两人配合完成本部分文档的更新 在翻译及校对过程中,请及时在 推荐借助 git diff 定位文档更新内容
| 0
|
212,393
| 16,446,987,332
|
IssuesEvent
|
2021-05-20 20:52:41
|
css4j/echosvg
|
https://api.github.com/repos/css4j/echosvg
|
closed
|
WMF test uses file 'black_shapes.wmf' which references the 'Courier' font
|
bug tests
|
One of the WMF tests uses the 'black_shapes.wmf' file which references the 'Courier' font. That font is not free and is not installed by default by any common operating system.
As a workaround, the test should be skipped if the font is not installed.
|
1.0
|
WMF test uses file 'black_shapes.wmf' which references the 'Courier' font - One of the WMF tests uses the 'black_shapes.wmf' file which references the 'Courier' font. That font is not free and is not installed by default by any common operating system.
As a workaround, the test should be skipped if the font is not installed.
|
non_process
|
wmf test uses file black shapes wmf which references the courier font one of the wmf tests uses the black shapes wmf file which references the courier font that font is not free and is not installed by default by any common operating system as a workaround the test should be skipped if the font is not installed
| 0
|
8,728
| 11,863,075,789
|
IssuesEvent
|
2020-03-25 19:02:37
|
MicrosoftDocs/azure-docs
|
https://api.github.com/repos/MicrosoftDocs/azure-docs
|
closed
|
Error after following tutorial: AzureClassicRunAsConnection
|
Pri1 automation/svc process-automation/subsvc
|
When I test the "ScheduledStartStop_Parent" Runbook after following this tutorial, I get the following error:
```
{[AzureChinaCloud, AzureChinaCloud], [AzureCloud, AzureCloud], [AzureGermanCloud, AzureGermanCloud], [AzureUSGovernme...
Successfully logged into Azure subscription using ARM cmdlets...
Logging into Azure subscription using Classic cmdlets...
Authenticating Classic RunAs account
Get-AutomationConnection : Connections asset not found. To create this Connections asset, navigate to the Assets blade
and create a Connections asset named: AzureClassicRunAsConnection.
At line:183 char:27
+ ... $connection = Get-AutomationConnection -Name $connectionAssetName
+ ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+ CategoryInfo : ObjectNotFound: (:) [Get-AutomationConnection], AssetManagementClientException
+ FullyQualifiedErrorId : 3,Orchestrator.AssetManagement.Cmdlets.GetAutomationConnectionCmdlet
```
---
#### Document Details
⚠ *Do not edit this section. It is required for docs.microsoft.com ➟ GitHub issue linking.*
* ID: 225c9d05-83dd-b006-0025-3753f5ab25bf
* Version Independent ID: 9eecef0c-b1cb-1136-faf7-542214492096
* Content: [Start/Stop VMs during off-hours solution](https://docs.microsoft.com/en-us/azure/automation/automation-solution-vm-management#viewing-the-solution)
* Content Source: [articles/automation/automation-solution-vm-management.md](https://github.com/Microsoft/azure-docs/blob/master/articles/automation/automation-solution-vm-management.md)
* Service: **automation**
* Sub-service: **process-automation**
* GitHub Login: @MGoedtel
* Microsoft Alias: **magoedte**
|
1.0
|
Error after following tutorial: AzureClassicRunAsConnection - When I test the "ScheduledStartStop_Parent" Runbook after following this tutorial, I get the following error:
```
{[AzureChinaCloud, AzureChinaCloud], [AzureCloud, AzureCloud], [AzureGermanCloud, AzureGermanCloud], [AzureUSGovernme...
Successfully logged into Azure subscription using ARM cmdlets...
Logging into Azure subscription using Classic cmdlets...
Authenticating Classic RunAs account
Get-AutomationConnection : Connections asset not found. To create this Connections asset, navigate to the Assets blade
and create a Connections asset named: AzureClassicRunAsConnection.
At line:183 char:27
+ ... $connection = Get-AutomationConnection -Name $connectionAssetName
+ ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+ CategoryInfo : ObjectNotFound: (:) [Get-AutomationConnection], AssetManagementClientException
+ FullyQualifiedErrorId : 3,Orchestrator.AssetManagement.Cmdlets.GetAutomationConnectionCmdlet
```
---
#### Document Details
⚠ *Do not edit this section. It is required for docs.microsoft.com ➟ GitHub issue linking.*
* ID: 225c9d05-83dd-b006-0025-3753f5ab25bf
* Version Independent ID: 9eecef0c-b1cb-1136-faf7-542214492096
* Content: [Start/Stop VMs during off-hours solution](https://docs.microsoft.com/en-us/azure/automation/automation-solution-vm-management#viewing-the-solution)
* Content Source: [articles/automation/automation-solution-vm-management.md](https://github.com/Microsoft/azure-docs/blob/master/articles/automation/automation-solution-vm-management.md)
* Service: **automation**
* Sub-service: **process-automation**
* GitHub Login: @MGoedtel
* Microsoft Alias: **magoedte**
|
process
|
error after following tutorial azureclassicrunasconnection when i test the scheduledstartstop parent runbook after following this tutorial i get the following error azureusgovernme successfully logged into azure subscription using arm cmdlets logging into azure subscription using classic cmdlets authenticating classic runas account get automationconnection connections asset not found to create this connections asset navigate to the assets blade and create a connections asset named azureclassicrunasconnection at line char connection get automationconnection name connectionassetname categoryinfo objectnotfound assetmanagementclientexception fullyqualifiederrorid orchestrator assetmanagement cmdlets getautomationconnectioncmdlet document details ⚠ do not edit this section it is required for docs microsoft com ➟ github issue linking id version independent id content content source service automation sub service process automation github login mgoedtel microsoft alias magoedte
| 1
|
12,327
| 3,601,205,667
|
IssuesEvent
|
2016-02-03 10:19:48
|
coreos/rkt
|
https://api.github.com/repos/coreos/rkt
|
closed
|
docs: mention and explain provided systemd units
|
area/distribution help wanted kind/documentation kind/enhancement low hanging fruit priority/P2
|
We should document the provided systemd units that live in https://github.com/coreos/rkt/tree/master/dist/init/systemd
- [ ] metadata
- [ ] gc
Should we also track and explain how distributions package these units or do we leave this to the distro's docs?
|
1.0
|
docs: mention and explain provided systemd units - We should document the provided systemd units that live in https://github.com/coreos/rkt/tree/master/dist/init/systemd
- [ ] metadata
- [ ] gc
Should we also track and explain how distributions package these units or do we leave this to the distro's docs?
|
non_process
|
docs mention and explain provided systemd units we should document the provided systemd units that live in metadata gc should we also track and explain how distributions package these units or do we leave this to the distro s docs
| 0
|
17,882
| 23,836,106,430
|
IssuesEvent
|
2022-09-06 06:05:52
|
ppy/osu-web
|
https://api.github.com/repos/ppy/osu-web
|
closed
|
Add newly ranked beatmaps to `bss_process_queue`
|
area:beatmap-processing priority:0
|
Currently, the osu!(lazer) client is made aware of any changes in beatmaps statuses on startup. This happens via a mechanism that tracks `bss_process_queue`.
Right now, this doesn't correctly handle changes in ranked status (from qualified to ranked) as entries are not added at this boundary (tracking at https://github.com/ppy/osu/issues/19488). The easiest solution is to insert an entry during the qualified -> ranked operation.
```sql
INSERT INTO bss_process_queue (beatmapset_id) VALUES (1520317);
```
Default values for the table work fine.
|
1.0
|
Add newly ranked beatmaps to `bss_process_queue` - Currently, the osu!(lazer) client is made aware of any changes in beatmaps statuses on startup. This happens via a mechanism that tracks `bss_process_queue`.
Right now, this doesn't correctly handle changes in ranked status (from qualified to ranked) as entries are not added at this boundary (tracking at https://github.com/ppy/osu/issues/19488). The easiest solution is to insert an entry during the qualified -> ranked operation.
```sql
INSERT INTO bss_process_queue (beatmapset_id) VALUES (1520317);
```
Default values for the table work fine.
|
process
|
add newly ranked beatmaps to bss process queue currently the osu lazer client is made aware of any changes in beatmaps statuses on startup this happens via a mechanism that tracks bss process queue right now this doesn t correctly handle changes in ranked status from qualified to ranked as entries are not added at this boundary tracking at the easiest solution is to insert an entry during the qualified ranked operation sql insert into bss process queue beatmapset id values default values for the table work fine
| 1
|
199,109
| 15,736,321,640
|
IssuesEvent
|
2021-03-30 00:22:01
|
honeybadger-io/honeybadger-crystal
|
https://api.github.com/repos/honeybadger-io/honeybadger-crystal
|
opened
|
README: Add a configuration section
|
documentation help wanted
|
See the [Python README](https://github.com/honeybadger-io/honeybadger-python#configuration) for an example. We should have a similar section for Crystal that shows how to use `Honeybadger.configure` and provides a table of available configuration options (feel free to copy/paste and edit from the Python README).
|
1.0
|
README: Add a configuration section - See the [Python README](https://github.com/honeybadger-io/honeybadger-python#configuration) for an example. We should have a similar section for Crystal that shows how to use `Honeybadger.configure` and provides a table of available configuration options (feel free to copy/paste and edit from the Python README).
|
non_process
|
readme add a configuration section see the for an example we should have a similar section for crystal that shows how to use honeybadger configure and provides a table of available configuration options feel free to copy paste and edit from the python readme
| 0
|
652,025
| 21,518,577,967
|
IssuesEvent
|
2022-04-28 12:19:41
|
webcompat/web-bugs
|
https://api.github.com/repos/webcompat/web-bugs
|
closed
|
www.google.com - site is not usable
|
priority-critical browser-fixme
|
<!-- @browser: Waterfox 91.8.0 -->
<!-- @ua_header: Mozilla/5.0 (Windows NT 10.0; Win64; x64; rv:91.0) Gecko/20100101 Firefox/91.0 Waterfox/91.8.0 -->
<!-- @reported_with: desktop-reporter -->
<!-- @public_url: https://github.com/webcompat/web-bugs/issues/103383 -->
**URL**: https://www.google.com/search?q=tiktok
**Browser / Version**: Waterfox 91.8.0
**Operating System**: Windows 10
**Tested Another Browser**: Yes Chrome
**Problem type**: Site is not usable
**Description**: Page not loading correctly
**Steps to Reproduce**:
waterfox is not going to website, also not going anywhere past google
<details>
<summary>View the screenshot</summary>
<img alt="Screenshot" src="https://webcompat.com/uploads/2022/4/ae1f9d2a-a3d7-43cc-85ae-7319a001cafe.jpeg">
</details>
<details>
<summary>Browser Configuration</summary>
<ul>
<li>gfx.webrender.all: false</li><li>gfx.webrender.blob-images: true</li><li>gfx.webrender.enabled: false</li><li>image.mem.shared: true</li><li>buildID: 20220420125857</li><li>channel: default</li><li>hasTouchScreen: true</li><li>mixed active content blocked: false</li><li>mixed passive content blocked: false</li><li>tracking content blocked: false</li>
</ul>
</details>
[View console log messages](https://webcompat.com/console_logs/2022/4/da7c053c-8d1e-479f-bc6f-d08e45eb884f)
_From [webcompat.com](https://webcompat.com/) with ❤️_
|
1.0
|
www.google.com - site is not usable - <!-- @browser: Waterfox 91.8.0 -->
<!-- @ua_header: Mozilla/5.0 (Windows NT 10.0; Win64; x64; rv:91.0) Gecko/20100101 Firefox/91.0 Waterfox/91.8.0 -->
<!-- @reported_with: desktop-reporter -->
<!-- @public_url: https://github.com/webcompat/web-bugs/issues/103383 -->
**URL**: https://www.google.com/search?q=tiktok
**Browser / Version**: Waterfox 91.8.0
**Operating System**: Windows 10
**Tested Another Browser**: Yes Chrome
**Problem type**: Site is not usable
**Description**: Page not loading correctly
**Steps to Reproduce**:
waterfox is not going to website, also not going anywhere past google
<details>
<summary>View the screenshot</summary>
<img alt="Screenshot" src="https://webcompat.com/uploads/2022/4/ae1f9d2a-a3d7-43cc-85ae-7319a001cafe.jpeg">
</details>
<details>
<summary>Browser Configuration</summary>
<ul>
<li>gfx.webrender.all: false</li><li>gfx.webrender.blob-images: true</li><li>gfx.webrender.enabled: false</li><li>image.mem.shared: true</li><li>buildID: 20220420125857</li><li>channel: default</li><li>hasTouchScreen: true</li><li>mixed active content blocked: false</li><li>mixed passive content blocked: false</li><li>tracking content blocked: false</li>
</ul>
</details>
[View console log messages](https://webcompat.com/console_logs/2022/4/da7c053c-8d1e-479f-bc6f-d08e45eb884f)
_From [webcompat.com](https://webcompat.com/) with ❤️_
|
non_process
|
site is not usable url browser version waterfox operating system windows tested another browser yes chrome problem type site is not usable description page not loading correctly steps to reproduce waterfox is not going to website also not going anywhere past google view the screenshot img alt screenshot src browser configuration gfx webrender all false gfx webrender blob images true gfx webrender enabled false image mem shared true buildid channel default hastouchscreen true mixed active content blocked false mixed passive content blocked false tracking content blocked false from with ❤️
| 0
|
20,924
| 27,768,453,027
|
IssuesEvent
|
2023-03-16 13:02:57
|
Ridanisaurus/EmendatusEnigmatica
|
https://api.github.com/repos/Ridanisaurus/EmendatusEnigmatica
|
closed
|
[Mekanism] Infuse Types
|
undergoing development processed type request addon request
|
### Describe the processed type you'd like
Mekanism adds additional type of the material called "Infuse Type", that is used in Metallurgic Infuser for creation of Alloys. They also have "Compressed Version" of the Infuse Type as an item, made in Enrichment Chamber.
Pretty much the basic recipes would be a Conversion Recipe from Dust / Gem / Compressed Version of the Infuse Type to the Infuse Type, and from Dust / Gem to the Compressed Variant in the Enrichment Chamber 😄
### Additional context
Could be really useful for making custom Alloys with Mekanism, or just adding additional ways of creating Metal Alloys 😄
|
1.0
|
[Mekanism] Infuse Types - ### Describe the processed type you'd like
Mekanism adds additional type of the material called "Infuse Type", that is used in Metallurgic Infuser for creation of Alloys. They also have "Compressed Version" of the Infuse Type as an item, made in Enrichment Chamber.
Pretty much the basic recipes would be a Conversion Recipe from Dust / Gem / Compressed Version of the Infuse Type to the Infuse Type, and from Dust / Gem to the Compressed Variant in the Enrichment Chamber 😄
### Additional context
Could be really useful for making custom Alloys with Mekanism, or just adding additional ways of creating Metal Alloys 😄
|
process
|
infuse types describe the processed type you d like mekanism adds additional type of the material called infuse type that is used in metallurgic infuser for creation of alloys they also have compressed version of the infuse type as an item made in enrichment chamber pretty much the basic recipes would be conversion recipe from dust gem compressed version of the infuse type to the infuse type and from dust gem to the compressed variant in enrichment chamber 😄 additional context could be really useful for making custom alloys with mekanism or just adding additional ways of creating metal alloys 😄
| 1
|
63,764
| 26,511,126,729
|
IssuesEvent
|
2023-01-18 17:10:21
|
BCDevOps/developer-experience
|
https://api.github.com/repos/BCDevOps/developer-experience
|
closed
|
Make a Vault space for the Cloud Pathfinder Team
|
vault ops and shared services
|
**Describe the issue**
CPF wants a space for secrets on Vault, same as ours.
**Definition of done**
CPF has a Vault space and their whole team has access to it.
|
1.0
|
Make a Vault space for the Cloud Pathfinder Team - **Describe the issue**
CPF wants a space for secrets on Vault, same as ours.
**Definition of done**
CPF has a Vault space and their whole team has access to it.
|
non_process
|
make a vault space for the cloud pathfinder team describe the issue cpf wants a space for secrets on vault same as ours definition of done cpf has a vault space and their whole team has access to it
| 0
|
10,395
| 13,198,110,485
|
IssuesEvent
|
2020-08-14 01:21:44
|
dotnet/runtime
|
https://api.github.com/repos/dotnet/runtime
|
reopened
|
Can the GetProcessesByName method reduce the number of arrays and Process objects created?
|
area-System.Diagnostics.Process tenet-performance untriaged
|
<!--This is just a template - feel free to delete any and all of it and replace as appropriate.-->
### Description
As written in the code, GetProcessesByName first calls GetProcesses to obtain all processes of the machine, and then filters the process name
```
public static Process[] GetProcessesByName(string? processName, string machineName)
{
if (processName == null)
{
processName = string.Empty;
}
Process[] procs = GetProcesses(machineName);
var list = new List<Process>();
for (int i = 0; i < procs.Length; i++)
{
if (string.Equals(processName, procs[i].ProcessName, StringComparison.OrdinalIgnoreCase))
{
list.Add(procs[i]);
}
else
{
procs[i].Dispose();
}
}
return list.ToArray();
}
```
But in fact, we can filter the process name in GetProcesses in advance.
```
public static Process[] GetProcesses(string machineName)
{
bool isRemoteMachine = ProcessManager.IsRemoteMachine(machineName);
ProcessInfo[] processInfos = ProcessManager.GetProcessInfos(machineName);
Process[] processes = new Process[processInfos.Length];
for (int i = 0; i < processInfos.Length; i++)
{
ProcessInfo processInfo = processInfos[i];
processes[i] = new Process(machineName, isRemoteMachine, processInfo.ProcessId, processInfo);
}
return processes;
}
```
If we do this, we can reduce the length of the allocated Process array and the number of Process objects created in the GetProcesses method
|
1.0
|
Can the GetProcessesByName method reduce the number of arrays and Process objects created? - <!--This is just a template - feel free to delete any and all of it and replace as appropriate.-->
### Description
As written in the code, GetProcessesByName first calls GetProcesses to obtain all processes of the machine, and then filters the process name
```
public static Process[] GetProcessesByName(string? processName, string machineName)
{
if (processName == null)
{
processName = string.Empty;
}
Process[] procs = GetProcesses(machineName);
var list = new List<Process>();
for (int i = 0; i < procs.Length; i++)
{
if (string.Equals(processName, procs[i].ProcessName, StringComparison.OrdinalIgnoreCase))
{
list.Add(procs[i]);
}
else
{
procs[i].Dispose();
}
}
return list.ToArray();
}
```
But in fact, we can filter the process name in GetProcesses in advance.
```
public static Process[] GetProcesses(string machineName)
{
bool isRemoteMachine = ProcessManager.IsRemoteMachine(machineName);
ProcessInfo[] processInfos = ProcessManager.GetProcessInfos(machineName);
Process[] processes = new Process[processInfos.Length];
for (int i = 0; i < processInfos.Length; i++)
{
ProcessInfo processInfo = processInfos[i];
processes[i] = new Process(machineName, isRemoteMachine, processInfo.ProcessId, processInfo);
}
return processes;
}
```
If we do this, we can reduce the length of the allocated Process array and the number of Process objects created in the GetProcesses method
|
process
|
can the getprocessesbyname method reduce the number of arrays and process objects created description as written in the code getprocessesbyname first calls getprocesses to obtain all processes of the machine and then filters the process name public static process getprocessesbyname string processname string machinename if processname null processname string empty process procs getprocesses machinename var list new list for int i i procs length i if string equals processname procs processname stringcomparison ordinalignorecase list add procs else procs dispose return list toarray but in fact we can filter the process name in getprocesses in advance public static process getprocesses string machinename bool isremotemachine processmanager isremotemachine machinename processinfo processinfos processmanager getprocessinfos machinename process processes new process for int i i processinfos length i processinfo processinfo processinfos processes new process machinename isremotemachine processinfo processid processinfo return processes if we do this we can reduce the allocation length of the process array and create some process objects in the getprocesses method
| 1
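The optimization proposed in the issue above — filter on the raw process info before construction, instead of constructing every Process and disposing the non-matches — can be sketched in Python (the `Process` class and info tuples here are simplified stand-ins, not the .NET types):

```python
class Process:
    instances_created = 0  # track how many objects get allocated

    def __init__(self, info):
        Process.instances_created += 1
        self.pid, self.name = info

def get_processes_by_name_naive(infos, name):
    # Current behavior: allocate a Process per entry, then filter.
    procs = [Process(i) for i in infos]
    return [p for p in procs if p.name.lower() == name.lower()]

def get_processes_by_name_filtered(infos, name):
    # Proposed behavior: filter first, construct only the matches.
    return [Process(i) for i in infos if i[1].lower() == name.lower()]

infos = [(1, "chrome"), (2, "code"), (3, "chrome"), (4, "bash")]

Process.instances_created = 0
naive = get_processes_by_name_naive(infos, "chrome")
naive_allocs = Process.instances_created      # one allocation per process

Process.instances_created = 0
filtered = get_processes_by_name_filtered(infos, "chrome")
filtered_allocs = Process.instances_created   # allocations for matches only
```

Both paths return the same matches; only the number of throwaway allocations differs.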
|
21,448
| 29,481,457,810
|
IssuesEvent
|
2023-06-02 06:10:19
|
metabase/metabase
|
https://api.github.com/repos/metabase/metabase
|
closed
|
[MLv2] Implement/expose functions for join strategy (type)
|
.Backend .metabase-lib .Team/QueryProcessor :hammer_and_wrench:
|
To power this UI:
<img width="216" alt="image" src="https://github.com/metabase/metabase/assets/1455846/8b45c394-e4e7-4536-af15-90b00f9a224a">
As far as I can tell, MLv2 currently does not support the join `:strategy` at all; it's not in `metabase.lib.schema.join` at any rate. The definition in `metabase.mbql.schema` is as follows:
```clj
(def join-strategies
"Valid values of the `:strategy` key in a join map."
#{:left-join :right-join :inner-join :full-join})
```
I think we need the following functions:
```clj
(join-strategy query stage-number join) => strategy keyword
(with-join-strategy query stage-number join new-strategy) => join
(available-join-strategies query stage-number) => [strategy keyword]
```
I think the flow for changing the strategy would be to get the join you want to manipulate, change the strategy with `with-join-strategy`, then use `replace-clause` to splice it back into the query and replace the old join.
`join-strategy` should return `:left-join` (the default strategy) if the join does not have an explicit `:strategy` specified.
`available-join-strategies` should look at the `:features` list for the current database and only return join types supported by the current database.
|
1.0
|
[MLv2] Implement/expose functions for join strategy (type) - To power this UI:
<img width="216" alt="image" src="https://github.com/metabase/metabase/assets/1455846/8b45c394-e4e7-4536-af15-90b00f9a224a">
As far as I can tell, MLv2 currently does not support the join `:strategy` at all; it's not in `metabase.lib.schema.join` at any rate. The definition in `metabase.mbql.schema` is as follows:
```clj
(def join-strategies
"Valid values of the `:strategy` key in a join map."
#{:left-join :right-join :inner-join :full-join})
```
I think we need the following functions:
```clj
(join-strategy query stage-number join) => strategy keyword
(with-join-strategy query stage-number join new-strategy) => join
(available-join-strategies query stage-number) => [strategy keyword]
```
I think the flow for changing the strategy would be to get the join you want to manipulate, change the strategy with `with-join-strategy`, then use `replace-clause` to splice it back into the query and replace the old join.
`join-strategy` should return `:left-join` (the default strategy) if the join does not have an explicit `:strategy` specified.
`available-join-strategies` should look at the `:features` list for the current database and only return join types supported by the current database.
|
process
|
implement expose functions for join strategy type to power this ui img width alt image src as far as i can tell currently does not support the join strategy at all it s not in metabase lib schema join at any rate the definition in metabase mbql schema is as follows clj def join strategies valid values of the strategy key in a join map left join right join inner join full join i think we need the following functions clj join strategy query stage number join strategy keyword with join strategy query stage number join new strategy join available join strategies query stage number i think the flow for changing the strategy would be to get the join you want to manipulate change the strategy with with join strategy then use replace clause to splice it back into the query and replace the old join join strategy should return left join the default strategy if the join does not have an explicit strategy specified available join strategies should look at the features list for the current database and only return join types supported by the current database
| 1
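The three functions proposed in the record above can be mocked up outside Clojure. This Python sketch models a join clause as a plain dict; the names and shapes are assumptions for illustration, not the real MLv2 API:

```python
# Valid values of the :strategy key, per metabase.mbql.schema.
JOIN_STRATEGIES = {"left-join", "right-join", "inner-join", "full-join"}

def join_strategy(join):
    # Default to left-join when no explicit strategy is present.
    return join.get("strategy", "left-join")

def with_join_strategy(join, new_strategy):
    # Return a new join with the strategy replaced (no mutation).
    assert new_strategy in JOIN_STRATEGIES
    return {**join, "strategy": new_strategy}

def available_join_strategies(database_features):
    # Offer only the strategies the current database reports support for.
    return sorted(JOIN_STRATEGIES & set(database_features))

join = {"source-table": 42}           # no explicit :strategy
join2 = with_join_strategy(join, "inner-join")
```

The flow matches the description: read the current strategy with `join_strategy`, build an updated join with `with_join_strategy`, then splice it back in place of the old one.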
|
795,530
| 28,076,142,190
|
IssuesEvent
|
2023-03-29 23:55:03
|
GoogleCloudPlatform/java-docs-samples
|
https://api.github.com/repos/GoogleCloudPlatform/java-docs-samples
|
closed
|
com.example.bigtable.HelloWorldTest: helloWorld failed
|
type: bug priority: p2 api: bigtable samples flakybot: issue flakybot: flaky
|
This test failed!
To configure my behavior, see [the Flaky Bot documentation](https://github.com/googleapis/repo-automation-bots/tree/main/packages/flakybot).
If I'm commenting on this issue too often, add the `flakybot: quiet` label and
I will stop commenting.
---
commit: d9bcb2b6cac5c144c3e37c3391da24cc970701f9
buildURL: [Build Status](https://source.cloud.google.com/results/invocations/3be84226-9281-462d-a0f6-e72ca70a9e5e), [Sponge](http://sponge2/3be84226-9281-462d-a0f6-e72ca70a9e5e)
status: failed
<details><summary>Test output</summary><br><pre>expected to contain:
HelloWorld: Write some greetings to the table
but was:
HelloWorld: Create table Hello-Bigtable-18627fe4-3741-41ce-
HelloWorld: Cleaning up table
at com.example.bigtable.HelloWorldTest.helloWorld(HelloWorldTest.java:51)
</pre></details>
|
1.0
|
com.example.bigtable.HelloWorldTest: helloWorld failed - This test failed!
To configure my behavior, see [the Flaky Bot documentation](https://github.com/googleapis/repo-automation-bots/tree/main/packages/flakybot).
If I'm commenting on this issue too often, add the `flakybot: quiet` label and
I will stop commenting.
---
commit: d9bcb2b6cac5c144c3e37c3391da24cc970701f9
buildURL: [Build Status](https://source.cloud.google.com/results/invocations/3be84226-9281-462d-a0f6-e72ca70a9e5e), [Sponge](http://sponge2/3be84226-9281-462d-a0f6-e72ca70a9e5e)
status: failed
<details><summary>Test output</summary><br><pre>expected to contain:
HelloWorld: Write some greetings to the table
but was:
HelloWorld: Create table Hello-Bigtable-18627fe4-3741-41ce-
HelloWorld: Cleaning up table
at com.example.bigtable.HelloWorldTest.helloWorld(HelloWorldTest.java:51)
</pre></details>
|
non_process
|
com example bigtable helloworldtest helloworld failed this test failed to configure my behavior see if i m commenting on this issue too often add the flakybot quiet label and i will stop commenting commit buildurl status failed test output expected to contain helloworld write some greetings to the table but was helloworld create table hello bigtable helloworld cleaning up table at com example bigtable helloworldtest helloworld helloworldtest java
| 0
|
22,320
| 30,884,301,747
|
IssuesEvent
|
2023-08-03 20:16:47
|
hashgraph/hedera-mirror-node
|
https://api.github.com/repos/hashgraph/hedera-mirror-node
|
opened
|
Upgrade chart dependencies
|
enhancement process
|
### Problem
The chart dependencies are out of date.
### Solution
* Update the chart dependencies
* Update any image tags
* Test with Kubernetes 1.27
### Alternatives
_No response_
|
1.0
|
Upgrade chart dependencies - ### Problem
The chart dependencies are out of date.
### Solution
* Update the chart dependencies
* Update any image tags
* Test with Kubernetes 1.27
### Alternatives
_No response_
|
process
|
upgrade chart dependencies problem the chart dependencies are out of date solution update the chart dependencies update any image tags test with kubernetes alternatives no response
| 1
|
19,860
| 26,270,999,187
|
IssuesEvent
|
2023-01-06 16:58:57
|
hashgraph/hedera-mirror-node
|
https://api.github.com/repos/hashgraph/hedera-mirror-node
|
closed
|
Disable helm release health check by default
|
bug regression process
|
### Description
The new Helm release health check will mark the load balancer as unready while the upgrade is in progress. This is ideal behavior for multi-cluster environments but for single cluster environments it stops it from routing. Since we have more single cluster environments we should change the default to false and make it opt in behavior.
### Steps to reproduce
Run a helm upgrade via Flux
### Additional context
_No response_
### Hedera network
previewnet
### Version
v0.72.0-rc1
### Operating system
None
|
1.0
|
Disable helm release health check by default - ### Description
The new Helm release health check will mark the load balancer as unready while the upgrade is in progress. This is ideal behavior for multi-cluster environments but for single cluster environments it stops it from routing. Since we have more single cluster environments we should change the default to false and make it opt in behavior.
### Steps to reproduce
Run a helm upgrade via Flux
### Additional context
_No response_
### Hedera network
previewnet
### Version
v0.72.0-rc1
### Operating system
None
|
process
|
disable helm release health check by default description the new helm release health check will mark the load balancer as unready while the upgrade is in progress this is ideal behavior for multi cluster environments but for single cluster environments it stops it from routing since we have more single cluster environments we should change the default to false and make it opt in behavior steps to reproduce run a helm upgrade via flux additional context no response hedera network previewnet version operating system none
| 1
|
13,894
| 16,656,020,715
|
IssuesEvent
|
2021-06-05 14:40:46
|
NationalSecurityAgency/ghidra
|
https://api.github.com/repos/NationalSecurityAgency/ghidra
|
closed
|
ia.sinc : incorrect sleigh for rdmsr
|
Feature: Processor/x86
|
**Describe the bug**
ia.sinc specifies the rdmsr operation as
`define pcodeop rdmsr; :RDMSR is vexMode=0 & byte=0xf; byte=0x32 { tmp:8 = rdmsr(ECX); EDX = tmp(4); EAX = tmp(0); }`
The high 32 bits of RAX and RDX should be cleared on 64-bit processors.
|
1.0
|
ia.sinc : incorrect sleigh for rdmsr - **Describe the bug**
ia.sinc specifies the rdmsr operation as
`define pcodeop rdmsr; :RDMSR is vexMode=0 & byte=0xf; byte=0x32 { tmp:8 = rdmsr(ECX); EDX = tmp(4); EAX = tmp(0); }`
The high 32 bits of RAX and RDX should be cleared on 64-bit processors.
|
process
|
ia sinc incorrect sleigh for rdmsr describe the bug ia sinc specifies the rdmsr operation as define pcodeop rdmsr rdmsr is vexmode amp byte byte tmp rdmsr ecx edx tmp eax tmp the high bits of rax and rdx should be cleared on bit processors
| 1
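The semantics the issue above asks for — on 64-bit x86, a write to a 32-bit register such as EAX or EDX zero-extends into the full 64-bit register — can be checked with a small Python model (the register values below are arbitrary examples):

```python
MASK32 = 0xFFFFFFFF

def rdmsr_semantics(msr_value, rax_before, rdx_before):
    # Buggy sleigh: only the low 32 bits are assigned, so the high
    # halves of RAX/RDX keep whatever was in them before.
    buggy_rax = (rax_before & ~MASK32) | (msr_value & MASK32)
    buggy_rdx = (rdx_before & ~MASK32) | ((msr_value >> 32) & MASK32)
    # Correct 64-bit semantics: the 32-bit write zero-extends.
    fixed_rax = msr_value & MASK32
    fixed_rdx = (msr_value >> 32) & MASK32
    return (buggy_rax, buggy_rdx), (fixed_rax, fixed_rdx)

buggy, fixed = rdmsr_semantics(
    msr_value=0x1122334455667788,
    rax_before=0xDEADBEEF00000000,
    rdx_before=0xCAFEBABE00000000,
)
```

With stale high bits in the registers, the buggy form leaks them through, while the corrected form leaves only the MSR halves.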
|
11,776
| 14,611,600,377
|
IssuesEvent
|
2020-12-22 03:47:04
|
qgis/QGIS
|
https://api.github.com/repos/qgis/QGIS
|
closed
|
QgsProcessingParameterFolderDestination default value issue in PointsToPath in the modeler
|
Bug Feedback Processing Regression
|
This is a regression spotted in 3.16.1 that was not present in 3.16.0.
When running PointstoPath in the modeler the output_text_dir parameter has a default value that is the temporary output directory + "OUTPUT_TEXT_DIR". I have no clue why the name of the parameter is added to the path of the temporary directory, but now this ensures that the algorithm always fails due to the fact that the directory does not exist.
See https://github.com/qgis/QGIS/blob/ae565f180ad5c8a83b86df2ac7cd85df95b4c96d/python/plugins/processing/algs/qgis/PointsToPaths.py#L113 where a value magically appears in the modeler.
Will provide a bugfix as a temporary measure for this algorithm,
|
1.0
|
QgsProcessingParameterFolderDestination default value issue in PointsToPath in the modeler - This is a regression spotted in 3.16.1 that was not present in 3.16.0.
When running PointstoPath in the modeler the output_text_dir parameter has a default value that is the temporary output directory + "OUTPUT_TEXT_DIR". I have no clue why the name of the parameter is added to the path of the temporary directory, but now this ensures that the algorithm always fails due to the fact that the directory does not exist.
See https://github.com/qgis/QGIS/blob/ae565f180ad5c8a83b86df2ac7cd85df95b4c96d/python/plugins/processing/algs/qgis/PointsToPaths.py#L113 where a value magically appears in the modeler.
Will provide a bugfix as a temporary measure for this algorithm,
|
process
|
qgsprocessingparameterfolderdestination default value issue in pointstopath in the modeler this is a regression spotted in that was not present in when running pointstopath in the modeler the output text dir parameter has a default value that is the temporary output directory output text dir i have no clue why the name of the parameter is added to the path of the temporary directory but now this ensures that the algorithm always fails due to the fact that the directory does not exist see where a value magically appears in the modeler will provide a bugfix as a temporary measure for this algorithm
| 1
|
6,482
| 9,553,704,570
|
IssuesEvent
|
2019-05-02 19:59:35
|
dotnet/corefx
|
https://api.github.com/repos/dotnet/corefx
|
closed
|
[arm32/Windows] System.Diagnostics.PerformanceCounter.Tests failures on arm32 windows
|
arch-arm32 area-System.Diagnostics.Process os-windows-iot test-run-core
|
Running the arm32 tests on windows for System.Diagnostics.PerformanceCounter.Tests produced the following results: passed 41/93 tests.
Most of the 52 failures are due to a Win32Exception happening like the following:
```
System.AggregateException : One or more errors occurred. (The system cannot find the file specified.) (The following constructor parameters did not have matching fixture data: PerformanceDataTestsFixture fixture)\r\n---- System.ComponentModel.Win32Exception : The system cannot find the file specified.\r\n---- The following constructor parameters did not have matching fixture data: PerformanceDataTestsFixture fixture
----- Inner Stack Trace #1 (System.ComponentModel.Win32Exception) -----
at System.Diagnostics.Process.StartWithCreateProcess(ProcessStartInfo startInfo) in F:\git\corefx\src\System.Diagnostics.Process\src\System\Diagnostics\Process.Windows.cs:line 598
at System.Diagnostics.Process.StartCore(ProcessStartInfo startInfo) in F:\git\corefx\src\System.Diagnostics.Process\src\System\Diagnostics\Process.Win32.cs:line 24
at System.Diagnostics.Process.Start() in F:\git\corefx\src\System.Diagnostics.Process\src\System\Diagnostics\Process.cs:line 1216
at System.Diagnostics.Process.Start(ProcessStartInfo startInfo) in F:\git\corefx\src\System.Diagnostics.Process\src\System\Diagnostics\Process.cs:line 1259
at System.Diagnostics.Tests.PerformanceDataTestsFixture.RegisterCounters() in F:\git\corefx\src\System.Diagnostics.PerformanceCounter\tests\PerformanceDataTests.cs:line 219
at System.Diagnostics.Tests.PerformanceDataTestsFixture..ctor() in F:\git\corefx\src\System.Diagnostics.PerformanceCounter\tests\PerformanceDataTests.cs:line 244
----- Inner Stack Trace #2 (Xunit.Sdk.TestClassException) -----
```
[testResults.zip](https://github.com/dotnet/corefx/files/2669876/testResults.zip)
|
1.0
|
[arm32/Windows] System.Diagnostics.PerformanceCounter.Tests failures on arm32 windows - Running the arm32 tests on windows for System.Diagnostics.PerformanceCounter.Tests produced the following results: passed 41/93 tests.
Most of the 52 failures are due to a Win32Exception happening like the following:
```
System.AggregateException : One or more errors occurred. (The system cannot find the file specified.) (The following constructor parameters did not have matching fixture data: PerformanceDataTestsFixture fixture)\r\n---- System.ComponentModel.Win32Exception : The system cannot find the file specified.\r\n---- The following constructor parameters did not have matching fixture data: PerformanceDataTestsFixture fixture
----- Inner Stack Trace #1 (System.ComponentModel.Win32Exception) -----
at System.Diagnostics.Process.StartWithCreateProcess(ProcessStartInfo startInfo) in F:\git\corefx\src\System.Diagnostics.Process\src\System\Diagnostics\Process.Windows.cs:line 598
at System.Diagnostics.Process.StartCore(ProcessStartInfo startInfo) in F:\git\corefx\src\System.Diagnostics.Process\src\System\Diagnostics\Process.Win32.cs:line 24
at System.Diagnostics.Process.Start() in F:\git\corefx\src\System.Diagnostics.Process\src\System\Diagnostics\Process.cs:line 1216
at System.Diagnostics.Process.Start(ProcessStartInfo startInfo) in F:\git\corefx\src\System.Diagnostics.Process\src\System\Diagnostics\Process.cs:line 1259
at System.Diagnostics.Tests.PerformanceDataTestsFixture.RegisterCounters() in F:\git\corefx\src\System.Diagnostics.PerformanceCounter\tests\PerformanceDataTests.cs:line 219
at System.Diagnostics.Tests.PerformanceDataTestsFixture..ctor() in F:\git\corefx\src\System.Diagnostics.PerformanceCounter\tests\PerformanceDataTests.cs:line 244
----- Inner Stack Trace #2 (Xunit.Sdk.TestClassException) -----
```
[testResults.zip](https://github.com/dotnet/corefx/files/2669876/testResults.zip)
|
process
|
system diagnostics performancecounter tests failures on windows running the tests on windows for system diagnostics performancecounter tests produced the following results passed tests most of the failures are due to a happening like the following system aggregateexception one or more errors occurred the system cannot find the file specified the following constructor parameters did not have matching fixture data performancedatatestsfixture fixture r n system componentmodel the system cannot find the file specified r n the following constructor parameters did not have matching fixture data performancedatatestsfixture fixture inner stack trace system componentmodel at system diagnostics process startwithcreateprocess processstartinfo startinfo in f git corefx src system diagnostics process src system diagnostics process windows cs line at system diagnostics process startcore processstartinfo startinfo in f git corefx src system diagnostics process src system diagnostics process cs line at system diagnostics process start in f git corefx src system diagnostics process src system diagnostics process cs line at system diagnostics process start processstartinfo startinfo in f git corefx src system diagnostics process src system diagnostics process cs line at system diagnostics tests performancedatatestsfixture registercounters in f git corefx src system diagnostics performancecounter tests performancedatatests cs line at system diagnostics tests performancedatatestsfixture ctor in f git corefx src system diagnostics performancecounter tests performancedatatests cs line inner stack trace xunit sdk testclassexception
| 1
|
118
| 2,550,079,643
|
IssuesEvent
|
2015-02-01 03:38:53
|
dalehenrich/filetree
|
https://api.github.com/repos/dalehenrich/filetree
|
closed
|
category property for class and package membership can become out of sync
|
in process
|
While doing a merge for a "renamed" package, I made the mistake of moving a class properties file (with the category set to the old package name) into the new package and when filetree creates the class in the other category, the class is effectively moved to a different package ... since the category pretty much needs to stay in sync with the package, I think an error should be thrown if a class is loaded whose category is not consistent with the package that the class resides within ...
|
1.0
|
category property for class and package membership can become out of sync - While doing a merge for a "renamed" package, I made the mistake of moving a class properties file (with the category set to the old package name) into the new package and when filetree creates the class in the other category, the class is effectively moved to a different package ... since the category pretty much needs to stay in sync with the package, I think an error should be thrown if a class is loaded whose category is not consistent with the package that the class resides within ...
|
process
|
category property for class and package membership can become out of sync while doing a merge for a renamed package i made the mistake of moving a class properties file with the category set to the old package name into the new package and when filetree creates the class in the other category the class is effectively moved to a different package since the category pretty much needs to stay in sync with the package i think an error should be thrown if a class is loaded whose category is not consistent with the package that the class resides within
| 1
|
7,984
| 11,170,752,603
|
IssuesEvent
|
2019-12-28 15:11:22
|
bisq-network/bisq
|
https://api.github.com/repos/bisq-network/bisq
|
closed
|
Error Publishing Transaction
|
an:investigation in:trade-process was:dropped
|
Using BISQ v1.1.5
Ubuntu 16.04
When trying to fund a transaction with BTC received the following error:
An error occurred when taking the offer.
An error occurred at task: TakerPublishFeeTx
Exception message: We got an onFailure from the peerGroup.broadcastTransaction callback.
Please try to restart your application and check your network connection to see if you can resolve the issue. Please try to restart your application and check your network connection to see if you can resolve the issue.
Restarted application and funds appeared in wallet alongside other UTXOs I previously funded. Sent all funds to private wallet address but the tx never broadcast to the mempool. TXID: c169bbea2e86cabde9a0544fdd0c7e33c64adaec30fd78bf568b942b69c4cef0
Now funds in BISQ wallet show 0 and missing .045+ BTC
Saved error logs if needed. Would rather not post entire log in clearnet.
|
1.0
|
Error Publishing Transaction - Using BISQ v1.1.5
Ubuntu 16.04
When trying to fund a transaction with BTC received the following error:
An error occurred when taking the offer.
An error occurred at task: TakerPublishFeeTx
Exception message: We got an onFailure from the peerGroup.broadcastTransaction callback.
Please try to restart your application and check your network connection to see if you can resolve the issue.
Restarted application and funds appeared in wallet alongside other UTXOs I previously funded. Sent all funds to private wallet address but the tx never broadcast to the mempool. TXID: c169bbea2e86cabde9a0544fdd0c7e33c64adaec30fd78bf568b942b69c4cef0
Now funds in BISQ wallet show 0 and missing .045+ BTC
Saved error logs if needed. Would rather not post entire log in clearnet.
|
process
|
error publishing transaction using bisq ubuntu when trying to fund a transaction with btc received the following error an error occurred when taking the offer an error occurred at task takerpublishfeetx exception message we got an onfailure from the peergroup broadcasttransaction callback please try to restart your application and check your network connection to see if you can resolve the issue please try to restart your application and check your network connection to see if you can resolve the issue restarted application and funds appeared in wallet alongside other utxos i previously funded sent all funds to private wallet address but the tx never broadcast to the mempool txid now funds in bisq wallet show and missing btc saved error logs if needed would rather not post entire log in clearnet
| 1
|
7,895
| 11,083,289,492
|
IssuesEvent
|
2019-12-13 14:09:59
|
geneontology/go-ontology
|
https://api.github.com/repos/geneontology/go-ontology
|
closed
|
NTR: mRNA alternative polyadenylation
|
New term request RNA processes editors-discussion regulation waiting for feedback
|
Dear GO editors,
Could you please consider creating a GO term, "mRNA alternative polyadenylation",
for a new process described in PMID:29276085.
That paper shows that the cleavage factor Im (CFIm) complex,
composed of NUDT21/CPSF5, CPSF6 and CPSF7, acts as a positive regulator in
alternative polyadenylation (APA) of pre-mRNAs.
This term would be a child of "mRNA polyadenylation" (GO:0006378).
In the meantime, could you also create regulation GO terms for it.
Thanks a lot for your help
Andre stutz
|
1.0
|
NTR: mRNA alternative polyadenylation - Dear GO editors,
Could you please consider creating a GO term, "mRNA alternative polyadenylation",
for a new process described in PMID:29276085.
That paper shows that the cleavage factor Im (CFIm) complex,
composed of NUDT21/CPSF5, CPSF6 and CPSF7, acts as a positive regulator in
alternative polyadenylation (APA) of pre-mRNAs.
This term would be a child of "mRNA polyadenylation" (GO:0006378).
In the meantime, could you also create regulation GO terms for it.
Thanks a lot for your help
Andre stutz
|
process
|
ntr mrna alternative polyadenylation dear go editors could you please consider to create a go term mrna alternative polyadenylation for a new process described in pmid that paper shows that the cleavage factor im cfim complex composed of and acts as a positive regulator in alternative polyadenylation apa of pre mrnas this term would be a child of mrna polyadenylation go in the mean time could you also create regulation go terms for it thanks a lot for your help andre stutz
| 1
|
110,881
| 4,443,438,746
|
IssuesEvent
|
2016-08-19 16:43:36
|
Automattic/mongoose
|
https://api.github.com/repos/Automattic/mongoose
|
closed
|
Sub-document validation doesn't work with runValidators
|
priority
|
runValidators seems to not work with subdocuments
```
var mongoose = require('mongoose');
mongoose.Promise = require('bluebird');
var Schema = mongoose.Schema;
mongoose.connect('mongodb://localhost/test');
var FileSchema = new Schema({
name: String
});
var CompanySchema = new Schema({
name: String,
file: FileSchema
});
var Company = mongoose.model('Company', CompanySchema);
Company.create({
name: 'Mapple'
})
.then(function(company) {
return Company.update({
_id: company._id
}, {
file: {
name: 'new-name'
}
}, {
runValidators: true
});
});
```
Stack trace
```
Unhandled rejection TypeError: value.validate is not a function
at .../node_modules/mongoose/lib/schema/embedded.js:151:11
at SchemaType.doValidate (.../node_modules/mongoose/lib/schematype.js:694:12)
at SchemaType.Embedded.doValidate (.../node_modules/mongoose/lib/schema/embedded.js:144:35)
at .../node_modules/mongoose/lib/services/updateValidators.js:71:20
at .../node_modules/mongoose/node_modules/async/lib/async.js:718:13
at async.forEachOf.async.eachOf (.../node_modules/mongoose/node_modules/async/lib/async.js:233:13)
at _parallel (.../node_modules/mongoose/node_modules/async/lib/async.js:717:9)
at Object.async.parallel (.../node_modules/mongoose/node_modules/async/lib/async.js:731:9)
at .../node_modules/mongoose/lib/services/updateValidators.js:90:11
at Query._execUpdate (.../node_modules/mongoose/lib/query.js:1984:7)
at .../node_modules/mongoose/node_modules/kareem/index.js:239:8
at .../node_modules/mongoose/node_modules/kareem/index.js:18:7
at nextTickCallbackWith0Args (node.js:420:9)
at process._tickCallback (node.js:349:13)
```
mongoose version: 4.5.9
mongodb version: 3.2.4
|
1.0
|
Sub-document validation doesn't work with runValidators - runValidators seems to not work with subdocuments
```
var mongoose = require('mongoose');
mongoose.Promise = require('bluebird');
var Schema = mongoose.Schema;
mongoose.connect('mongodb://localhost/test');
var FileSchema = new Schema({
name: String
});
var CompanySchema = new Schema({
name: String,
file: FileSchema
});
var Company = mongoose.model('Company', CompanySchema);
Company.create({
name: 'Mapple'
})
.then(function(company) {
return Company.update({
_id: company._id
}, {
file: {
name: 'new-name'
}
}, {
runValidators: true
});
});
```
Stack trace
```
Unhandled rejection TypeError: value.validate is not a function
at .../node_modules/mongoose/lib/schema/embedded.js:151:11
at SchemaType.doValidate (.../node_modules/mongoose/lib/schematype.js:694:12)
at SchemaType.Embedded.doValidate (.../node_modules/mongoose/lib/schema/embedded.js:144:35)
at .../node_modules/mongoose/lib/services/updateValidators.js:71:20
at .../node_modules/mongoose/node_modules/async/lib/async.js:718:13
at async.forEachOf.async.eachOf (.../node_modules/mongoose/node_modules/async/lib/async.js:233:13)
at _parallel (.../node_modules/mongoose/node_modules/async/lib/async.js:717:9)
at Object.async.parallel (.../node_modules/mongoose/node_modules/async/lib/async.js:731:9)
at .../node_modules/mongoose/lib/services/updateValidators.js:90:11
at Query._execUpdate (.../node_modules/mongoose/lib/query.js:1984:7)
at .../node_modules/mongoose/node_modules/kareem/index.js:239:8
at .../node_modules/mongoose/node_modules/kareem/index.js:18:7
at nextTickCallbackWith0Args (node.js:420:9)
at process._tickCallback (node.js:349:13)
```
mongoose version: 4.5.9
mongodb version: 3.2.4
|
non_process
|
sub document validation doesn t work with runvalidators runvalidators seems to not work with subdocuments var mongoose require mongoose mongoose promise require bluebird var schema mongoose schema mongoose connect mongodb localhost test var fileschema new schema name string var companyschema new schema name string file fileschema var company mongoose model company companyschema company create name mapple then function company return company update id company id file name new name runvalidators true stack trace unhandled rejection typeerror value validate is not a function at node modules mongoose lib schema embedded js at schematype dovalidate node modules mongoose lib schematype js at schematype embedded dovalidate node modules mongoose lib schema embedded js at node modules mongoose lib services updatevalidators js at node modules mongoose node modules async lib async js at async foreachof async eachof node modules mongoose node modules async lib async js at parallel node modules mongoose node modules async lib async js at object async parallel node modules mongoose node modules async lib async js at node modules mongoose lib services updatevalidators js at query execupdate node modules mongoose lib query js at node modules mongoose node modules kareem index js at node modules mongoose node modules kareem index js at node js at process tickcallback node js mongoose version mongodb version
| 0
|
7,021
| 10,170,471,573
|
IssuesEvent
|
2019-08-08 05:28:28
|
jupyter/nbconvert
|
https://api.github.com/repos/jupyter/nbconvert
|
closed
|
Save widget models with Execute preprocessor?
|
Preprocessor:Execute
|
I'm using a notebook & nbsphinx to document my custom widget. This works fine if I run the notebook by hand and explicitly "save widget state". But it doesn't work if I use nbsphinx to execute the notebook, because the execute preprocessor doesn't save widget states (comm messages are discarded [here](https://github.com/jupyter/nbconvert/blob/master/nbconvert/preprocessors/execute.py#L411)).
From a quick look it doesn't look like it would be too hard to dump these comm messages and save the `data` of the message into the `widgets` part of the notebook document. Is that a good idea?
|
1.0
|
Save widget models with Execute preprocessor? - I'm using a notebook & nbsphinx to document my custom widget. This works fine if I run the notebook by hand and explicitly "save widget state". But it doesn't work if I use nbsphinx to execute the notebook, because the execute preprocessor doesn't save widget states (comm messages are discarded [here](https://github.com/jupyter/nbconvert/blob/master/nbconvert/preprocessors/execute.py#L411)).
From a quick look it doesn't look like it would be too hard to dump these comm messages and save the `data` of the message into the `widgets` part of the notebook document. Is that a good idea?
|
process
|
save widget models with execute preprocessor i m using a notebook nbsphinx to document my custom widget this works fine if i run the notebook by hand and explicitly save widget state but it doesn t work if i use nbsphinx to execute the notebook because the execute preprocessor doesn t save widget states comm messages are discarded from a quick look it doesn t look like it would be too hard to dump these comm messages and save the data of the message into the widgets part of the notebook document is that a good idea
| 1
|
15,692
| 19,848,078,312
|
IssuesEvent
|
2022-01-21 09:11:21
|
ooi-data/RS01SBPS-SF01A-3C-PARADA101-streamed-parad_sa_sample
|
https://api.github.com/repos/ooi-data/RS01SBPS-SF01A-3C-PARADA101-streamed-parad_sa_sample
|
opened
|
🛑 Processing failed: ValueError
|
process
|
## Overview
`ValueError` found in `processing_task` task during run ended on 2022-01-21T09:11:20.724978.
## Details
Flow name: `RS01SBPS-SF01A-3C-PARADA101-streamed-parad_sa_sample`
Task name: `processing_task`
Error type: `ValueError`
Error message: cannot reshape array of size 1209600 into shape (2777778,)
<details>
<summary>Traceback</summary>
```
Traceback (most recent call last):
File "/srv/conda/envs/notebook/lib/python3.9/site-packages/ooi_harvester/processor/pipeline.py", line 165, in processing
final_path = finalize_data_stream(
File "/srv/conda/envs/notebook/lib/python3.9/site-packages/ooi_harvester/processor/__init__.py", line 84, in finalize_data_stream
append_to_zarr(mod_ds, final_store, enc, logger=logger)
File "/srv/conda/envs/notebook/lib/python3.9/site-packages/ooi_harvester/processor/__init__.py", line 357, in append_to_zarr
_append_zarr(store, mod_ds)
File "/srv/conda/envs/notebook/lib/python3.9/site-packages/ooi_harvester/processor/utils.py", line 187, in _append_zarr
existing_arr.append(var_data.values)
File "/srv/conda/envs/notebook/lib/python3.9/site-packages/zarr/core.py", line 2305, in append
return self._write_op(self._append_nosync, data, axis=axis)
File "/srv/conda/envs/notebook/lib/python3.9/site-packages/zarr/core.py", line 2211, in _write_op
return self._synchronized_op(f, *args, **kwargs)
File "/srv/conda/envs/notebook/lib/python3.9/site-packages/zarr/core.py", line 2201, in _synchronized_op
result = f(*args, **kwargs)
File "/srv/conda/envs/notebook/lib/python3.9/site-packages/zarr/core.py", line 2341, in _append_nosync
self[append_selection] = data
File "/srv/conda/envs/notebook/lib/python3.9/site-packages/zarr/core.py", line 1224, in __setitem__
self.set_basic_selection(selection, value, fields=fields)
File "/srv/conda/envs/notebook/lib/python3.9/site-packages/zarr/core.py", line 1319, in set_basic_selection
return self._set_basic_selection_nd(selection, value, fields=fields)
File "/srv/conda/envs/notebook/lib/python3.9/site-packages/zarr/core.py", line 1610, in _set_basic_selection_nd
self._set_selection(indexer, value, fields=fields)
File "/srv/conda/envs/notebook/lib/python3.9/site-packages/zarr/core.py", line 1682, in _set_selection
self._chunk_setitems(lchunk_coords, lchunk_selection, chunk_values,
File "/srv/conda/envs/notebook/lib/python3.9/site-packages/zarr/core.py", line 1871, in _chunk_setitems
cdatas = [self._process_for_setitem(key, sel, val, fields=fields)
File "/srv/conda/envs/notebook/lib/python3.9/site-packages/zarr/core.py", line 1871, in <listcomp>
cdatas = [self._process_for_setitem(key, sel, val, fields=fields)
File "/srv/conda/envs/notebook/lib/python3.9/site-packages/zarr/core.py", line 1950, in _process_for_setitem
chunk = self._decode_chunk(cdata)
File "/srv/conda/envs/notebook/lib/python3.9/site-packages/zarr/core.py", line 2003, in _decode_chunk
chunk = chunk.reshape(expected_shape or self._chunks, order=self._order)
ValueError: cannot reshape array of size 1209600 into shape (2777778,)
```
</details>
|
1.0
|
🛑 Processing failed: ValueError - ## Overview
`ValueError` found in `processing_task` task during run ended on 2022-01-21T09:11:20.724978.
## Details
Flow name: `RS01SBPS-SF01A-3C-PARADA101-streamed-parad_sa_sample`
Task name: `processing_task`
Error type: `ValueError`
Error message: cannot reshape array of size 1209600 into shape (2777778,)
<details>
<summary>Traceback</summary>
```
Traceback (most recent call last):
File "/srv/conda/envs/notebook/lib/python3.9/site-packages/ooi_harvester/processor/pipeline.py", line 165, in processing
final_path = finalize_data_stream(
File "/srv/conda/envs/notebook/lib/python3.9/site-packages/ooi_harvester/processor/__init__.py", line 84, in finalize_data_stream
append_to_zarr(mod_ds, final_store, enc, logger=logger)
File "/srv/conda/envs/notebook/lib/python3.9/site-packages/ooi_harvester/processor/__init__.py", line 357, in append_to_zarr
_append_zarr(store, mod_ds)
File "/srv/conda/envs/notebook/lib/python3.9/site-packages/ooi_harvester/processor/utils.py", line 187, in _append_zarr
existing_arr.append(var_data.values)
File "/srv/conda/envs/notebook/lib/python3.9/site-packages/zarr/core.py", line 2305, in append
return self._write_op(self._append_nosync, data, axis=axis)
File "/srv/conda/envs/notebook/lib/python3.9/site-packages/zarr/core.py", line 2211, in _write_op
return self._synchronized_op(f, *args, **kwargs)
File "/srv/conda/envs/notebook/lib/python3.9/site-packages/zarr/core.py", line 2201, in _synchronized_op
result = f(*args, **kwargs)
File "/srv/conda/envs/notebook/lib/python3.9/site-packages/zarr/core.py", line 2341, in _append_nosync
self[append_selection] = data
File "/srv/conda/envs/notebook/lib/python3.9/site-packages/zarr/core.py", line 1224, in __setitem__
self.set_basic_selection(selection, value, fields=fields)
File "/srv/conda/envs/notebook/lib/python3.9/site-packages/zarr/core.py", line 1319, in set_basic_selection
return self._set_basic_selection_nd(selection, value, fields=fields)
File "/srv/conda/envs/notebook/lib/python3.9/site-packages/zarr/core.py", line 1610, in _set_basic_selection_nd
self._set_selection(indexer, value, fields=fields)
File "/srv/conda/envs/notebook/lib/python3.9/site-packages/zarr/core.py", line 1682, in _set_selection
self._chunk_setitems(lchunk_coords, lchunk_selection, chunk_values,
File "/srv/conda/envs/notebook/lib/python3.9/site-packages/zarr/core.py", line 1871, in _chunk_setitems
cdatas = [self._process_for_setitem(key, sel, val, fields=fields)
File "/srv/conda/envs/notebook/lib/python3.9/site-packages/zarr/core.py", line 1871, in <listcomp>
cdatas = [self._process_for_setitem(key, sel, val, fields=fields)
File "/srv/conda/envs/notebook/lib/python3.9/site-packages/zarr/core.py", line 1950, in _process_for_setitem
chunk = self._decode_chunk(cdata)
File "/srv/conda/envs/notebook/lib/python3.9/site-packages/zarr/core.py", line 2003, in _decode_chunk
chunk = chunk.reshape(expected_shape or self._chunks, order=self._order)
ValueError: cannot reshape array of size 1209600 into shape (2777778,)
```
</details>
|
process
|
🛑 processing failed valueerror overview valueerror found in processing task task during run ended on details flow name streamed parad sa sample task name processing task error type valueerror error message cannot reshape array of size into shape traceback traceback most recent call last file srv conda envs notebook lib site packages ooi harvester processor pipeline py line in processing final path finalize data stream file srv conda envs notebook lib site packages ooi harvester processor init py line in finalize data stream append to zarr mod ds final store enc logger logger file srv conda envs notebook lib site packages ooi harvester processor init py line in append to zarr append zarr store mod ds file srv conda envs notebook lib site packages ooi harvester processor utils py line in append zarr existing arr append var data values file srv conda envs notebook lib site packages zarr core py line in append return self write op self append nosync data axis axis file srv conda envs notebook lib site packages zarr core py line in write op return self synchronized op f args kwargs file srv conda envs notebook lib site packages zarr core py line in synchronized op result f args kwargs file srv conda envs notebook lib site packages zarr core py line in append nosync self data file srv conda envs notebook lib site packages zarr core py line in setitem self set basic selection selection value fields fields file srv conda envs notebook lib site packages zarr core py line in set basic selection return self set basic selection nd selection value fields fields file srv conda envs notebook lib site packages zarr core py line in set basic selection nd self set selection indexer value fields fields file srv conda envs notebook lib site packages zarr core py line in set selection self chunk setitems lchunk coords lchunk selection chunk values file srv conda envs notebook lib site packages zarr core py line in chunk setitems cdatas self process for setitem key sel val fields fields 
file srv conda envs notebook lib site packages zarr core py line in cdatas self process for setitem key sel val fields fields file srv conda envs notebook lib site packages zarr core py line in process for setitem chunk self decode chunk cdata file srv conda envs notebook lib site packages zarr core py line in decode chunk chunk chunk reshape expected shape or self chunks order self order valueerror cannot reshape array of size into shape
| 1
|
13,745
| 16,500,838,557
|
IssuesEvent
|
2021-05-25 14:27:00
|
trpo2021/cw-ip-011_keyboardninja
|
https://api.github.com/repos/trpo2021/cw-ip-011_keyboardninja
|
opened
|
task for tests
|
in process
|
For the duel function in mode, send a value greater than three and check for an error. Do the same with difficulty.
For the worlds and numbers functions, create an empty file and check whether it will open, plus check whether it will open a file that does not exist; slightly modify the code.
For the prepare function, check mode and difficulty for values greater than 3.
|
1.0
|
task for tests - For the duel function in mode, send a value greater than three and check for an error. Do the same with difficulty.
For the worlds and numbers functions, create an empty file and check whether it will open, plus check whether it will open a file that does not exist; slightly modify the code.
For the prepare function, check mode and difficulty for values greater than 3.
|
process
|
task for tests on functions duel in mode send a value greater than three and check for an error do the same with difficulty on function worlds and numbers create an empty file and check if it will open it check if it will open a file that does not exist slightly modify the code on function prepare check the difficulty of mode and difficulty for more than
| 1
|
115,720
| 17,334,383,547
|
IssuesEvent
|
2021-07-28 08:25:08
|
lukebroganws/Umbraco-CMS
|
https://api.github.com/repos/lukebroganws/Umbraco-CMS
|
opened
|
CVE-2020-11023 (Medium) detected in multiple libraries
|
security vulnerability
|
## CVE-2020-11023 - Medium Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Libraries - <b>jquery-1.12.4.js</b>, <b>jquery-1.9.1.js</b>, <b>jquery-1.11.1.min.js</b>, <b>jquery-2.0.2.min.js</b>, <b>jquery-2.1.3.min.js</b></p></summary>
<p>
<details><summary><b>jquery-1.12.4.js</b></p></summary>
<p>JavaScript library for DOM operations</p>
<p>Library home page: <a href="https://cdnjs.cloudflare.com/ajax/libs/jquery/1.12.4/jquery.js">https://cdnjs.cloudflare.com/ajax/libs/jquery/1.12.4/jquery.js</a></p>
<p>Path to dependency file: Umbraco-CMS/src/Umbraco.Web.UI.Client/node_modules/jquery-ui-dist/index.html</p>
<p>Path to vulnerable library: Umbraco-CMS/src/Umbraco.Web.UI.Client/node_modules/jquery-ui-dist/external/jquery/jquery.js</p>
<p>
Dependency Hierarchy:
- :x: **jquery-1.12.4.js** (Vulnerable Library)
</details>
<details><summary><b>jquery-1.9.1.js</b></p></summary>
<p>JavaScript library for DOM operations</p>
<p>Library home page: <a href="https://cdnjs.cloudflare.com/ajax/libs/jquery/1.9.1/jquery.js">https://cdnjs.cloudflare.com/ajax/libs/jquery/1.9.1/jquery.js</a></p>
<p>Path to dependency file: Umbraco-CMS/src/Umbraco.Web.UI.Client/node_modules/spectrum-colorpicker2/test/index.html</p>
<p>Path to vulnerable library: Umbraco-CMS/src/Umbraco.Web.UI.Client/node_modules/spectrum-colorpicker2/test/../docs/jquery-1.9.1.js,Umbraco-CMS/src/Umbraco.Web.UI.Client/node_modules/spectrum-colorpicker2/docs/jquery-1.9.1.js</p>
<p>
Dependency Hierarchy:
- :x: **jquery-1.9.1.js** (Vulnerable Library)
</details>
<details><summary><b>jquery-1.11.1.min.js</b></p></summary>
<p>JavaScript library for DOM operations</p>
<p>Library home page: <a href="https://cdnjs.cloudflare.com/ajax/libs/jquery/1.11.1/jquery.min.js">https://cdnjs.cloudflare.com/ajax/libs/jquery/1.11.1/jquery.min.js</a></p>
<p>Path to dependency file: Umbraco-CMS/src/Umbraco.Web.UI.Client/node_modules/lazyload-js/test/node_modules/express/node_modules/qs/support/expresso/docs/api.html</p>
<p>Path to vulnerable library: Umbraco-CMS/src/Umbraco.Web.UI.Client/node_modules/lazyload-js/test/node_modules/express/node_modules/qs/support/expresso/docs/api.html</p>
<p>
Dependency Hierarchy:
- :x: **jquery-1.11.1.min.js** (Vulnerable Library)
</details>
<details><summary><b>jquery-2.0.2.min.js</b></p></summary>
<p>JavaScript library for DOM operations</p>
<p>Library home page: <a href="https://cdnjs.cloudflare.com/ajax/libs/jquery/2.0.2/jquery.min.js">https://cdnjs.cloudflare.com/ajax/libs/jquery/2.0.2/jquery.min.js</a></p>
<p>Path to dependency file: Umbraco-CMS/src/Umbraco.Web.UI.Client/node_modules/ace-builds/demo/whitespace .html</p>
<p>Path to vulnerable library: Umbraco-CMS/src/Umbraco.Web.UI.Client/node_modules/ace-builds/demo/whitespace .html,Umbraco-CMS/src/Umbraco.Web.UI.Client/node_modules/bootstrap-social/assets/js/jquery.js</p>
<p>
Dependency Hierarchy:
- :x: **jquery-2.0.2.min.js** (Vulnerable Library)
</details>
<details><summary><b>jquery-2.1.3.min.js</b></p></summary>
<p>JavaScript library for DOM operations</p>
<p>Library home page: <a href="https://cdnjs.cloudflare.com/ajax/libs/jquery/2.1.3/jquery.min.js">https://cdnjs.cloudflare.com/ajax/libs/jquery/2.1.3/jquery.min.js</a></p>
<p>Path to dependency file: Umbraco-CMS/src/Umbraco.Web.UI.Client/node_modules/angular-chart.js/node_modules/chart.js/samples/radar-skip-points.html</p>
<p>Path to vulnerable library: Umbraco-CMS/src/Umbraco.Web.UI.Client/node_modules/angular-chart.js/node_modules/chart.js/samples/radar-skip-points.html,Umbraco-CMS/src/Umbraco.Web.UI.Client/node_modules/angular-chart.js/tmp/bower_components/chart.js/samples/bar-horizontal.html</p>
<p>
Dependency Hierarchy:
- :x: **jquery-2.1.3.min.js** (Vulnerable Library)
</details>
<p>Found in HEAD commit: <a href="https://github.com/lukebroganws/Umbraco-CMS/commit/24bd18757bbbe3324d85424fdabf1d6bdaf1695e">24bd18757bbbe3324d85424fdabf1d6bdaf1695e</a></p>
<p>Found in base branch: <b>v8/contrib</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
In jQuery versions greater than or equal to 1.0.3 and before 3.5.0, passing HTML containing <option> elements from untrusted sources - even after sanitizing it - to one of jQuery's DOM manipulation methods (i.e. .html(), .append(), and others) may execute untrusted code. This problem is patched in jQuery 3.5.0.
<p>Publish Date: 2020-04-29
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-11023>CVE-2020-11023</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>6.1</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: Required
- Scope: Changed
- Impact Metrics:
- Confidentiality Impact: Low
- Integrity Impact: Low
- Availability Impact: None
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://github.com/jquery/jquery/security/advisories/GHSA-jpcq-cgw6-v4j6,https://github.com/rails/jquery-rails/blob/master/CHANGELOG.md#440">https://github.com/jquery/jquery/security/advisories/GHSA-jpcq-cgw6-v4j6,https://github.com/rails/jquery-rails/blob/master/CHANGELOG.md#440</a></p>
<p>Release Date: 2020-04-29</p>
<p>Fix Resolution: jquery - 3.5.0;jquery-rails - 4.4.0</p>
</p>
</details>
<p></p>
<!-- <REMEDIATE>{"isOpenPROnVulnerability":true,"isPackageBased":true,"isDefaultBranch":true,"packages":[{"packageType":"JavaScript","packageName":"jquery","packageVersion":"1.12.4","packageFilePaths":["/src/Umbraco.Web.UI.Client/node_modules/jquery-ui-dist/index.html"],"isTransitiveDependency":false,"dependencyTree":"jquery:1.12.4","isMinimumFixVersionAvailable":true,"minimumFixVersion":"jquery - 3.5.0;jquery-rails - 4.4.0"},{"packageType":"JavaScript","packageName":"jquery","packageVersion":"1.9.1","packageFilePaths":["/src/Umbraco.Web.UI.Client/node_modules/spectrum-colorpicker2/test/index.html","/src/Umbraco.Web.UI.Client/node_modules/spectrum-colorpicker2/index.html"],"isTransitiveDependency":false,"dependencyTree":"jquery:1.9.1","isMinimumFixVersionAvailable":true,"minimumFixVersion":"jquery - 3.5.0;jquery-rails - 4.4.0"},{"packageType":"JavaScript","packageName":"jquery","packageVersion":"1.11.1","packageFilePaths":["/src/Umbraco.Web.UI.Client/node_modules/lazyload-js/test/node_modules/express/node_modules/qs/support/expresso/docs/api.html"],"isTransitiveDependency":false,"dependencyTree":"jquery:1.11.1","isMinimumFixVersionAvailable":true,"minimumFixVersion":"jquery - 3.5.0;jquery-rails - 4.4.0"},{"packageType":"JavaScript","packageName":"jquery","packageVersion":"2.0.2","packageFilePaths":["/src/Umbraco.Web.UI.Client/node_modules/ace-builds/demo/whitespace .html","/src/Umbraco.Web.UI.Client/node_modules/bootstrap-social/index.html"],"isTransitiveDependency":false,"dependencyTree":"jquery:2.0.2","isMinimumFixVersionAvailable":true,"minimumFixVersion":"jquery - 3.5.0;jquery-rails - 
4.4.0"},{"packageType":"JavaScript","packageName":"jquery","packageVersion":"2.1.3","packageFilePaths":["/src/Umbraco.Web.UI.Client/node_modules/angular-chart.js/node_modules/chart.js/samples/radar-skip-points.html","/src/Umbraco.Web.UI.Client/node_modules/angular-chart.js/tmp/bower_components/chart.js/samples/bar-horizontal.html"],"isTransitiveDependency":false,"dependencyTree":"jquery:2.1.3","isMinimumFixVersionAvailable":true,"minimumFixVersion":"jquery - 3.5.0;jquery-rails - 4.4.0"}],"baseBranches":["v8/contrib"],"vulnerabilityIdentifier":"CVE-2020-11023","vulnerabilityDetails":"In jQuery versions greater than or equal to 1.0.3 and before 3.5.0, passing HTML containing \u003coption\u003e elements from untrusted sources - even after sanitizing it - to one of jQuery\u0027s DOM manipulation methods (i.e. .html(), .append(), and others) may execute untrusted code. This problem is patched in jQuery 3.5.0.","vulnerabilityUrl":"https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-11023","cvss3Severity":"medium","cvss3Score":"6.1","cvss3Metrics":{"A":"None","AC":"Low","PR":"None","S":"Changed","C":"Low","UI":"Required","AV":"Network","I":"Low"},"extraData":{}}</REMEDIATE> -->
|
True
|
CVE-2020-11023 (Medium) detected in multiple libraries - ## CVE-2020-11023 - Medium Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Libraries - <b>jquery-1.12.4.js</b>, <b>jquery-1.9.1.js</b>, <b>jquery-1.11.1.min.js</b>, <b>jquery-2.0.2.min.js</b>, <b>jquery-2.1.3.min.js</b></p></summary>
<p>
<details><summary><b>jquery-1.12.4.js</b></p></summary>
<p>JavaScript library for DOM operations</p>
<p>Library home page: <a href="https://cdnjs.cloudflare.com/ajax/libs/jquery/1.12.4/jquery.js">https://cdnjs.cloudflare.com/ajax/libs/jquery/1.12.4/jquery.js</a></p>
<p>Path to dependency file: Umbraco-CMS/src/Umbraco.Web.UI.Client/node_modules/jquery-ui-dist/index.html</p>
<p>Path to vulnerable library: Umbraco-CMS/src/Umbraco.Web.UI.Client/node_modules/jquery-ui-dist/external/jquery/jquery.js</p>
<p>
Dependency Hierarchy:
- :x: **jquery-1.12.4.js** (Vulnerable Library)
</details>
<details><summary><b>jquery-1.9.1.js</b></p></summary>
<p>JavaScript library for DOM operations</p>
<p>Library home page: <a href="https://cdnjs.cloudflare.com/ajax/libs/jquery/1.9.1/jquery.js">https://cdnjs.cloudflare.com/ajax/libs/jquery/1.9.1/jquery.js</a></p>
<p>Path to dependency file: Umbraco-CMS/src/Umbraco.Web.UI.Client/node_modules/spectrum-colorpicker2/test/index.html</p>
<p>Path to vulnerable library: Umbraco-CMS/src/Umbraco.Web.UI.Client/node_modules/spectrum-colorpicker2/test/../docs/jquery-1.9.1.js,Umbraco-CMS/src/Umbraco.Web.UI.Client/node_modules/spectrum-colorpicker2/docs/jquery-1.9.1.js</p>
<p>
Dependency Hierarchy:
- :x: **jquery-1.9.1.js** (Vulnerable Library)
</details>
<details><summary><b>jquery-1.11.1.min.js</b></p></summary>
<p>JavaScript library for DOM operations</p>
<p>Library home page: <a href="https://cdnjs.cloudflare.com/ajax/libs/jquery/1.11.1/jquery.min.js">https://cdnjs.cloudflare.com/ajax/libs/jquery/1.11.1/jquery.min.js</a></p>
<p>Path to dependency file: Umbraco-CMS/src/Umbraco.Web.UI.Client/node_modules/lazyload-js/test/node_modules/express/node_modules/qs/support/expresso/docs/api.html</p>
<p>Path to vulnerable library: Umbraco-CMS/src/Umbraco.Web.UI.Client/node_modules/lazyload-js/test/node_modules/express/node_modules/qs/support/expresso/docs/api.html</p>
<p>
Dependency Hierarchy:
- :x: **jquery-1.11.1.min.js** (Vulnerable Library)
</details>
<details><summary><b>jquery-2.0.2.min.js</b></summary>
<p>JavaScript library for DOM operations</p>
<p>Library home page: <a href="https://cdnjs.cloudflare.com/ajax/libs/jquery/2.0.2/jquery.min.js">https://cdnjs.cloudflare.com/ajax/libs/jquery/2.0.2/jquery.min.js</a></p>
<p>Path to dependency file: Umbraco-CMS/src/Umbraco.Web.UI.Client/node_modules/ace-builds/demo/whitespace .html</p>
<p>Path to vulnerable library: Umbraco-CMS/src/Umbraco.Web.UI.Client/node_modules/ace-builds/demo/whitespace .html,Umbraco-CMS/src/Umbraco.Web.UI.Client/node_modules/bootstrap-social/assets/js/jquery.js</p>
<p>
Dependency Hierarchy:
- :x: **jquery-2.0.2.min.js** (Vulnerable Library)
</details>
<details><summary><b>jquery-2.1.3.min.js</b></summary>
<p>JavaScript library for DOM operations</p>
<p>Library home page: <a href="https://cdnjs.cloudflare.com/ajax/libs/jquery/2.1.3/jquery.min.js">https://cdnjs.cloudflare.com/ajax/libs/jquery/2.1.3/jquery.min.js</a></p>
<p>Path to dependency file: Umbraco-CMS/src/Umbraco.Web.UI.Client/node_modules/angular-chart.js/node_modules/chart.js/samples/radar-skip-points.html</p>
<p>Path to vulnerable library: Umbraco-CMS/src/Umbraco.Web.UI.Client/node_modules/angular-chart.js/node_modules/chart.js/samples/radar-skip-points.html,Umbraco-CMS/src/Umbraco.Web.UI.Client/node_modules/angular-chart.js/tmp/bower_components/chart.js/samples/bar-horizontal.html</p>
<p>
Dependency Hierarchy:
- :x: **jquery-2.1.3.min.js** (Vulnerable Library)
</details>
<p>Found in HEAD commit: <a href="https://github.com/lukebroganws/Umbraco-CMS/commit/24bd18757bbbe3324d85424fdabf1d6bdaf1695e">24bd18757bbbe3324d85424fdabf1d6bdaf1695e</a></p>
<p>Found in base branch: <b>v8/contrib</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
In jQuery versions greater than or equal to 1.0.3 and before 3.5.0, passing HTML containing &lt;option&gt; elements from untrusted sources - even after sanitizing it - to one of jQuery's DOM manipulation methods (i.e. .html(), .append(), and others) may execute untrusted code. This problem is patched in jQuery 3.5.0.
<p>Publish Date: 2020-04-29
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-11023>CVE-2020-11023</a></p>
</p>
</details>
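The vulnerability description above can be illustrated with a small, hypothetical sketch. The payload shape and the sanitizer below are invented for illustration (they are not the published proof of concept), but they show why stripping `<script>` tags alone does not make input safe for jQuery's DOM manipulation methods:

```javascript
// Hypothetical sketch (illustrative payload, not the published exploit) of why
// sanitizing is insufficient for CVE-2020-11023: a naive filter that only
// strips <script> tags leaves an <option>-wrapped payload with an inline event
// handler untouched, and jQuery < 3.5.0 could execute such markup when it is
// passed to .html()/.append(). The reliable fix is upgrading to jQuery >= 3.5.0.
const payload = '<option><img src="x" onerror="alert(1)"></option>';

function naiveSanitize(html) {
  // Removes <script> elements only -- the kind of filtering the advisory
  // says does NOT make input safe for jQuery's DOM manipulation methods.
  return html.replace(/<script\b[^>]*>[\s\S]*?<\/script>/gi, '');
}

const sanitized = naiveSanitize(payload);
console.log(sanitized.includes('onerror')); // true: the handler survives
```

Because filtering cannot reliably neutralize such markup, upgrading the library itself is the remediation the advisory recommends.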
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>6.1</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: Required
- Scope: Changed
- Impact Metrics:
- Confidentiality Impact: Low
- Integrity Impact: Low
- Availability Impact: None
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
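As a cross-check, the 6.1 base score can be reproduced from the metrics listed above using the CVSS v3.0 base-score equations (a sketch; the metric weights are those published in the CVSS v3.0 specification):

```javascript
// Sketch of the CVSS v3.0 base-score arithmetic for the vector implied above
// (AV:N/AC:L/PR:N/UI:R/S:C/C:L/I:L/A:N); weights from the CVSS v3.0 spec.
const C = 0.22, I = 0.22, A = 0.0;                 // Confidentiality/Integrity Low, Availability None
const AV = 0.85, AC = 0.77, PR = 0.85, UI = 0.62;  // Network / Low / None / Required

const iss = 1 - (1 - C) * (1 - I) * (1 - A);
// Scope is Changed, so the Changed-scope impact formula applies:
const impact = 7.52 * (iss - 0.029) - 3.25 * Math.pow(iss - 0.02, 15);
const exploitability = 8.22 * AV * AC * PR * UI;

// CVSS "round up to one decimal place" helper:
const roundup = (x) => Math.ceil(x * 10) / 10;
const base = impact <= 0 ? 0 : roundup(Math.min(1.08 * (impact + exploitability), 10));
console.log(base); // 6.1
```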
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://github.com/jquery/jquery/security/advisories/GHSA-jpcq-cgw6-v4j6,https://github.com/rails/jquery-rails/blob/master/CHANGELOG.md#440">https://github.com/jquery/jquery/security/advisories/GHSA-jpcq-cgw6-v4j6,https://github.com/rails/jquery-rails/blob/master/CHANGELOG.md#440</a></p>
<p>Release Date: 2020-04-29</p>
<p>Fix Resolution: jquery - 3.5.0;jquery-rails - 4.4.0</p>
</p>
</details>
<p></p>
<!-- <REMEDIATE>{"isOpenPROnVulnerability":true,"isPackageBased":true,"isDefaultBranch":true,"packages":[{"packageType":"JavaScript","packageName":"jquery","packageVersion":"1.12.4","packageFilePaths":["/src/Umbraco.Web.UI.Client/node_modules/jquery-ui-dist/index.html"],"isTransitiveDependency":false,"dependencyTree":"jquery:1.12.4","isMinimumFixVersionAvailable":true,"minimumFixVersion":"jquery - 3.5.0;jquery-rails - 4.4.0"},{"packageType":"JavaScript","packageName":"jquery","packageVersion":"1.9.1","packageFilePaths":["/src/Umbraco.Web.UI.Client/node_modules/spectrum-colorpicker2/test/index.html","/src/Umbraco.Web.UI.Client/node_modules/spectrum-colorpicker2/index.html"],"isTransitiveDependency":false,"dependencyTree":"jquery:1.9.1","isMinimumFixVersionAvailable":true,"minimumFixVersion":"jquery - 3.5.0;jquery-rails - 4.4.0"},{"packageType":"JavaScript","packageName":"jquery","packageVersion":"1.11.1","packageFilePaths":["/src/Umbraco.Web.UI.Client/node_modules/lazyload-js/test/node_modules/express/node_modules/qs/support/expresso/docs/api.html"],"isTransitiveDependency":false,"dependencyTree":"jquery:1.11.1","isMinimumFixVersionAvailable":true,"minimumFixVersion":"jquery - 3.5.0;jquery-rails - 4.4.0"},{"packageType":"JavaScript","packageName":"jquery","packageVersion":"2.0.2","packageFilePaths":["/src/Umbraco.Web.UI.Client/node_modules/ace-builds/demo/whitespace .html","/src/Umbraco.Web.UI.Client/node_modules/bootstrap-social/index.html"],"isTransitiveDependency":false,"dependencyTree":"jquery:2.0.2","isMinimumFixVersionAvailable":true,"minimumFixVersion":"jquery - 3.5.0;jquery-rails - 
4.4.0"},{"packageType":"JavaScript","packageName":"jquery","packageVersion":"2.1.3","packageFilePaths":["/src/Umbraco.Web.UI.Client/node_modules/angular-chart.js/node_modules/chart.js/samples/radar-skip-points.html","/src/Umbraco.Web.UI.Client/node_modules/angular-chart.js/tmp/bower_components/chart.js/samples/bar-horizontal.html"],"isTransitiveDependency":false,"dependencyTree":"jquery:2.1.3","isMinimumFixVersionAvailable":true,"minimumFixVersion":"jquery - 3.5.0;jquery-rails - 4.4.0"}],"baseBranches":["v8/contrib"],"vulnerabilityIdentifier":"CVE-2020-11023","vulnerabilityDetails":"In jQuery versions greater than or equal to 1.0.3 and before 3.5.0, passing HTML containing \u003coption\u003e elements from untrusted sources - even after sanitizing it - to one of jQuery\u0027s DOM manipulation methods (i.e. .html(), .append(), and others) may execute untrusted code. This problem is patched in jQuery 3.5.0.","vulnerabilityUrl":"https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-11023","cvss3Severity":"medium","cvss3Score":"6.1","cvss3Metrics":{"A":"None","AC":"Low","PR":"None","S":"Changed","C":"Low","UI":"Required","AV":"Network","I":"Low"},"extraData":{}}</REMEDIATE> -->
|
non_process
|
| 0
|
4,526
| 7,371,333,719
|
IssuesEvent
|
2018-03-13 11:23:19
|
MicrosoftDocs/azure-docs
|
https://api.github.com/repos/MicrosoftDocs/azure-docs
|
closed
|
Download link not working properly
|
cxp doc-bug in-process security triaged
|
I tried clicking the link in Chrome and there's no download initiated.
I tried clicking it in Edge and there's no apparent action happening but checking the download section in the hub I can see it there but I had to go look for it. Also, the download didn't start for me, just waiting.
I finally opened it in Safari on my iPhone from where I could save the application to my OneDrive.
I could then open that from my computer and install.
---
#### Document Details
⚠ *Do not edit this section. It is required for docs.microsoft.com ➟ GitHub issue linking.*
* ID: 7dea734e-e546-2acd-3a6c-c0a8390f9ee9
* Version Independent ID: 3a4c0ad1-7a2f-9590-9b96-e1285ac9769c
* Content: [Getting Started - Microsoft Threat Modeling Tool - Azure | Microsoft Docs](https://docs.microsoft.com/en-us/azure/security/azure-security-threat-modeling-tool-getting-started)
* Content Source: [articles/security/azure-security-threat-modeling-tool-getting-started.md](https://github.com/Microsoft/azure-docs/blob/master/articles/security/azure-security-threat-modeling-tool-getting-started.md)
* Service: **security**
* GitHub Login: @rodsan
* Microsoft Alias: **rodsan**
|
1.0
|
|
process
|
| 1
|
4,299
| 7,194,265,858
|
IssuesEvent
|
2018-02-04 02:10:32
|
nodejs/node
|
https://api.github.com/repos/nodejs/node
|
closed
|
Can't send signals using signal numbers
|
doc process
|
* **Version**: v9.5.0
* **Platform**: Linux arch 4.14.15-1-ARCH #1 SMP PREEMPT Tue Jan 23 21:49:25 UTC 2018 x86_64 GNU/Linux
* **Subsystem**: process
Using `process.kill` with signal numbers should work according to the documentation, but I get an ERR_UNKNOWN_SIGNAL error when using numbers instead of strings. For example:
$ node
> process.kill(process.pid, "SIGTERM")
Terminated
$ node
> 15 === require("constants").SIGTERM
true
> process.kill(process.pid, 15)
TypeError [ERR_UNKNOWN_SIGNAL]: Unknown signal: 15
at process.kill (internal/process.js:168:15)
|
1.0
|
|
process
|
| 1
|
18,960
| 24,921,093,733
|
IssuesEvent
|
2022-10-31 00:00:00
|
geneontology/go-ontology
|
https://api.github.com/repos/geneontology/go-ontology
|
closed
|
NTR: Retinoic acid-induced apoptosis
|
New term request cellular processes
|
Please provide as much information as you can:
* **Suggested term label:**
Retinoic acid-induced apoptosis
* **Definition (free text)**
Retinoic acid-induced apoptosis
* **Reference, in format PMID:#######**
PMID: 9187263
PMID: 16568081
PMID: 33190441
PMID: 18474417
PMID: 17869331 ; close to suggest a physiological role of ATRA in development
* **Gene product name and ID to be annotated to this term**
BCL-2 [P10415](https://www.uniprot.org/uniprotkb/P10415/entry)
RARG [P13631](https://www.uniprot.org/uniprotkb/P13631/entry)
FLT3 [P36888](https://www.uniprot.org/uniprotkb/P36888/entry)
ERK1 [P27361](https://www.uniprot.org/uniprotkb/P27361/entry) # MAKP3
ERK2 [P28482](https://www.uniprot.org/uniprotkb/P28482/entry) # MAPK1
* **Parent term(s)**
GO:0071300 (cellular response to retinoic acid)
GO:0071887 (leukocyte apoptotic process)
GO:0043065 (positive regulation of apoptotic process)
GO:0048384 (retinoic acid receptor signaling pathway)
* **Children terms (if applicable)** Should any existing terms be moved underneath this new proposed term?
N/A
* **Synonyms (please specify, EXACT, BROAD, NARROW or RELATED)**
Don't know any.
* **Cross-references**
All-trans-retinoic-acid (ATRA):
* https://www.ebi.ac.uk/chebi/searchId.do?chebiId=CHEBI:45376
* https://reactome.org/content/detail/R-ICO-013649
* https://en.wikipedia.org/wiki/Tretinoin
* **Any other information**
This term is of clinical (oncological) relevance. The genes listed are incomplete, not sure if one should separate the induction of apoptosis with retinoids and particular markers of apoptosis when induced by retinoids. The Wikipedia page on https://en.wikipedia.org/wiki/Tretinoin presents an overview on clinical applications.
|
1.0
|
|
process
|
| 1
|
331,616
| 10,075,557,724
|
IssuesEvent
|
2019-07-24 14:29:17
|
mojaloop/project
|
https://api.github.com/repos/mojaloop/project
|
closed
|
Transfer Timeouts are not re-setting positions and status
|
Priority: High bug
|
**Summary**:
Transfer Timeouts are not re-setting positions and status.
Golden Path QA Framework `transfer_timeout` failing:
Cause seems to be due to an error on the Position Handler as follows:
```
2019-07-18T14:24:45.027Z - info: PositionHandler::positions
2019-07-18T14:24:45.028Z - info: PositionHandler::positions::timeout
2019-07-18T14:24:45.029Z - error: Undefined binding(s) detected when compiling FIRST query: select `transferParticipant`.*, `tsc`.`transferStateId`, `tsc`.`reason` from `transferParticipant` inner join `transferStateChange` as `tsc` on `tsc`.`transferId` = `transferParticipant`.`transferId` where `transferParticipant`.`transferId` = ? and `transferParticipant`.`transferParticipantRoleTypeId` = ? and `transferParticipant`.`ledgerEntryTypeId` = ? order by `tsc`.`transferStateChangeId` desc limit ?
2019-07-18T14:24:45.029Z - error: PositionHandler::positions::timeout::Undefined binding(s) detected when compiling FIRST query: select `transferParticipant`.*, `tsc`.`transferStateId`, `tsc`.`reason` from `transferParticipant` inner join `transferStateChange` as `tsc` on `tsc`.`transferId` = `transferParticipant`.`transferId` where `transferParticipant`.`transferId` = ? and `transferParticipant`.`transferParticipantRoleTypeId` = ? and `transferParticipant`.`ledgerEntryTypeId` = ? order by `tsc`.`transferStateChangeId` desc limit ?--0
2019-07-18T14:24:45.029Z - error: Consumer::consume()::syncQueue.queue - error: Error: Undefined binding(s) detected when compiling FIRST query: select `transferParticipant`.*, `tsc`.`transferStateId`, `tsc`.`reason` from `transferParticipant` inner join `transferStateChange` as `tsc` on `tsc`.`transferId` = `transferParticipant`.`transferId` where `transferParticipant`.`transferId` = ? and `transferParticipant`.`transferParticipantRoleTypeId` = ? and `transferParticipant`.`ledgerEntryTypeId` = ? order by `tsc`.`transferStateChangeId` desc limit ?
2019-07-18T14:24:45.029Z - error: Consumer::onError()[topics='topic-transfer-position'] - Error: Undefined binding(s) detected when compiling FIRST query: select `transferParticipant`.*, `tsc`.`transferStateId`, `tsc`.`reason` from `transferParticipant` inner join `transferStateChange` as `tsc` on `tsc`.`transferId` = `transferParticipant`.`transferId` where `transferParticipant`.`transferId` = ? and `transferParticipant`.`transferParticipantRoleTypeId` = ? and `transferParticipant`.`ledgerEntryTypeId` = ? order by `tsc`.`transferStateChangeId` desc limit ?
at QueryCompiler_MySQL.toSQL (/opt/central-ledger/node_modules/@mojaloop/central-services-database/node_modules/knex/lib/query/compiler.js:85:13)
at Builder.toSQL (/opt/central-ledger/node_modules/@mojaloop/central-services-database/node_modules/knex/lib/query/builder.js:72:44)
at /opt/central-ledger/node_modules/@mojaloop/central-services-database/node_modules/knex/lib/runner.js:37:34
at tryCatcher (/opt/central-ledger/node_modules/@mojaloop/central-services-database/node_modules/bluebird/js/release/util.js:16:23)
at /opt/central-ledger/node_modules/@mojaloop/central-services-database/node_modules/bluebird/js/release/using.js:185:26
at tryCatcher (/opt/central-ledger/node_modules/@mojaloop/central-services-database/node_modules/bluebird/js/release/util.js:16:23)
at Promise._settlePromiseFromHandler (/opt/central-ledger/node_modules/@mojaloop/central-services-database/node_modules/bluebird/js/release/promise.js:512:31)
at Promise._settlePromise (/opt/central-ledger/node_modules/@mojaloop/central-services-database/node_modules/bluebird/js/release/promise.js:569:18)
at Promise._settlePromise0 (/opt/central-ledger/node_modules/@mojaloop/central-services-database/node_modules/bluebird/js/release/promise.js:614:10)
at Promise._settlePromises (/opt/central-ledger/node_modules/@mojaloop/central-services-database/node_modules/bluebird/js/release/promise.js:694:18)
at Promise._fulfill (/opt/central-ledger/node_modules/@mojaloop/central-services-database/node_modules/bluebird/js/release/promise.js:638:18)
at PromiseArray._resolve (/opt/central-ledger/node_modules/@mojaloop/central-services-database/node_modules/bluebird/js/release/promise_array.js:126:19)
at PromiseArray._promiseFulfilled (/opt/central-ledger/node_modules/@mojaloop/central-services-database/node_modules/bluebird/js/release/promise_array.js:144:14)
at Promise._settlePromise (/opt/central-ledger/node_modules/@mojaloop/central-services-database/node_modules/bluebird/js/release/promise.js:574:26)
at Promise._settlePromise0 (/opt/central-ledger/node_modules/@mojaloop/central-services-database/node_modules/bluebird/js/release/promise.js:614:10)
at Promise._settlePromises (/opt/central-ledger/node_modules/@mojaloop/central-services-database/node_modules/bluebird/js/release/promise.js:694:18)
at _drainQueueStep (/opt/central-ledger/node_modules/@mojaloop/central-services-database/node_modules/bluebird/js/release/async.js:138:12)
at _drainQueue (/opt/central-ledger/node_modules/@mojaloop/central-services-database/node_modules/bluebird/js/release/async.js:131:9)
at Async._drainQueues (/opt/central-ledger/node_modules/@mojaloop/central-services-database/node_modules/bluebird/js/release/async.js:147:5)
at Immediate.Async.drainQueues [as _onImmediate] (/opt/central-ledger/node_modules/@mojaloop/central-services-database/node_modules/bluebird/js/release/async.js:17:14)
at runCallback (timers.js:705:18)
at tryOnImmediate (timers.js:676:5)
at processImmediate (timers.js:658:5))
2019-07-18T14:24:45.029Z - error: Consumer::consume() - error Error: Undefined binding(s) detected when compiling FIRST query: select `transferParticipant`.*, `tsc`.`transferStateId`, `tsc`.`reason` from `transferParticipant` inner join `transferStateChange` as `tsc` on `tsc`.`transferId` = `transferParticipant`.`transferId` where `transferParticipant`.`transferId` = ? and `transferParticipant`.`transferParticipantRoleTypeId` = ? and `transferParticipant`.`ledgerEntryTypeId` = ? order by `tsc`.`transferStateChangeId` desc limit ?
```
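A minimal sketch (plain JavaScript, not the actual knex internals) of why the missing `content.uriParams.id` produces this error: the placeholder's binding arrives as `undefined`, and the query builder refuses to compile the statement:

```javascript
// Illustration only: when the timeout message lacks content.uriParams.id, the
// query helper receives `undefined` for a placeholder, and compilation fails
// with an "Undefined binding(s) detected" error before any SQL is executed.
function compileQuery(sql, bindings) {
  const missing = bindings
    .map((b, i) => (b === undefined ? i : -1))
    .filter((i) => i !== -1);
  if (missing.length > 0) {
    throw new Error(
      `Undefined binding(s) detected when compiling query at positions: ${missing.join(', ')}`
    );
  }
  return { sql, bindings };
}

try {
  // transferId is undefined, as when the timeout message lacks uriParams.id:
  compileQuery('select * from transferParticipant where transferId = ?', [undefined]);
} catch (e) {
  console.log(e.message); // Undefined binding(s) detected when compiling query at positions: 0
}
```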
**Severity**:
High
**Priority**:
Critical
**Expected Behavior**
**Steps to Reproduce**
1. Store Payerfsp position before prepare:
```CURL
REQUEST:
curl -X GET \
http://dev2-central-ledger.mojaloop.live/participants/payerfsp/positions \
-H 'Authorization: Bearer {{BEARER_TOKEN}}' \
-H 'FSPIOP-Source: payerfsp' \
-H 'Postman-Token: acadede1-3d20-4a87-be28-135582115052' \
-H 'cache-control: no-cache'
RESPONSE:
[
{
"currency": "USD",
"value": 488,
"changedDate": "2019-07-18T13:53:27.000Z"
}
]
```
2. Send prepare
```CURL
curl -X POST \
http://dev2-ml-api-adapter.mojaloop.live/transfers \
-H 'Accept: application/vnd.interoperability.transfers+json;version=1' \
-H 'Authorization: Bearer {{BEARER_TOKEN}}' \
-H 'Content-Type: application/vnd.interoperability.transfers+json;version=1' \
-H 'Date: Thu, 24 Jan 2019 10:22:12 GMT' \
-H 'FSPIOP-Destination: noresponsepayeefsp' \
-H 'FSPIOP-Source: payerfsp' \
-H 'Postman-Token: 751dcac0-0a87-4d81-af1d-ecb7f44d8e2a' \
-H 'cache-control: no-cache' \
-d '{
"transferId": "f999c714-7665-4390-a0ea-4a6767bbb973",
"payerFsp": "payerfsp",
"payeeFsp": "noresponsepayeefsp",
"amount": {
"amount": "10",
"currency": "USD"
},
"expiration": "2019-07-18T14:24:38.091Z",
"ilpPacket": "AQAAAAAAAADIEHByaXZhdGUucGF5ZWVmc3CCAiB7InRyYW5zYWN0aW9uSWQiOiIyZGY3NzRlMi1mMWRiLTRmZjctYTQ5NS0yZGRkMzdhZjdjMmMiLCJxdW90ZUlkIjoiMDNhNjA1NTAtNmYyZi00NTU2LThlMDQtMDcwM2UzOWI4N2ZmIiwicGF5ZWUiOnsicGFydHlJZEluZm8iOnsicGFydHlJZFR5cGUiOiJNU0lTRE4iLCJwYXJ0eUlkZW50aWZpZXIiOiIyNzcxMzgwMzkxMyIsImZzcElkIjoicGF5ZWVmc3AifSwicGVyc29uYWxJbmZvIjp7ImNvbXBsZXhOYW1lIjp7fX19LCJwYXllciI6eyJwYXJ0eUlkSW5mbyI6eyJwYXJ0eUlkVHlwZSI6Ik1TSVNETiIsInBhcnR5SWRlbnRpZmllciI6IjI3NzEzODAzOTExIiwiZnNwSWQiOiJwYXllcmZzcCJ9LCJwZXJzb25hbEluZm8iOnsiY29tcGxleE5hbWUiOnt9fX0sImFtb3VudCI6eyJjdXJyZW5jeSI6IlVTRCIsImFtb3VudCI6IjIwMCJ9LCJ0cmFuc2FjdGlvblR5cGUiOnsic2NlbmFyaW8iOiJERVBPU0lUIiwic3ViU2NlbmFyaW8iOiJERVBPU0lUIiwiaW5pdGlhdG9yIjoiUEFZRVIiLCJpbml0aWF0b3JUeXBlIjoiQ09OU1VNRVIiLCJyZWZ1bmRJbmZvIjp7fX19",
"condition": "nMel-FDPpp3T77jfC11fUXdcy935hy089AJ9v2OTXBI"
}'
```
3. Check position before timeout
```CURL
REQUEST:
curl -X GET \
http://dev2-central-ledger.mojaloop.live/participants/payerfsp/positions \
-H 'Authorization: Bearer {{BEARER_TOKEN}}' \
-H 'FSPIOP-Source: payerfsp' \
-H 'Postman-Token: 2efcca9f-2ed3-4589-9e3f-20ba4d1d6e31' \
-H 'cache-control: no-cache'
RESPONSE:
[
{
"currency": "USD",
"value": 488,
"changedDate": "2019-07-18T13:53:27.000Z"
}
]
```
4. Check Payerfsp position after timeout <-- Fails
```CURL
REQUEST:
curl -X GET \
http://dev2-central-ledger.mojaloop.live/participants/payerfsp/positions \
-H 'FSPIOP-Source: payerfsp' \
-H 'Postman-Token: 4553c55e-9cf9-49f0-8d11-c0a814b3254e' \
-H 'cache-control: no-cache'
RESPONSE:
[
{
"currency": "USD",
"value": 488, <-- Filed due to value being same as step 3. This should now be 478.
"changedDate": "2019-07-18T13:53:27.000Z"
}
]
```
5. Check Transfer status - ABORTED
```
REQUEST:
curl -X GET \
http://dev2-ml-api-adapter.mojaloop.live/transfers/f999c714-7665-4390-a0ea-4a6767bbb973 \
-H 'Accept: application/vnd.interoperability.transfers+json;version=1' \
-H 'Authorization: Bearer {{BEARER_TOKEN}}' \
-H 'Content-Type: application/vnd.interoperability.transfers+json;version=1.0' \
-H 'Date: Thu, 18 Jul 2019 14:24:08 GMT' \
-H 'FSPIOP-Source: payerfsp' \
-H 'Postman-Token: b869606f-bc08-4600-9374-4851272b2f30' \
-H 'cache-control: no-cache'
RESPONSE:
{
completedTimestamp:"2019-07-18T14:24:45.000Z"
transferState:"RESERVED" <-- This failed as it should be "ABORTED"
}
```
**Tasks**
- [x] Add the missing `content.uriParams.id` field to the message in `src/handlers/timeouts/handler.js`
- [x] Add new integration and unit test(s)
- [ ] Create a new story to update the messages to use common/streaming factory method instead of the current ad-hoc approach.
**Pull Request**
- [ ] https://github.com/mojaloop/central-ledger/pull/331/files
**Specifications**
- Component (if known): Central-Ledger Position Handler
- Version: v7.1.2
- Platform: All
- Subsystem: n/a
- Type of testing: QA - Golden Path
- Bug found/raised by: @mdebarros
**Notes**:
- Severity when opened: High
- Priority when opened: Critical
|
1.0
|
Transfer Timeouts are not re-setting positions and status - **Summary**:
Transfer Timeouts are not re-setting positions and status.
Golden Path QA Framework `transfer_timeout` failing:
Cause seems to be due to an error on the Position Handler as follows:
```
2019-07-18T14:24:45.027Z - info: PositionHandler::positions
2019-07-18T14:24:45.028Z - info: PositionHandler::positions::timeout
2019-07-18T14:24:45.029Z - error: Undefined binding(s) detected when compiling FIRST query: select `transferParticipant`.*, `tsc`.`transferStateId`, `tsc`.`reason` from `transferParticipant` inner join `transferStateChange` as `tsc` on `tsc`.`transferId` = `transferParticipant`.`transferId` where `transferParticipant`.`transferId` = ? and `transferParticipant`.`transferParticipantRoleTypeId` = ? and `transferParticipant`.`ledgerEntryTypeId` = ? order by `tsc`.`transferStateChangeId` desc limit ?
2019-07-18T14:24:45.029Z - error: PositionHandler::positions::timeout::Undefined binding(s) detected when compiling FIRST query: select `transferParticipant`.*, `tsc`.`transferStateId`, `tsc`.`reason` from `transferParticipant` inner join `transferStateChange` as `tsc` on `tsc`.`transferId` = `transferParticipant`.`transferId` where `transferParticipant`.`transferId` = ? and `transferParticipant`.`transferParticipantRoleTypeId` = ? and `transferParticipant`.`ledgerEntryTypeId` = ? order by `tsc`.`transferStateChangeId` desc limit ?--0
2019-07-18T14:24:45.029Z - error: Consumer::consume()::syncQueue.queue - error: Error: Undefined binding(s) detected when compiling FIRST query: select `transferParticipant`.*, `tsc`.`transferStateId`, `tsc`.`reason` from `transferParticipant` inner join `transferStateChange` as `tsc` on `tsc`.`transferId` = `transferParticipant`.`transferId` where `transferParticipant`.`transferId` = ? and `transferParticipant`.`transferParticipantRoleTypeId` = ? and `transferParticipant`.`ledgerEntryTypeId` = ? order by `tsc`.`transferStateChangeId` desc limit ?
2019-07-18T14:24:45.029Z - error: Consumer::onError()[topics='topic-transfer-position'] - Error: Undefined binding(s) detected when compiling FIRST query: select `transferParticipant`.*, `tsc`.`transferStateId`, `tsc`.`reason` from `transferParticipant` inner join `transferStateChange` as `tsc` on `tsc`.`transferId` = `transferParticipant`.`transferId` where `transferParticipant`.`transferId` = ? and `transferParticipant`.`transferParticipantRoleTypeId` = ? and `transferParticipant`.`ledgerEntryTypeId` = ? order by `tsc`.`transferStateChangeId` desc limit ?
at QueryCompiler_MySQL.toSQL (/opt/central-ledger/node_modules/@mojaloop/central-services-database/node_modules/knex/lib/query/compiler.js:85:13)
at Builder.toSQL (/opt/central-ledger/node_modules/@mojaloop/central-services-database/node_modules/knex/lib/query/builder.js:72:44)
at /opt/central-ledger/node_modules/@mojaloop/central-services-database/node_modules/knex/lib/runner.js:37:34
at tryCatcher (/opt/central-ledger/node_modules/@mojaloop/central-services-database/node_modules/bluebird/js/release/util.js:16:23)
at /opt/central-ledger/node_modules/@mojaloop/central-services-database/node_modules/bluebird/js/release/using.js:185:26
at tryCatcher (/opt/central-ledger/node_modules/@mojaloop/central-services-database/node_modules/bluebird/js/release/util.js:16:23)
at Promise._settlePromiseFromHandler (/opt/central-ledger/node_modules/@mojaloop/central-services-database/node_modules/bluebird/js/release/promise.js:512:31)
at Promise._settlePromise (/opt/central-ledger/node_modules/@mojaloop/central-services-database/node_modules/bluebird/js/release/promise.js:569:18)
at Promise._settlePromise0 (/opt/central-ledger/node_modules/@mojaloop/central-services-database/node_modules/bluebird/js/release/promise.js:614:10)
at Promise._settlePromises (/opt/central-ledger/node_modules/@mojaloop/central-services-database/node_modules/bluebird/js/release/promise.js:694:18)
at Promise._fulfill (/opt/central-ledger/node_modules/@mojaloop/central-services-database/node_modules/bluebird/js/release/promise.js:638:18)
at PromiseArray._resolve (/opt/central-ledger/node_modules/@mojaloop/central-services-database/node_modules/bluebird/js/release/promise_array.js:126:19)
at PromiseArray._promiseFulfilled (/opt/central-ledger/node_modules/@mojaloop/central-services-database/node_modules/bluebird/js/release/promise_array.js:144:14)
at Promise._settlePromise (/opt/central-ledger/node_modules/@mojaloop/central-services-database/node_modules/bluebird/js/release/promise.js:574:26)
at Promise._settlePromise0 (/opt/central-ledger/node_modules/@mojaloop/central-services-database/node_modules/bluebird/js/release/promise.js:614:10)
at Promise._settlePromises (/opt/central-ledger/node_modules/@mojaloop/central-services-database/node_modules/bluebird/js/release/promise.js:694:18)
at _drainQueueStep (/opt/central-ledger/node_modules/@mojaloop/central-services-database/node_modules/bluebird/js/release/async.js:138:12)
at _drainQueue (/opt/central-ledger/node_modules/@mojaloop/central-services-database/node_modules/bluebird/js/release/async.js:131:9)
at Async._drainQueues (/opt/central-ledger/node_modules/@mojaloop/central-services-database/node_modules/bluebird/js/release/async.js:147:5)
at Immediate.Async.drainQueues [as _onImmediate] (/opt/central-ledger/node_modules/@mojaloop/central-services-database/node_modules/bluebird/js/release/async.js:17:14)
at runCallback (timers.js:705:18)
at tryOnImmediate (timers.js:676:5)
at processImmediate (timers.js:658:5))
2019-07-18T14:24:45.029Z - error: Consumer::consume() - error Error: Undefined binding(s) detected when compiling FIRST query: select `transferParticipant`.*, `tsc`.`transferStateId`, `tsc`.`reason` from `transferParticipant` inner join `transferStateChange` as `tsc` on `tsc`.`transferId` = `transferParticipant`.`transferId` where `transferParticipant`.`transferId` = ? and `transferParticipant`.`transferParticipantRoleTypeId` = ? and `transferParticipant`.`ledgerEntryTypeId` = ? order by `tsc`.`transferStateChangeId` desc limit ?
```
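The root cause noted in the task list is a missing `content.uriParams.id` on the timeout message, which left the `transferId` binding undefined when the position handler built its query. The following plain-JavaScript sketch (hypothetical names and message shape; not the actual central-ledger or knex code) shows the fail-fast guard such a fix implies:

```javascript
// Sketch: resolve and validate query bindings before handing them to a
// query builder. The message shape mirrors the bug report; the function
// name is illustrative, not the real handler API.
function buildTransferQueryBindings(message) {
  // The reported bug: content.uriParams was absent, so transferId was
  // undefined and knex raised "Undefined binding(s) detected when
  // compiling FIRST query".
  const transferId = message?.content?.uriParams?.id;
  if (transferId === undefined) {
    throw new Error('transferId binding is undefined: message.content.uriParams.id is missing');
  }
  return { transferId };
}

// A well-formed timeout message resolves the binding...
const ok = buildTransferQueryBindings({ content: { uriParams: { id: 'f999c714' } } });
console.log(ok.transferId); // prints "f999c714"

// ...while the broken message shape now fails fast with a clear error.
try {
  buildTransferQueryBindings({ content: {} });
} catch (e) {
  console.log(e.message);
}
```

Failing before the query builder runs turns the opaque knex compile error into a message that names the missing field.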
**Severity**:
High
**Priority**:
Critical
**Expected Behavior**
**Steps to Reproduce**
1. Store Payerfsp position before prepare:
```CURL
REQUEST:
curl -X GET \
http://dev2-central-ledger.mojaloop.live/participants/payerfsp/positions \
-H 'Authorization: Bearer {{BEARER_TOKEN}}' \
-H 'FSPIOP-Source: payerfsp' \
-H 'Postman-Token: acadede1-3d20-4a87-be28-135582115052' \
-H 'cache-control: no-cache'
RESPONSE:
[
{
"currency": "USD",
"value": 488,
"changedDate": "2019-07-18T13:53:27.000Z"
}
]
```
2. Send prepare
```CURL
curl -X POST \
http://dev2-ml-api-adapter.mojaloop.live/transfers \
-H 'Accept: application/vnd.interoperability.transfers+json;version=1' \
-H 'Authorization: Bearer {{BEARER_TOKEN}}' \
-H 'Content-Type: application/vnd.interoperability.transfers+json;version=1' \
-H 'Date: Thu, 24 Jan 2019 10:22:12 GMT' \
-H 'FSPIOP-Destination: noresponsepayeefsp' \
-H 'FSPIOP-Source: payerfsp' \
-H 'Postman-Token: 751dcac0-0a87-4d81-af1d-ecb7f44d8e2a' \
-H 'cache-control: no-cache' \
-d '{
"transferId": "f999c714-7665-4390-a0ea-4a6767bbb973",
"payerFsp": "payerfsp",
"payeeFsp": "noresponsepayeefsp",
"amount": {
"amount": "10",
"currency": "USD"
},
"expiration": "2019-07-18T14:24:38.091Z",
"ilpPacket": "AQAAAAAAAADIEHByaXZhdGUucGF5ZWVmc3CCAiB7InRyYW5zYWN0aW9uSWQiOiIyZGY3NzRlMi1mMWRiLTRmZjctYTQ5NS0yZGRkMzdhZjdjMmMiLCJxdW90ZUlkIjoiMDNhNjA1NTAtNmYyZi00NTU2LThlMDQtMDcwM2UzOWI4N2ZmIiwicGF5ZWUiOnsicGFydHlJZEluZm8iOnsicGFydHlJZFR5cGUiOiJNU0lTRE4iLCJwYXJ0eUlkZW50aWZpZXIiOiIyNzcxMzgwMzkxMyIsImZzcElkIjoicGF5ZWVmc3AifSwicGVyc29uYWxJbmZvIjp7ImNvbXBsZXhOYW1lIjp7fX19LCJwYXllciI6eyJwYXJ0eUlkSW5mbyI6eyJwYXJ0eUlkVHlwZSI6Ik1TSVNETiIsInBhcnR5SWRlbnRpZmllciI6IjI3NzEzODAzOTExIiwiZnNwSWQiOiJwYXllcmZzcCJ9LCJwZXJzb25hbEluZm8iOnsiY29tcGxleE5hbWUiOnt9fX0sImFtb3VudCI6eyJjdXJyZW5jeSI6IlVTRCIsImFtb3VudCI6IjIwMCJ9LCJ0cmFuc2FjdGlvblR5cGUiOnsic2NlbmFyaW8iOiJERVBPU0lUIiwic3ViU2NlbmFyaW8iOiJERVBPU0lUIiwiaW5pdGlhdG9yIjoiUEFZRVIiLCJpbml0aWF0b3JUeXBlIjoiQ09OU1VNRVIiLCJyZWZ1bmRJbmZvIjp7fX19",
"condition": "nMel-FDPpp3T77jfC11fUXdcy935hy089AJ9v2OTXBI"
}'
```
3. Check position before timeout
```CURL
REQUEST:
curl -X GET \
http://dev2-central-ledger.mojaloop.live/participants/payerfsp/positions \
-H 'Authorization: Bearer {{BEARER_TOKEN}}' \
-H 'FSPIOP-Source: payerfsp' \
-H 'Postman-Token: 2efcca9f-2ed3-4589-9e3f-20ba4d1d6e31' \
-H 'cache-control: no-cache'
RESPONSE:
[
{
"currency": "USD",
"value": 488,
"changedDate": "2019-07-18T13:53:27.000Z"
}
]
```
4. Check Payerfsp position after timeout <-- Fails
```CURL
REQUEST:
curl -X GET \
http://dev2-central-ledger.mojaloop.live/participants/payerfsp/positions \
-H 'FSPIOP-Source: payerfsp' \
-H 'Postman-Token: 4553c55e-9cf9-49f0-8d11-c0a814b3254e' \
-H 'cache-control: no-cache'
RESPONSE:
[
{
"currency": "USD",
"value": 488, <-- Filed due to value being same as step 3. This should now be 478.
"changedDate": "2019-07-18T13:53:27.000Z"
}
]
```
5. Check Transfer status - ABORTED
```
REQUEST:
curl -X GET \
http://dev2-ml-api-adapter.mojaloop.live/transfers/f999c714-7665-4390-a0ea-4a6767bbb973 \
-H 'Accept: application/vnd.interoperability.transfers+json;version=1' \
-H 'Authorization: Bearer {{BEARER_TOKEN}}' \
-H 'Content-Type: application/vnd.interoperability.transfers+json;version=1.0' \
-H 'Date: Thu, 18 Jul 2019 14:24:08 GMT' \
-H 'FSPIOP-Source: payerfsp' \
-H 'Postman-Token: b869606f-bc08-4600-9374-4851272b2f30' \
-H 'cache-control: no-cache'
RESPONSE:
{
completedTimestamp:"2019-07-18T14:24:45.000Z"
transferState:"RESERVED" <-- This failed as it should be "ABORTED"
}
```
**Tasks**
- [x] Add the missing `content.uriParams.id` field to the message in `src/handlers/timeouts/handler.js`
- [x] Add new integration and unit test(s)
- [ ] Create a new story to update the messages to use common/streaming factory method instead of the current ad-hoc approach.
**Pull Request**
- [ ] https://github.com/mojaloop/central-ledger/pull/331/files
**Specifications**
- Component (if known): Central-Ledger Position Handler
- Version: v7.1.2
- Platform: All
- Subsystem: n/a
- Type of testing: QA - Golden Path
- Bug found/raised by: @mdebarros
**Notes**:
- Severity when opened: High
- Priority when opened: Critical
|
non_process
|
transfer timeouts are not re setting positions and status summary transfer timeouts are not re setting positions and status golden path qa framework transfer timeout failing cause seems to be due to an error on the position handler as follows info positionhandler positions info positionhandler positions timeout error undefined binding s detected when compiling first query select transferparticipant tsc transferstateid tsc reason from transferparticipant inner join transferstatechange as tsc on tsc transferid transferparticipant transferid where transferparticipant transferid and transferparticipant transferparticipantroletypeid and transferparticipant ledgerentrytypeid order by tsc transferstatechangeid desc limit error positionhandler positions timeout undefined binding s detected when compiling first query select transferparticipant tsc transferstateid tsc reason from transferparticipant inner join transferstatechange as tsc on tsc transferid transferparticipant transferid where transferparticipant transferid and transferparticipant transferparticipantroletypeid and transferparticipant ledgerentrytypeid order by tsc transferstatechangeid desc limit error consumer consume syncqueue queue error error undefined binding s detected when compiling first query select transferparticipant tsc transferstateid tsc reason from transferparticipant inner join transferstatechange as tsc on tsc transferid transferparticipant transferid where transferparticipant transferid and transferparticipant transferparticipantroletypeid and transferparticipant ledgerentrytypeid order by tsc transferstatechangeid desc limit error consumer onerror error undefined binding s detected when compiling first query select transferparticipant tsc transferstateid tsc reason from transferparticipant inner join transferstatechange as tsc on tsc transferid transferparticipant transferid where transferparticipant transferid and transferparticipant transferparticipantroletypeid and transferparticipant 
ledgerentrytypeid order by tsc transferstatechangeid desc limit at querycompiler mysql tosql opt central ledger node modules mojaloop central services database node modules knex lib query compiler js at builder tosql opt central ledger node modules mojaloop central services database node modules knex lib query builder js at opt central ledger node modules mojaloop central services database node modules knex lib runner js at trycatcher opt central ledger node modules mojaloop central services database node modules bluebird js release util js at opt central ledger node modules mojaloop central services database node modules bluebird js release using js at trycatcher opt central ledger node modules mojaloop central services database node modules bluebird js release util js at promise settlepromisefromhandler opt central ledger node modules mojaloop central services database node modules bluebird js release promise js at promise settlepromise opt central ledger node modules mojaloop central services database node modules bluebird js release promise js at promise opt central ledger node modules mojaloop central services database node modules bluebird js release promise js at promise settlepromises opt central ledger node modules mojaloop central services database node modules bluebird js release promise js at promise fulfill opt central ledger node modules mojaloop central services database node modules bluebird js release promise js at promisearray resolve opt central ledger node modules mojaloop central services database node modules bluebird js release promise array js at promisearray promisefulfilled opt central ledger node modules mojaloop central services database node modules bluebird js release promise array js at promise settlepromise opt central ledger node modules mojaloop central services database node modules bluebird js release promise js at promise opt central ledger node modules mojaloop central services database node modules bluebird js release promise 
js at promise settlepromises opt central ledger node modules mojaloop central services database node modules bluebird js release promise js at drainqueuestep opt central ledger node modules mojaloop central services database node modules bluebird js release async js at drainqueue opt central ledger node modules mojaloop central services database node modules bluebird js release async js at async drainqueues opt central ledger node modules mojaloop central services database node modules bluebird js release async js at immediate async drainqueues opt central ledger node modules mojaloop central services database node modules bluebird js release async js at runcallback timers js at tryonimmediate timers js at processimmediate timers js error consumer consume error error undefined binding s detected when compiling first query select transferparticipant tsc transferstateid tsc reason from transferparticipant inner join transferstatechange as tsc on tsc transferid transferparticipant transferid where transferparticipant transferid and transferparticipant transferparticipantroletypeid and transferparticipant ledgerentrytypeid order by tsc transferstatechangeid desc limit severity high priority critical expected behavior steps to reproduce store payerfsp position before prepare curl request curl x get h authorization bearer bearer token h fspiop source payerfsp h postman token h cache control no cache response currency usd value changeddate send prepare curl curl x post h accept application vnd interoperability transfers json version h authorization bearer bearer token h content type application vnd interoperability transfers json version h date thu jan gmt h fspiop destination noresponsepayeefsp h fspiop source payerfsp h postman token h cache control no cache d transferid payerfsp payerfsp payeefsp noresponsepayeefsp amount amount currency usd expiration ilppacket condition nmel check position before timeout curl request curl x get h authorization bearer bearer token h 
fspiop source payerfsp h postman token h cache control no cache response currency usd value changeddate check payerfsp position after timeout fails curl request curl x get h fspiop source payerfsp h postman token h cache control no cache response currency usd value filed due to value being same as step this should now be changeddate check transfer status aborted request curl x get h accept application vnd interoperability transfers json version h authorization bearer bearer token h content type application vnd interoperability transfers json version h date thu jul gmt h fspiop source payerfsp h postman token h cache control no cache response completedtimestamp transferstate reserved this failed as it should be aborted tasks add the missing content uriparams id field to the message in src handlers timeouts handler js add new integration and unit test s create a new story to update the messages to use common streaming factory method instead of the current ad hoc approach pull request specifications component if known central ledger position handler version platform all subsystem n a type of testing qa golden path bug found raised by mdebarros notes severity when opened high priority when opened critical
| 0
|
227,697
| 17,396,489,602
|
IssuesEvent
|
2021-08-02 14:03:45
|
echobind/bisonapp
|
https://api.github.com/repos/echobind/bisonapp
|
closed
|
Heroku Deploy Docs
|
deployment documentation heroku
|
Instead of maintaining/integrating with Heroku's CLI, as a first pass opt to:
- [ ] add a Procfile (and any other config docs)
- [ ] add `docs/heroku-setup.md`
- [ ] If Heroku selected - Point the user to the generated markdown post-install. (docs/heroku-setup.md)
Note: this same pattern will be used for other deployment strategies such as `Render`, `Railway`, `Vercel` as to not make assumptions on deploy configurations.
Closes #138
Closes #139
Closes #49
|
1.0
|
Heroku Deploy Docs - Instead of maintaining/integrating with Heroku's CLI, as a first pass opt to:
- [ ] add a Procfile (and any other config docs)
- [ ] add `docs/heroku-setup.md`
- [ ] If Heroku selected - Point the user to the generated markdown post-install. (docs/heroku-setup.md)
Note: this same pattern will be used for other deployment strategies such as `Render`, `Railway`, `Vercel` as to not make assumptions on deploy configurations.
Closes #138
Closes #139
Closes #49
|
non_process
|
heroku deploy docs instead of maintaining integrating with heroku s cli as a first pass opt to add a procfile and any other config docs add docs heroku setup md if heroku selected point the user to the generated markdown post install docs heroku setup md note this same pattern will be used for other deployment strategies such as render railway vercel as to not make assumptions on deploy configurations closes closes closes
| 0
|
95,688
| 3,955,175,046
|
IssuesEvent
|
2016-04-29 19:51:37
|
BugBusterSWE/documentation
|
https://api.github.com/repos/BugBusterSWE/documentation
|
closed
|
Write main scenarios
|
Analist priority:medium usecase
|
Document where the problem is located:
Activity #302
Requirements Analysis
Problem description:
Write the missing main scenarios
Link task: [https://bugbusters.teamwork.com/tasks/6461256](https://bugbusters.teamwork.com/tasks/6461256)
|
1.0
|
Write main scenarios - Document where the problem is located:
Activity #302
Requirements Analysis
Problem description:
Write the missing main scenarios
Link task: [https://bugbusters.teamwork.com/tasks/6461256](https://bugbusters.teamwork.com/tasks/6461256)
|
non_process
|
write main scenarios document where the problem is located activity requirements analysis problem description write the missing main scenarios link task
| 0
|
16,536
| 21,563,719,906
|
IssuesEvent
|
2022-05-01 14:55:48
|
jgraley/inferno-cpp2v
|
https://api.github.com/repos/jgraley/inferno-cpp2v
|
closed
|
CSP test
|
Testing Constraint Processing
|
### The testing problem
We need to be able to "zoom in" on incorrect output from variations of the CSP solver class. There are 2 cases:
- Wrong solution generated. Since a reference solver generates all correct solutions, this will be "extra to" the ones the reference solver generates. However, it can be detected more easily as a solution that does not pass a reference `ConsistencyCheck()` with all the variables. Also, this case is likely to be rare, since `ConsistencyCheck()` would have to break or not get called. However, we can still check for this quite easily.
- Right solution not generated. This is trickier, since a large number of partial solutions are generated and then rejected. I think the thing to do is to identify when a partial solution _of a good, complete solution_ is rejected. See the Theory below.
In either of these cases, a fault would be identified quite soon after things have gone wrong, and the state of the solver is likely to prove informative. We can dump this out in an ASSERT failure (and maybe allow further probing using gdb).
### Architectural
We can define the `CSP::Solver` instance as the _MUT_ (Module Under Test). Then, we're relying on the `Constraint` class(es) to be correct since we can't really stub them, but I think they're quite stable now. The whole test should run inside the CSP's coroutine (unsure where we got to on stack backtraces, but surely gdb can fiddle about with the local stack frames...?), so we're going to insert everything at the `SolverHolder`->`Solver` boundary.
So:
- We need a reference solver
- This is just the `SimpleSolver` renamed to `ReferenceSolver` and with the backtracking stuff turned off.
- The backtracking solver will have to be accomplished via a subclass of the `ReferenceSolver` that customizes. This may mean lots of vcalls.
- Want to leave `SolverHolder` alone for the most part, so we need a `SolverTestHarness` that also implements the `Solver` interface
- It creates (and controls access to) a `ReferenceSolver` and the MUT solver class we're testing
- Which means we have to drop `ReportageObserver` and use a lambda, to avoid the ambiguity when observing multiple things.
 - When asked to perform a solve, it will drive the MUT once all the way through as normal, but
- To detect wrong solutions, it will "double check" all MUT solutions using the reference's `ConsistencyCheck()` method, made available by side path, friending or something.
- To detect missing good solutions is harder
- `Solver` interface should be extended with a lambda for reporting rejected partial assignments. These should _not_ include the believed-bad assignment i.e. they should be consistent partial assignments.
- `SolverTestWrapper` also has to run the ref solver to completion at the start and store all of the solutions in an ideally-ordered container, _the ref solution set_.
- It may be OK to stop early - we only really need enough to catch the error
- `SolverTestWrapper` should act on rejections coming from MUT, by checking to see if there are any solutions in the ref set that _contain_ the rejected partial assignment. By the theory, this is a fail. We can stop here and dump state of the MUT solver.
### The theory
The theory is simply that once the MUT rejects a partial assignment Ap, it will increment one of the variables to its next value. I believe the value orderings are consistent enough that once this has happened, the MUT should not generate any solutions including Ap. So if we have _any_ solution Ac from the ref solver, and Ac includes Ap, then the MUT will never generate Ac, even though it should.
There is the issue that MUT rejects Ap but then _does_ somehow come back to it and generate a solution Ac2 that includes it. Storing all the Ap will be a nightmare as there will be very many more of these than solutions. One possibility is to run MUT twice, first time to collect all _its_ solutions. But I think not on this ticket.
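The containment check the theory relies on can be sketched directly (illustrative JavaScript, assuming assignments are plain variable-to-value maps rather than the real `CSP::Solver` types):

```javascript
// A partial assignment Ap is contained in a complete solution Ac when every
// variable Ap assigns has the same value in Ac.
function contains(ac, ap) {
  return Object.entries(ap).every(([variable, value]) => ac[variable] === value);
}

// On each rejection reported by the MUT, return the reference solutions that
// still contain the rejected (consistent) partial assignment; any hit means
// the MUT can no longer reach that solution, so the test should fail there
// and dump the MUT solver's state.
function checkRejection(refSolutions, rejectedPartial) {
  return refSolutions.filter(ac => contains(ac, rejectedPartial));
}

// Example: the reference solver produced two solutions...
const refSolutions = [{ x: 1, y: 2, z: 3 }, { x: 1, y: 3, z: 2 }];
// ...and the MUT rejects the partial assignment {x: 1, y: 2}.
const offending = checkRejection(refSolutions, { x: 1, y: 2 });
console.log(offending.length); // prints 1: {x: 1, y: 2, z: 3} is now unreachable
```

Because containment is checked per rejection, no Ap ever needs to be stored; only the (much smaller) reference solution set is kept.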
|
1.0
|
CSP test - ### The testing problem
We need to be able to "zoom in" on incorrect output from variations of the CSP solver class. There are 2 cases:
- Wrong solution generated. Since a reference solver generates all correct solutions, this will be "extra to" the ones the reference solver generates. However, it can be detected more easily as a solution that does not pass a reference `ConsistencyCheck()` with all the variables. Also, this case is likely to be rare, since `ConsistencyCheck()` would have to break or not get called. However, we can still check for this quite easily.
- Right solution not generated. This is trickier, since a large number of partial solutions are generated and then rejected. I think the thing to do is to identify when a partial solution _of a good, complete solution_ is rejected. See the Theory below.
In either of these cases, a fault would be identified quite soon after things have gone wrong, and the state of the solver is likely to prove informative. We can dump this out in an ASSERT failure (and maybe allow further probing using gdb).
### Architectural
We can define the `CSP::Solver` instance as the _MUT_ (Module Under Test). Then, we're relying on the `Constraint` class(es) to be correct since we can't really stub them, but I think they're quite stable now. The whole test should run inside the CSP's coroutine (unsure where we got to on stack backtraces, but surely gdb can fiddle about with the local stack frames...?), so we're going to insert everything at the `SolverHolder`->`Solver` boundary.
So:
- We need a reference solver
- This is just the `SimpleSolver` renamed to `ReferenceSolver` and with the backtracking stuff turned off.
- The backtracking solver will have to be accomplished via a subclass of the `ReferenceSolver` that customizes. This may mean lots of vcalls.
- Want to leave `SolverHolder` alone for the most part, so we need a `SolverTestHarness` that also implements the `Solver` interface
- It creates (and controls access to) a `ReferenceSolver` and the MUT solver class we're testing
- Which means we have to drop `ReportageObserver` and use a lambda, to avoid the ambiguity when observing multiple things.
 - When asked to perform a solve, it will drive the MUT once all the way through as normal, but
- To detect wrong solutions, it will "double check" all MUT solutions using the reference's `ConsistencyCheck()` method, made available by side path, friending or something.
- To detect missing good solutions is harder
- `Solver` interface should be extended with a lambda for reporting rejected partial assignments. These should _not_ include the believed-bad assignment i.e. they should be consistent partial assignments.
- `SolverTestWrapper` also has to run the ref solver to completion at the start and store all of the solutions in an ideally-ordered container, _the ref solution set_.
- It may be OK to stop early - we only really need enough to catch the error
- `SolverTestWrapper` should act on rejections coming from MUT, by checking to see if there are any solutions in the ref set that _contain_ the rejected partial assignment. By the theory, this is a fail. We can stop here and dump state of the MUT solver.
### The theory
The theory is simply that once the MUT rejects a partial assignment Ap, it will increment one of the variables to its next value. I believe the value orderings are consistent enough that once this has happened, the MUT should not generate any solutions including Ap. So if we have _any_ solution Ac from the ref solver, and Ac includes Ap, then the MUT will never generate Ac, even though it should.
There is the issue that MUT rejects Ap but then _does_ somehow come back to it and generate a solution Ac2 that includes it. Storing all the Ap will be a nightmare as there will be very many more of these than solutions. One possibility is to run MUT twice, first time to collect all _its_ solutions. But I think not on this ticket.
|
process
|
csp test the testing problem we need to be able to zoom in on incorrect output from variations of the csp solver class there are cases wrong solution generated since a reference solver generates all correct solutions this will be extra to the ones the reference solver generates however it can be detected more easily as a solution that does not pass a reference consistencycheck with all the variables also this case is likely to be rare since consistencycheck would have to break or not get called however we can still check for this quite easily right solution not generated this is trickier since a large number of partial solutions are generated and then rejected i think the thing to do is to identify when a partial solution of a good complete solution is rejected see the theory below in either of these cases a fault would be identified quite soon after things have gone wrong and the state of the solver is likely to prove informative we can dump this out in an assert failure and maybe allow further probing using gdb architectural we can define the csp solver instance as the mut module under test then we re relying on the constraint class s to be correct since we can t really stub them but i think they re quite stable now the whole test should run inside the csp s coroutine unsure where we got to on stack backtraces but surely gdb can fiddle about with the local stack frames so we re going to insert everything at the sovlerholder solver boundary so we need a reference solver this is just the simplesolver renamed to referencesolver and with the backtracking stuff turned off the backtracking solver will have to be accomplished via a subclass of the referencesolver that customizes this may mean lots of vcalls want to leave solverholder alone for the most part so we need a solvertestharness that also implements the solver interface it creates and controls access to a referencesolver and the mut solver class we re testing which means we have to drop reportageobserver and 
use a lambda to avoid the ambiguity when observing multiple things when asked to perform a solve it will drive the mut once ll the way through as normal but to detect wrong solutions it will double check all mut solutions using the reference s consistencycheck method made available by side path friending or something to detect missing good solutions is harder solver interface should be extended with a lambda for reporting rejected partial assignments these should not include the believed bad assignment i e they should be consistent partial assignments solvertestwrapper also has to run the ref solver to completion at the start and store all of the solutions in an ideally ordered container the ref solution set it may be ok to stop early we only really need enough to catch the error solvertestwrapper should act on rejections coming from mut by checking to see if there are any solutions in the ref set that contain the rejected partial assignment by the theory this is a fail we can stop here and dump state of the mut solver the theory the theory is simply that once the mut rejects a partial assignment ap it will increment one of the variables to its next value i believe the value orderings are consistent enough that once this has happened the mut should not generate any solutions including ap so if we have any solution ac from the ref solver and ac include ap then the mut will never generate ac even though it should there is the issue that mut rejects ap but then does somehow come back to it and generate a solution that includes it storing all the ap will be a nightmare as there will be very many more of these than solutions one possibility is to run mut twice first time to collect all it s solutions but i think not on this ticket
| 1
|
444,756
| 31,145,336,029
|
IssuesEvent
|
2023-08-16 05:50:53
|
dualra1n/dualra1n
|
https://api.github.com/repos/dualra1n/dualra1n
|
closed
|
Deepsleep fix uses a lot of power
|
bug documentation enhancement
|
Hello author, I am an iPad user and I have a problem with deep sleep. After applying the Deepsleep fix or Fiona, the problem was solved, but when my iPad goes into standby mode it causes huge power consumption: over the course of one night, my iPad's battery dropped from 92 to 68. Is there any solution? Thanks!
By the way, my device is an iPad mini 4 (Wi-Fi) dual-booting iOS 14.0.
|
1.0
|
Deepsleep fix uses a lot of power - Hello author, I am an iPad user and I have a problem with deep sleep. After applying the Deepsleep fix or Fiona, the problem was solved, but when my iPad goes into standby mode it causes huge power consumption: over the course of one night, my iPad's battery dropped from 92 to 68. Is there any solution? Thanks!
By the way, my device is an iPad mini 4 (Wi-Fi) dual-booting iOS 14.0.
|
non_process
|
deepsleep fix uses a lot of power hello author i am an ipad user and i have a problem with deepsleep after using deepsleep fix or fiona although the problem was solved but when my ipad goes into standby mode it caused a huge power consumption over the course of one night my ipad s battery dropped from to may i ask if there is any solution thx by the way my device is ipad wifi with dual boot
| 0
|
36,560
| 6,539,999,030
|
IssuesEvent
|
2017-09-01 13:50:02
|
umuc-cs/umuc-cs-slack
|
https://api.github.com/repos/umuc-cs/umuc-cs-slack
|
closed
|
Documentation Needed: README.md
|
difficulty/easy documentation
|
We need a README.md. Should include basic information on the group and reference the CONTRIBUTING.md for instructions for proper protocol for setting up and developing the application.
|
1.0
|
Documentation Needed: README.md - We need a README.md. Should include basic information on the group and reference the CONTRIBUTING.md for instructions for proper protocol for setting up and developing the application.
|
non_process
|
documentation needed readme md we need a readme md should include basic information on the group and reference the contributing md for instructions for proper protocol for setting up and developing the application
| 0
|
769,701
| 27,016,818,788
|
IssuesEvent
|
2023-02-10 20:14:16
|
tallyhowallet/extension
|
https://api.github.com/repos/tallyhowallet/extension
|
closed
|
Unable to connect to yearn
|
Type: Bug Status: Pending Priority: Medium
|
### Discord Discussion Link
_No response_
### What browsers are you seeing the problem on?
Chrome
### What were you trying to do?
Connect Tally to yearn!
### What did not work?
When Tally is enabled as default wallet... I'm not able to connect with Tally; rather I get a notice re: mm
<img width="1331" alt="Screen Shot 2022-02-23 at 4 31 02 PM" src="https://user-images.githubusercontent.com/7005061/155411994-e1717b7f-07c5-4ca7-95e4-4ac3f4fd1650.png">
https://user-images.githubusercontent.com/7005061/155412004-12fdd36f-b116-4665-95a3-5d6496ec9214.mov
### Version
_No response_
### Relevant log output
_No response_
|
1.0
|
Unable to connect to yearn - ### Discord Discussion Link
_No response_
### What browsers are you seeing the problem on?
Chrome
### What were you trying to do?
Connect Tally to yearn!
### What did not work?
When Tally is enabled as default wallet... I'm not able to connect with Tally; rather I get a notice re: mm
<img width="1331" alt="Screen Shot 2022-02-23 at 4 31 02 PM" src="https://user-images.githubusercontent.com/7005061/155411994-e1717b7f-07c5-4ca7-95e4-4ac3f4fd1650.png">
https://user-images.githubusercontent.com/7005061/155412004-12fdd36f-b116-4665-95a3-5d6496ec9214.mov
### Version
_No response_
### Relevant log output
_No response_
|
non_process
|
unable to connect to yearn discord discussion link no response what browsers are you seeing the problem on chrome what were you trying to do connect tally to yearn what did not work when tally is enabled as default wallet i m not able to connect with tally rather i get a notice re mm img width alt screen shot at pm src version no response relevant log output no response
| 0
|
316,848
| 9,658,071,711
|
IssuesEvent
|
2019-05-20 10:04:02
|
bbaumgartl/test
|
https://api.github.com/repos/bbaumgartl/test
|
closed
|
showSelect preselects multiple entrys in selectfield
|
Priority: Should have Status: Resolved Tracker: Bug
|
---
Author Name: **Jonas Götze** (Jonas Götze)
Original Redmine Issue: 10382, https://forge.typo3.org/issues/10382
Original Date: 2010-10-21
Original Assignee: Bernhard Baumgartl, datamints GmbH
---
Hi,
in class.tx_datamintsfeuser_pi1.php Line 1701
@$selected = (strpos($arrCurrentData[$fieldName], $row['uid']) !== false || in_array($row['uid'], $arrCurrentData[$fieldName])) ? ' selected="selected"' : '';@
the strpos-check sometimes results in multiple (and therefore wrong) options marked as selected.
In my case I have the following values for example
@$arrCurrentData[$fieldName] = 26
$row['uid'] = 2@
Which will become true as
@strpos('26','2'); // returns 1@
Though this option should not be marked as selected as it is not the correct one.
So while the correct option with
@$row['uid'] = 26@
is not the last one, something wrong will be selected in the output.
Why are you checking with both, strpos and with in_array()?
Regards
Jonas
|
1.0
|
showSelect preselects multiple entrys in selectfield - ---
Author Name: **Jonas Götze** (Jonas Götze)
Original Redmine Issue: 10382, https://forge.typo3.org/issues/10382
Original Date: 2010-10-21
Original Assignee: Bernhard Baumgartl, datamints GmbH
---
Hi,
in class.tx_datamintsfeuser_pi1.php Line 1701
@$selected = (strpos($arrCurrentData[$fieldName], $row['uid']) !== false || in_array($row['uid'], $arrCurrentData[$fieldName])) ? ' selected="selected"' : '';@
the strpos-check sometimes results in multiple (and therefore wrong) options marked as selected.
In my case I have the following values for example
@$arrCurrentData[$fieldName] = 26
$row['uid'] = 2@
Which will become true as
@strpos('26','2'); // returns 1@
Though this option should not be marked as selected as it is not the correct one.
So while the correct option with
@$row['uid'] = 26@
is not the last one, something wrong will be selected in the output.
Why are you checking with both, strpos and with in_array()?
Regards
Jonas
|
non_process
|
showselect preselects multiple entrys in selectfield author name jonas götze jonas götze original redmine issue original date original assignee bernhard baumgartl datamints gmbh hi in class tx datamintsfeuser php line selected strpos arrcurrentdata row false in array row arrcurrentdata selected selected the strpos check sometimes results in multiple and therefore wrong options marked as selected in my case i have the following values for example arrcurrentdata row which will become true as strpos returns though this option should not be marked as selected as it is not the correct one so while the correct option with row is not the last one something wrong will be selected in the output why are you checking with both strpos and with in array regards jonas
| 0
|
742,041
| 25,833,954,607
|
IssuesEvent
|
2022-12-12 18:06:02
|
VDP-noclip/noclip
|
https://api.github.com/repos/VDP-noclip/noclip
|
closed
|
Jump: less punitive and more forgiving!
|
high priority player urgent
|
After reading the first feedback, it's clear that the jump has a little problem (not that one related to the physics): it is "not that sensitive". I think this means that when the player makes a series of jumps it's hard to press SPACE right after the landing.
If i press SPACE just one frame before landing I don't deserve to be punished :(
I talked about this issue right before the demo was released with @stefanofossati
Possible fixes to this problem:
- make the game receive the jump input after AND before the landing (this is called **_jump input buffering_**).
- make the jump possibile also outside the platform (**_coyote time_**)
These things should make the avatar more responsive and under control.
|
1.0
|
Jump: less punitive and more forgiving! - After reading the first feedback, it's clear that the jump has a little problem (not that one related to the physics): it is "not that sensitive". I think this means that when the player makes a series of jumps it's hard to press SPACE right after the landing.
If i press SPACE just one frame before landing I don't deserve to be punished :(
I talked about this issue right before the demo was released with @stefanofossati
Possible fixes to this problem:
- make the game receive the jump input after AND before the landing (this is called **_jump input buffering_**).
- make the jump possibile also outside the platform (**_coyote time_**)
These things should make the avatar more responsive and under control.
|
non_process
|
jump less punitive and more forgiving after reading the first feedback it s clear that the jump has a little problem not that one related to the physics it is not that sensitive i think this means that when the player makes a series of jumps it s hard to press space right after the landing if i press space just one frame before landing i don t deserve to be punished i talked about this issue right before the demo was released with stefanofossati possible fixes to this problem make the game receive the jump input after and before the landing this is called jump input buffering make the jump possibile also outside the platform coyote time these things should make the avatar more responsive and under control
| 0
|
6,040
| 8,853,367,991
|
IssuesEvent
|
2019-01-08 21:09:08
|
LibraryOfCongress/concordia
|
https://api.github.com/repos/LibraryOfCongress/concordia
|
opened
|
Registered users can return back to review after submitting edits
|
review process
|
**Is your feature request related to a problem? Please describe.**
Review is two actions, edit or accept. If a user submits an edit there is no easy option to get back to review. There needs to be a pathway for users to continue reviewing if they choose.
**Additional context**
In #749 the option to keep in a review track when a transcription is accepted.
**Acceptance criteria**
- [ ] When editing and clicking submit, users are able to return to review or transcribe new page within modal
- [ ] Update modal to include options to transcribe new page or review a page
<img width="592" alt="screen shot 2019-01-08 at 4 06 08 pm" src="https://user-images.githubusercontent.com/7362915/50858773-5fb48e00-135f-11e9-9dfd-432e604f21e5.png">
|
1.0
|
Registered users can return back to review after submitting edits - **Is your feature request related to a problem? Please describe.**
Review is two actions, edit or accept. If a user submits an edit there is no easy option to get back to review. There needs to be a pathway for users to continue reviewing if they choose.
**Additional context**
In #749 the option to keep in a review track when a transcription is accepted.
**Acceptance criteria**
- [ ] When editing and clicking submit, users are able to return to review or transcribe new page within modal
- [ ] Update modal to include options to transcribe new page or review a page
<img width="592" alt="screen shot 2019-01-08 at 4 06 08 pm" src="https://user-images.githubusercontent.com/7362915/50858773-5fb48e00-135f-11e9-9dfd-432e604f21e5.png">
|
process
|
registered users can return back to review after submitting edits is your feature request related to a problem please describe review is two actions edit or accept if a user submits an edit there is no easy option to get back to review there needs to be a pathway for users to continue reviewing if they choose additional context in the option to keep in a review track when a transcription is accepted acceptance criteria when editing and clicking submit users are able to return to review or transcribe new page within modal update modal to include options to transcribe new page or review a page img width alt screen shot at pm src
| 1
|
705,279
| 24,229,251,015
|
IssuesEvent
|
2022-09-26 16:44:39
|
FrozenBlock/WilderWild
|
https://api.github.com/repos/FrozenBlock/WilderWild
|
closed
|
Fix Jellyfish Caves Placement
|
worldgen high priority awaiting response
|
They should be much smaller on average and generate ONLY under ocean biomes (but only sometimes generate at all)
|
1.0
|
Fix Jellyfish Caves Placement - They should be much smaller on average and generate ONLY under ocean biomes (but only sometimes generate at all)
|
non_process
|
fix jellyfish caves placement they should be much smaller on average and generate only under ocean biomes but only sometimes generate at all
| 0
|
11,405
| 14,238,596,802
|
IssuesEvent
|
2020-11-18 18:52:00
|
geneontology/go-ontology
|
https://api.github.com/repos/geneontology/go-ontology
|
closed
|
'growth of symbiont within host' has double inheritance
|
multi-species process
|
This class currently has ‘symbiotic process’ and ‘growth’ as parent classes. I would recommend either making it a subclass of 'interaction with host' (see Issue #19982 ) or removing the **subClassOf 'symbiotic process'** axiom since this process does not directly involve the interaction between the host and the symbiont.
|
1.0
|
'growth of symbiont within host' has double inheritance - This class currently has ‘symbiotic process’ and ‘growth’ as parent classes. I would recommend either making it a subclass of 'interaction with host' (see Issue #19982 ) or removing the **subClassOf 'symbiotic process'** axiom since this process does not directly involve the interaction between the host and the symbiont.
|
process
|
growth of symbiont within host has double inheritance this class currently has ‘symbiotic process’ and ‘growth’ as parent classes i would recommend either making it a subclass of interaction with host see issue or removing the subclassof symbiotic process axiom since this process does not directly involve the interaction between the host and the symbiont
| 1
|
195,449
| 22,339,637,319
|
IssuesEvent
|
2022-06-14 22:33:56
|
vincenzodistasio97/events-manager-io
|
https://api.github.com/repos/vincenzodistasio97/events-manager-io
|
closed
|
CVE-2021-3918 (High) detected in json-schema-0.2.3.tgz - autoclosed
|
security vulnerability
|
## CVE-2021-3918 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>json-schema-0.2.3.tgz</b></p></summary>
<p>JSON Schema validation and specifications</p>
<p>Library home page: <a href="https://registry.npmjs.org/json-schema/-/json-schema-0.2.3.tgz">https://registry.npmjs.org/json-schema/-/json-schema-0.2.3.tgz</a></p>
<p>Path to dependency file: /package.json</p>
<p>Path to vulnerable library: /node_modules/bcrypt/node_modules/json-schema/package.json</p>
<p>
Dependency Hierarchy:
- babel-cli-6.26.0.tgz (Root Library)
- chokidar-1.7.0.tgz
- fsevents-1.1.3.tgz
- node-pre-gyp-0.6.39.tgz
- request-2.81.0.tgz
- http-signature-1.1.1.tgz
- jsprim-1.4.0.tgz
- :x: **json-schema-0.2.3.tgz** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/vincenzodistasio97/events-manager-io/commit/34b4ee7777ec330308085b59cefc667c68e51123">34b4ee7777ec330308085b59cefc667c68e51123</a></p>
<p>Found in base branch: <b>dev</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
json-schema is vulnerable to Improperly Controlled Modification of Object Prototype Attributes ('Prototype Pollution')
<p>Publish Date: 2021-11-13
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2021-3918>CVE-2021-3918</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>9.8</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: High
- Integrity Impact: High
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://nvd.nist.gov/vuln/detail/CVE-2021-3918">https://nvd.nist.gov/vuln/detail/CVE-2021-3918</a></p>
<p>Release Date: 2021-11-13</p>
<p>Fix Resolution (json-schema): 0.4.0</p>
<p>Direct dependency fix Resolution (babel-cli): 7.0.0-alpha.1</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with Mend [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
|
True
|
CVE-2021-3918 (High) detected in json-schema-0.2.3.tgz - autoclosed - ## CVE-2021-3918 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>json-schema-0.2.3.tgz</b></p></summary>
<p>JSON Schema validation and specifications</p>
<p>Library home page: <a href="https://registry.npmjs.org/json-schema/-/json-schema-0.2.3.tgz">https://registry.npmjs.org/json-schema/-/json-schema-0.2.3.tgz</a></p>
<p>Path to dependency file: /package.json</p>
<p>Path to vulnerable library: /node_modules/bcrypt/node_modules/json-schema/package.json</p>
<p>
Dependency Hierarchy:
- babel-cli-6.26.0.tgz (Root Library)
- chokidar-1.7.0.tgz
- fsevents-1.1.3.tgz
- node-pre-gyp-0.6.39.tgz
- request-2.81.0.tgz
- http-signature-1.1.1.tgz
- jsprim-1.4.0.tgz
- :x: **json-schema-0.2.3.tgz** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/vincenzodistasio97/events-manager-io/commit/34b4ee7777ec330308085b59cefc667c68e51123">34b4ee7777ec330308085b59cefc667c68e51123</a></p>
<p>Found in base branch: <b>dev</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
json-schema is vulnerable to Improperly Controlled Modification of Object Prototype Attributes ('Prototype Pollution')
<p>Publish Date: 2021-11-13
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2021-3918>CVE-2021-3918</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>9.8</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: High
- Integrity Impact: High
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://nvd.nist.gov/vuln/detail/CVE-2021-3918">https://nvd.nist.gov/vuln/detail/CVE-2021-3918</a></p>
<p>Release Date: 2021-11-13</p>
<p>Fix Resolution (json-schema): 0.4.0</p>
<p>Direct dependency fix Resolution (babel-cli): 7.0.0-alpha.1</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with Mend [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
|
non_process
|
cve high detected in json schema tgz autoclosed cve high severity vulnerability vulnerable library json schema tgz json schema validation and specifications library home page a href path to dependency file package json path to vulnerable library node modules bcrypt node modules json schema package json dependency hierarchy babel cli tgz root library chokidar tgz fsevents tgz node pre gyp tgz request tgz http signature tgz jsprim tgz x json schema tgz vulnerable library found in head commit a href found in base branch dev vulnerability details json schema is vulnerable to improperly controlled modification of object prototype attributes prototype pollution publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction none scope unchanged impact metrics confidentiality impact high integrity impact high availability impact high for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution json schema direct dependency fix resolution babel cli alpha step up your open source security game with mend
| 0
|
14,461
| 17,567,155,310
|
IssuesEvent
|
2021-08-14 00:35:50
|
geneontology/go-ontology
|
https://api.github.com/repos/geneontology/go-ontology
|
closed
|
Change term labels, parents for some children of GO:0052205 modulation of molecular function in other organism involved in symbiotic interaction
|
multi-species process term name
|
- [x] GO:0039576 suppression by virus of host JAK1 activity -> also a child of ' GO:0039514 suppression by virus of host JAK-STAT cascade'; cannot annotate the annotated proteins to 'JAK1 inhibitor activity' because it is not clear how it inhibits JAK1, so I will change the label to 'suppression by virus of host JAK-STAT cascade via inhibition of JAK1 activity', and remove the 'GO:0052053 negative regulation by symbiont of host catalytic activity' parent.
Same for
- [x] GO:0039562 suppression by virus of host STAT activity - change label to 'suppression by virus of host JAK-STAT cascade via inhibition of STAT activity'
- [x] GO:0039563 suppression by virus of host STAT1 activity
- [x] GO:0039564 suppression by virus of host STAT2 activity
|
1.0
|
Change term labels, parents for some children of GO:0052205 modulation of molecular function in other organism involved in symbiotic interaction - - [x] GO:0039576 suppression by virus of host JAK1 activity -> also a child of ' GO:0039514 suppression by virus of host JAK-STAT cascade'; cannot annotate the annotated proteins to 'JAK1 inhibitor activity' because it is not clear how it inhibits JAK1, so I will change the label to 'suppression by virus of host JAK-STAT cascade via inhibition of JAK1 activity', and remove the 'GO:0052053 negative regulation by symbiont of host catalytic activity' parent.
Same for
- [x] GO:0039562 suppression by virus of host STAT activity - change label to 'suppression by virus of host JAK-STAT cascade via inhibition of STAT activity'
- [x] GO:0039563 suppression by virus of host STAT1 activity
- [x] GO:0039564 suppression by virus of host STAT2 activity
|
process
|
change term labels parents for some children of go modulation of molecular function in other organism involved in symbiotic interaction go suppression by virus of host activity also a child of go suppression by virus of host jak stat cascade cannot annotate the annotated proteins to inhibitor activity because it is not clear how it inhibits so i will change the label to suppression by virus of host jak stat cascade via inhibition of activity and remove the go negative regulation by symbiont of host catalytic activity parent same for go suppression by virus of host stat activity change label to suppression by virus of host jak stat cascade via inhibition of stat activity go suppression by virus of host activity go suppression by virus of host activity
| 1
|
6,471
| 9,546,742,192
|
IssuesEvent
|
2019-05-01 20:50:58
|
openopps/openopps-platform
|
https://api.github.com/repos/openopps/openopps-platform
|
closed
|
Department of State: Add messaging when student edits application
|
Apply Process Approved State Dept.
|
Who: Student
What: Provide a message when an application is changed after it is submitted.
Why: We need to make it clear that the student will need to "submit" the application again once it has been edited.
A/C
- When the user clicks the "update" link associated with their application in their dashboard they will see a modal with the following message
You are about to make edits to an application you have already submitted. Follow these steps to resubmit your application:
1. Go to the page you want to edit by using the progress bar at the top of the page or by clicking the **Save and Continue** button on each page.
2. Click **Save and Continue** once you make your change.
3. Click **Save and Continue** on all of the pages following the page you edited (you don't have to **Save and Continue** on any previous pages)
4. Review your application and click **Submit application**.
- The modal will follow the design system
(no mock included)
|
1.0
|
Department of State: Add messaging when student edits application - Who: Student
What: Provide a message when an application is changed after it is submitted.
Why: We need to make it clear that the student will need to "submit" the application again once it has been edited.
A/C
- When the user clicks the "update" link associated with their application in their dashboard they will see a modal with the following message
You are about to make edits to an application you have already submitted. Follow these steps to resubmit your application:
1. Go to the page you want to edit by using the progress bar at the top of the page or by clicking the **Save and Continue** button on each page.
2. Click **Save and Continue** once you make your change.
3. Click **Save and Continue** on all of the pages following the page you edited (you don't have to **Save and Continue** on any previous pages)
4. Review your application and click **Submit application**.
- The modal will follow the design system
(no mock included)
|
process
|
department of state add messaging when student edits application who student what provide a message when an application is changed after it is submitted why we need to make it clear that the student will need to submit the application again once it has been edited a c when the user clicks the update link associated with their application in their dashboard they will see a modal with the following message you are about to make edits to an application you have already submitted follow these steps to resubmit your application go to the page you want to edit by using the progress bar at the top of the page or by clicking the save and continue button on each page click save and continue once you make your change click save and continue on all of the pages following the page you edited you don t have to save and continue on any previous pages review your application and click submit application the modal will follow the design system no mock included
| 1
|
2,557
| 5,313,714,012
|
IssuesEvent
|
2017-02-13 13:05:50
|
DynareTeam/dynare
|
https://api.github.com/repos/DynareTeam/dynare
|
closed
|
JSON: incorrect printing of expressions for parameter initializations.
|
bug julia preprocessor
|
When parameter initializations contains expressions, symbols are replaced by references to `M_.params` instead of symbol name.
For instance, in `example1.mod` if we replace line `rho = 0.95;` by `rho = 0.95+alpha*0.00001;` we currently get:
```
{"statementName": "param_init", "name": "rho", "value": "0.95+M_.params(3)*0.00001"}
```
instead of:
```
{"statementName": "param_init", "name": "rho", "value": "0.95+alpha*0.00001"}
```
It works correctly for variable initializations.
|
1.0
|
JSON: incorrect printing of expressions for parameter initializations. - When parameter initializations contains expressions, symbols are replaced by references to `M_.params` instead of symbol name.
For instance, in `example1.mod` if we replace line `rho = 0.95;` by `rho = 0.95+alpha*0.00001;` we currently get:
```
{"statementName": "param_init", "name": "rho", "value": "0.95+M_.params(3)*0.00001"}
```
instead of:
```
{"statementName": "param_init", "name": "rho", "value": "0.95+alpha*0.00001"}
```
It works correctly for variable initializations.
|
process
|
json incorrect printing of expressions for parameter initializations when parameter initializations contains expressions symbols are replaced by references to m params instead of symbol name for instance in mod if we replace line rho by rho alpha we currently get statementname param init name rho value m params instead of statementname param init name rho value alpha it works correctly for variable initializations
| 1
|
7,283
| 10,434,600,638
|
IssuesEvent
|
2019-09-17 15:31:42
|
qgis/QGIS
|
https://api.github.com/repos/qgis/QGIS
|
closed
|
QGIS raster calculator: better handling of optional parameters and error message
|
Bug Feedback Processing
|
Author Name: **Giovanni Manghi** (@gioman)
Original Redmine Issue: [19553](https://issues.qgis.org/issues/19553)
Affected QGIS version: 3.7(master)
Redmine category:processing/qgis
---
See attachment.
Tested on QGIS master on Linux.
I guess the "reference layer(s)" option should be made mandatory or the error must be caught.
---
- [Screenshot_20180806_181109.png](https://issues.qgis.org/attachments/download/13074/Screenshot_20180806_181109.png) (Giovanni Manghi)
|
1.0
|
QGIS raster calculator: better handling of optional parameters and error message - Author Name: **Giovanni Manghi** (@gioman)
Original Redmine Issue: [19553](https://issues.qgis.org/issues/19553)
Affected QGIS version: 3.7(master)
Redmine category:processing/qgis
---
See attachment.
Tested on QGIS master on Linux.
I guess the "reference layer(s)" option should be made mandatory or the error must be caught.
---
- [Screenshot_20180806_181109.png](https://issues.qgis.org/attachments/download/13074/Screenshot_20180806_181109.png) (Giovanni Manghi)
|
process
|
qgis raster calculator better handling of optional parameters and error message author name giovanni manghi gioman original redmine issue affected qgis version master redmine category processing qgis see attachment tested on qgis master on linux i guess the reference layer s option should be made mandatory or the error must be caught giovanni manghi
| 1
|
76,696
| 7,543,688,325
|
IssuesEvent
|
2018-04-17 16:09:38
|
EyeSeeTea/QAApp
|
https://api.github.com/repos/EyeSeeTea/QAApp
|
closed
|
Assess, Context pop-up
|
complexity - low (1hr) priority - high testing type - cosmetic (layout)
|
In case the % of mandatory questions is not 100%, users shouldn't be able to mark this as complete, so this option should be greyed out.

|
1.0
|
Assess, Context pop-up - In case the % of mandatory questions is not 100%, users shouldn't be able to mark this as complete, so this option should be greyed out.

|
non_process
|
assess context pop up in case the of mandatory questions is not users shouldn t be able to mark this as complete so this option should be greyed out
| 0
|
21,152
| 3,462,774,478
|
IssuesEvent
|
2015-12-21 03:49:07
|
pentoo/pentoo-historical
|
https://api.github.com/repos/pentoo/pentoo-historical
|
closed
|
upgrade compat-drivers
|
auto-migrated Priority-Low Type-Defect
|
```
update grsec patches for compat-drivers 3.8_rc5
The livecd build is waiting on you ;-)
```
Original issue reported on code.google.com by `sidhayn` on 4 Feb 2013 at 5:50
|
1.0
|
upgrade compat-drivers - ```
update grsec patches for compat-drivers 3.8_rc5
The livecd build is waiting on you ;-)
```
Original issue reported on code.google.com by `sidhayn` on 4 Feb 2013 at 5:50
|
non_process
|
upgrade compat drivers update grsec patches for compat drivers the livecd build is waiting on you original issue reported on code google com by sidhayn on feb at
| 0
|
88,361
| 25,383,406,103
|
IssuesEvent
|
2022-11-21 19:34:49
|
benthevining/Limes
|
https://api.github.com/repos/benthevining/Limes
|
opened
|
Refactor CMake code
|
enhancement Build
|
- [ ] We should make a function `limes_init_library()` instead of using interface targets from Oranges
- [ ] Try to move as many dev/CI-specific settings out of the project definition (and out of Oranges)
- [ ] Function like `add_build_as_test()` for configuring, building, testing a child cmake project. Use fixtures so that build won't be run if config fails.
|
1.0
|
Refactor CMake code - - [ ] We should make a function `limes_init_library()` instead of using interface targets from Oranges
- [ ] Try to move as many dev/CI-specific settings out of the project definition (and out of Oranges)
- [ ] Function like `add_build_as_test()` for configuring, building, testing a child cmake project. Use fixtures so that build won't be run if config fails.
|
non_process
|
refactor cmake code we should make a function limes init library instead of using interface targets from oranges try to move as many dev ci specific settings out of the project definition and out of oranges function like add build as test for configuring building testing a child cmake project use fixtures so that build won t be run if config fails
| 0
|
150,873
| 23,726,233,638
|
IssuesEvent
|
2022-08-30 19:53:58
|
nextcloud/mail
|
https://api.github.com/repos/nextcloud/mail
|
closed
|
ThreadEnvelope buttons overflow horizontally
|
bug 1. to develop design regression
|
### Steps to reproduce
1. Have a reasonably small horizontal resolution (e.g. portrait mode).
2. Open a thread.
3. Observe the header of an envelope.
### Expected behavior
The header should be truncated or collapsed.
### Actual behavior
The buttons on the right overflow the container of the envelope. Have a look at the three dot menu.

### Mail app version
main
|
1.0
|
ThreadEnvelope buttons overflow horizontally - ### Steps to reproduce
1. Have a reasonably small horizontal resolution (e.g. portrait mode).
2. Open a thread.
3. Observe the header of an envelope.
### Expected behavior
The header should be truncated or collapsed.
### Actual behavior
The buttons on the right overflow the container of the envelope. Have a look at the three dot menu.

### Mail app version
main
|
non_process
|
threadenvelope buttons overflow horizontally steps to reproduce have a reasonably small horizontal resolution e g portrait mode open a thread observe the header of an envelope expected behavior the header should be truncated or collapsed actual behavior the buttons on the right overflow the container of the envelope have a look at the three dot menu mail app version main
| 0
|
150,347
| 13,339,700,421
|
IssuesEvent
|
2020-08-28 13:19:25
|
py-suruga/pycon-jp-2020-tutorial
|
https://api.github.com/repos/py-suruga/pycon-jp-2020-tutorial
|
closed
|
チュートリアルで使うスライドを用意する
|
documentation
|
基本的にドキュメントを見ながら作業だけど、冒頭のコミュニティの紹介や何をやるかのブリーフィング用のスライドを数枚作る
- 準備情報
- 準備資料のURLをZoomで共有
本会
- コミュニティの紹介
- 講師,TA紹介
- 本日のやること
- 大ざっぱなスケジュール
- お菓子用意しよう
|
1.0
|
チュートリアルで使うスライドを用意する - 基本的にドキュメントを見ながら作業だけど、冒頭のコミュニティの紹介や何をやるかのブリーフィング用のスライドを数枚作る
- 準備情報
- 準備資料のURLをZoomで共有
本会
- コミュニティの紹介
- 講師,TA紹介
- 本日のやること
- 大ざっぱなスケジュール
- お菓子用意しよう
|
non_process
|
チュートリアルで使うスライドを用意する 基本的にドキュメントを見ながら作業だけど、冒頭のコミュニティの紹介や何をやるかのブリーフィング用のスライドを数枚作る 準備情報 準備資料のurlをzoomで共有 本会 コミュニティの紹介 講師 ta紹介 本日のやること 大ざっぱなスケジュール お菓子用意しよう
| 0
|
5,747
| 7,324,695,064
|
IssuesEvent
|
2018-03-03 00:13:25
|
MicrosoftDocs/azure-docs
|
https://api.github.com/repos/MicrosoftDocs/azure-docs
|
closed
|
well good
|
app-service-web kudos triaged
|
thanks
---
#### Document details
⚠ *Do not edit this section. It is required for docs.microsoft.com ➟ GitHub issue linking.*
* ID: a825f031-250e-034d-e552-4b0d976d88b1
* Version Independent ID: 626d7451-0b8e-051e-e512-0bf90be319a0
* [Content](https://docs.microsoft.com/en-gb/azure/app-service/app-service-web-get-started-html)
* [Content Source](https://github.com/MicrosoftDocs/azure-docs-pr/blob/live/articles/app-service/app-service-web-get-started-html.md)
* Service: app-service-web
|
1.0
|
well good - thanks
---
#### Document details
⚠ *Do not edit this section. It is required for docs.microsoft.com ➟ GitHub issue linking.*
* ID: a825f031-250e-034d-e552-4b0d976d88b1
* Version Independent ID: 626d7451-0b8e-051e-e512-0bf90be319a0
* [Content](https://docs.microsoft.com/en-gb/azure/app-service/app-service-web-get-started-html)
* [Content Source](https://github.com/MicrosoftDocs/azure-docs-pr/blob/live/articles/app-service/app-service-web-get-started-html.md)
* Service: app-service-web
|
non_process
|
well good thanks document details ⚠ do not edit this section it is required for docs microsoft com ➟ github issue linking id version independent id service app service web
| 0
|
2,680
| 5,523,213,121
|
IssuesEvent
|
2017-03-20 05:10:33
|
dotnet/corefx
|
https://api.github.com/repos/dotnet/corefx
|
closed
|
System.ServiceProcess.Tests.ServiceControllerTests.StopAndStart failed in CI
|
area-System.ServiceProcess blocking-clean-ci test-run-core
|
Failed test: System.ServiceProcess.Tests.ServiceControllerTests.StopAndStart
Configuration: OuterLoop_Windows10_release ([build#130](https://ci.dot.net/job/dotnet_corefx/job/master/job/outerloop_win10_release/130/testReport/))
Detail: https://ci.dot.net/job/dotnet_corefx/job/master/job/outerloop_win10_release/130/consoleText
Message:
~~~
System.ServiceProcess.Tests.ServiceControllerTests.StopAndStart [FAIL]
System.ServiceProcess.TimeoutException : Time out has expired and the operation has not been completed.
~~~
Stack Trace:
~~~
D:\j\workspace\outerloop_win---6835b088\src\System.ServiceProcess.ServiceController\src\System\ServiceProcess\ServiceController.cs(849,0): at System.ServiceProcess.ServiceController.WaitForStatus(ServiceControllerStatus desiredStatus, TimeSpan timeout)
D:\j\workspace\outerloop_win---6835b088\src\System.ServiceProcess.ServiceController\tests\System.ServiceProcess.ServiceController.Tests\ServiceControllerTests.cs(182,0): at System.ServiceProcess.Tests.ServiceControllerTests.StopAndStart()
~~~
|
1.0
|
System.ServiceProcess.Tests.ServiceControllerTests.StopAndStart failed in CI - Failed test: System.ServiceProcess.Tests.ServiceControllerTests.StopAndStart
Configuration: OuterLoop_Windows10_release ([build#130](https://ci.dot.net/job/dotnet_corefx/job/master/job/outerloop_win10_release/130/testReport/))
Detail: https://ci.dot.net/job/dotnet_corefx/job/master/job/outerloop_win10_release/130/consoleText
Message:
~~~
System.ServiceProcess.Tests.ServiceControllerTests.StopAndStart [FAIL]
System.ServiceProcess.TimeoutException : Time out has expired and the operation has not been completed.
~~~
Stack Trace:
~~~
D:\j\workspace\outerloop_win---6835b088\src\System.ServiceProcess.ServiceController\src\System\ServiceProcess\ServiceController.cs(849,0): at System.ServiceProcess.ServiceController.WaitForStatus(ServiceControllerStatus desiredStatus, TimeSpan timeout)
D:\j\workspace\outerloop_win---6835b088\src\System.ServiceProcess.ServiceController\tests\System.ServiceProcess.ServiceController.Tests\ServiceControllerTests.cs(182,0): at System.ServiceProcess.Tests.ServiceControllerTests.StopAndStart()
~~~
|
process
|
system serviceprocess tests servicecontrollertests stopandstart failed in ci failed test system serviceprocess tests servicecontrollertests stopandstart configuration outerloop release detail message system serviceprocess tests servicecontrollertests stopandstart system serviceprocess timeoutexception time out has expired and the operation has not been completed stack trace d j workspace outerloop win src system serviceprocess servicecontroller src system serviceprocess servicecontroller cs at system serviceprocess servicecontroller waitforstatus servicecontrollerstatus desiredstatus timespan timeout d j workspace outerloop win src system serviceprocess servicecontroller tests system serviceprocess servicecontroller tests servicecontrollertests cs at system serviceprocess tests servicecontrollertests stopandstart
| 1
|
25,122
| 5,133,234,742
|
IssuesEvent
|
2017-01-11 02:30:00
|
MST-MRDT/Deprecated-Robotic-Arm
|
https://api.github.com/repos/MST-MRDT/Deprecated-Robotic-Arm
|
closed
|
Test Processor
|
Documentation
|
Ensure that the processor will be able to handle the calculations being made by the Arm
|
1.0
|
Test Processor - Ensure that the processor will be able to handle the calculations being made by the Arm
|
non_process
|
test processor ensure that the processor will be able to handle the calculations being made by the arm
| 0
|
21,703
| 30,199,066,613
|
IssuesEvent
|
2023-07-05 02:42:10
|
opensearch-project/data-prepper
|
https://api.github.com/repos/opensearch-project/data-prepper
|
closed
|
Define multiple keys for type conversion
|
enhancement good first issue plugin - processor
|
**Is your feature request related to a problem? Please describe.**
The `convert_entry_type` may be used a few times with the same configuration. It might be nice to consolidate this into a single `keys` value.
For example, see this sample for VPC Flow Logs:
```
processor:
- convert_entry_type:
key: dstport
type: integer
- convert_entry_type:
key: srcport
type: integer
- convert_entry_type:
key: protocol
type: integer
- convert_entry_type:
key: bytes
type: integer
- convert_entry_type:
key: packets
type: integer
```
**Describe the solution you'd like**
Allow defining multiple `keys` in the `convert_entry_type` processor.
Each of the `convert_entry_type` processors could be combined:
```
processor:
- convert_entry_type:
keys: [ 'dstport', 'srcport', 'protocol', 'bytes', 'packets' ]
type: integer
```
**Additional context**
This also becomes much more helpful when combined with #2822. I have the following sample:
```
processor:
- delete_entries:
with_keys:
- dstport
delete_when: '/dstport == "-"'
- convert_entry_type:
key: dstport
type: integer
- delete_entries:
with_keys:
- srcport
delete_when: '/srcport == "-"'
- convert_entry_type:
key: srcport
type: integer
- delete_entries:
with_keys:
- protocol
delete_when: '/protocol == "-"'
- convert_entry_type:
key: protocol
type: integer
- delete_entries:
with_keys:
- bytes
delete_when: '/bytes == "-"'
- convert_entry_type:
key: bytes
type: integer
- delete_entries:
with_keys:
- packets
delete_when: '/packets == "-"'
- convert_entry_type:
key: packets
type: integer
```
This could simply become:
```
processor:
- convert_entry_type:
keys: [ 'dstport', 'srcport', 'protocol', 'bytes', 'packets' ]
type: integer
null_values: [ '-' ]
```
|
1.0
|
Define multiple keys for type conversion - **Is your feature request related to a problem? Please describe.**
The `convert_entry_type` may be used a few times with the same configuration. It might be nice to consolidate this into a single `keys` value.
For example, see this sample for VPC Flow Logs:
```
processor:
- convert_entry_type:
key: dstport
type: integer
- convert_entry_type:
key: srcport
type: integer
- convert_entry_type:
key: protocol
type: integer
- convert_entry_type:
key: bytes
type: integer
- convert_entry_type:
key: packets
type: integer
```
**Describe the solution you'd like**
Allow defining multiple `keys` in the `convert_entry_type` processor.
Each of the `convert_entry_type` processors could be combined:
```
processor:
- convert_entry_type:
keys: [ 'dstport', 'srcport', 'protocol', 'bytes', 'packets' ]
type: integer
```
**Additional context**
This also becomes much more helpful when combined with #2822. I have the following sample:
```
processor:
- delete_entries:
with_keys:
- dstport
delete_when: '/dstport == "-"'
- convert_entry_type:
key: dstport
type: integer
- delete_entries:
with_keys:
- srcport
delete_when: '/srcport == "-"'
- convert_entry_type:
key: srcport
type: integer
- delete_entries:
with_keys:
- protocol
delete_when: '/protocol == "-"'
- convert_entry_type:
key: protocol
type: integer
- delete_entries:
with_keys:
- bytes
delete_when: '/bytes == "-"'
- convert_entry_type:
key: bytes
type: integer
- delete_entries:
with_keys:
- packets
delete_when: '/packets == "-"'
- convert_entry_type:
key: packets
type: integer
```
This could simply become:
```
processor:
- convert_entry_type:
keys: [ 'dstport', 'srcport', 'protocol', 'bytes', 'packets' ]
type: integer
null_values: [ '-' ]
```
|
process
|
define multiple keys for type conversion is your feature request related to a problem please describe the convert entry type may be used a few times with the same configuration it might be nice to consolidate this into a single keys value for example see this sample for vpc flow logs processor convert entry type key dstport type integer convert entry type key srcport type integer convert entry type key protocol type integer convert entry type key bytes type integer convert entry type key packets type integer describe the solution you d like allow defining multiple keys in the convert entry type processor each of the convert entry type processors could be combined processor convert entry type keys type integer additional context this also becomes much more helpful when combined with i have the following sample processor delete entries with keys dstport delete when dstport convert entry type key dstport type integer delete entries with keys srcport delete when srcport convert entry type key srcport type integer delete entries with keys protocol delete when protocol convert entry type key protocol type integer delete entries with keys bytes delete when bytes convert entry type key bytes type integer delete entries with keys packets delete when packets convert entry type key packets type integer this could simply become processor convert entry type keys type integer null values
| 1
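The record above proposes consolidating repeated `convert_entry_type` entries into one multi-key entry with `null_values`. A minimal Python sketch of that consolidated behavior is below; the function name and its `keys`/`null_values` parameters mirror the proposed config and are illustrative assumptions, not the actual Data Prepper implementation.

```python
# Hypothetical sketch of the proposed multi-key convert_entry_type behavior.
# Not Data Prepper code; parameter names mirror the issue's proposed config.

def convert_entry_types(event, keys, target_type=int, null_values=()):
    """Convert each named key in `event` to `target_type`.

    Keys whose value appears in `null_values` are dropped instead of
    converted, mirroring the delete_when + convert pattern in the issue.
    """
    out = dict(event)
    for key in keys:
        if key not in out:
            continue
        if out[key] in null_values:
            del out[key]  # treat placeholder values like "-" as null
        else:
            out[key] = target_type(out[key])
    return out


record = {"srcport": "443", "dstport": "-", "protocol": "6", "action": "ACCEPT"}
converted = convert_entry_types(
    record,
    keys=["srcport", "dstport", "protocol"],
    null_values=["-"],
)
```

With this shape, the five separate processor entries in the sample pipeline collapse into one call over a key list, and the `"-"` placeholder is dropped rather than failing integer conversion.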
|
16,443
| 21,317,165,237
|
IssuesEvent
|
2022-04-16 13:35:42
|
googlemaps/android-samples
|
https://api.github.com/repos/googlemaps/android-samples
|
closed
|
snippet-bot full scan
|
type: process stale
|
Check validity of existing region tags<!-- probot comment [11337970]-->
## snippet-bot scan result
Life is too short to manually check unmatched region tags.
Here is the result:
- [ ] [snippets/app/src/gms/java/com/google/maps/example/kotlin/GroundOverlays.kt:48](https://github.com/googlemaps/android-samples/blob/b9d2916/snippets/app/src/gms/java/com/google/maps/example/kotlin/GroundOverlays.kt#L48), tag `maps_android_ground_overlays_remove` doesn't have a matching start tag
- [ ] [snippets/app/src/gms/java/com/google/maps/example/kotlin/GroundOverlays.kt:50](https://github.com/googlemaps/android-samples/blob/b9d2916/snippets/app/src/gms/java/com/google/maps/example/kotlin/GroundOverlays.kt#L50), tag `maps_android_ground_overlays_change_image` has already started
- [ ] [snippets/app/src/gms/java/com/google/maps/example/kotlin/TileOverlays.kt:87](https://github.com/googlemaps/android-samples/blob/b9d2916/snippets/app/src/gms/java/com/google/maps/example/kotlin/TileOverlays.kt#L87), tag `maps_android_tile_overlays_transparency` doesn't have a matching start tag
---
Report generated by [snippet-bot](https://github.com/apps/snippet-bot).
If you find problems with this result, please file an issue at:
https://github.com/googleapis/repo-automation-bots/issues.
|
1.0
|
snippet-bot full scan - Check validity of existing region tags<!-- probot comment [11337970]-->
## snippet-bot scan result
Life is too short to manually check unmatched region tags.
Here is the result:
- [ ] [snippets/app/src/gms/java/com/google/maps/example/kotlin/GroundOverlays.kt:48](https://github.com/googlemaps/android-samples/blob/b9d2916/snippets/app/src/gms/java/com/google/maps/example/kotlin/GroundOverlays.kt#L48), tag `maps_android_ground_overlays_remove` doesn't have a matching start tag
- [ ] [snippets/app/src/gms/java/com/google/maps/example/kotlin/GroundOverlays.kt:50](https://github.com/googlemaps/android-samples/blob/b9d2916/snippets/app/src/gms/java/com/google/maps/example/kotlin/GroundOverlays.kt#L50), tag `maps_android_ground_overlays_change_image` has already started
- [ ] [snippets/app/src/gms/java/com/google/maps/example/kotlin/TileOverlays.kt:87](https://github.com/googlemaps/android-samples/blob/b9d2916/snippets/app/src/gms/java/com/google/maps/example/kotlin/TileOverlays.kt#L87), tag `maps_android_tile_overlays_transparency` doesn't have a matching start tag
---
Report generated by [snippet-bot](https://github.com/apps/snippet-bot).
If you find problems with this result, please file an issue at:
https://github.com/googleapis/repo-automation-bots/issues.
|
process
|
snippet bot full scan check validity of existing region tags snippet bot scan result life is too short to manually check unmatched region tags here is the result tag maps android ground overlays remove doesn t have a matching start tag tag maps android ground overlays change image has already started tag maps android tile overlays transparency doesn t have a matching start tag report generated by if you find problems with this result please file an issue at
| 1
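The snippet-bot scan above reports two failure modes: an `END` tag with no matching start, and a `START` tag that has already started. A small Python sketch of that pairing check follows; the `[START tag]` / `[END tag]` syntax is an assumption for illustration, not snippet-bot's exact grammar.

```python
# Minimal sketch of a start/end region-tag pairing check, in the spirit
# of the snippet-bot scan above. Tag syntax here is a stand-in.
import re

TAG_RE = re.compile(r"\[(START|END) ([\w-]+)\]")

def check_region_tags(source):
    """Return a list of (line, message) problems for unmatched tags."""
    open_tags, problems = set(), []
    for lineno, line in enumerate(source.splitlines(), start=1):
        for kind, tag in TAG_RE.findall(line):
            if kind == "START":
                if tag in open_tags:
                    problems.append((lineno, f"tag `{tag}` has already started"))
                open_tags.add(tag)
            else:  # END
                if tag not in open_tags:
                    problems.append(
                        (lineno, f"tag `{tag}` doesn't have a matching start tag")
                    )
                open_tags.discard(tag)
    return problems


sample = "[START a]\ncode\n[END a]\n[END b]\n"
issues = check_region_tags(sample)
```

Any tags still in `open_tags` at the end could likewise be reported as started-but-never-ended, the third failure mode such scanners typically flag.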
|
4,034
| 6,971,516,031
|
IssuesEvent
|
2017-12-11 14:17:21
|
w3c/html
|
https://api.github.com/repos/w3c/html
|
closed
|
CFC: Publish FPWD of HTML5.3
|
process
|
This is a Call For Consensus (CFC) to publish the First Public Working Draft (FPWD) of HTML5.3, based on the current [Editor's Draft](https://w3c.github.io/html).
Substantive changes since HTML5.2 are noted in the [Changes section](http://w3c.github.io/html/changes.html#changes). Note that discussions about the `autocapitalize` attribute are ongoing. Details of all commits can be found in the [Github commit log](https://github.com/w3c/html/commits).
Please respond to this CFC by the end of day on Sunday 10th December. To support the proposal, add a "thumbs up" to this comment. If you don't support
the proposal, add a "thumbs down" and post your reasons in a comment.
If you choose not to respond it will be taken as silent support for the proposal. Actual responses are preferred however.
|
1.0
|
CFC: Publish FPWD of HTML5.3 - This is a Call For Consensus (CFC) to publish the First Public Working Draft (FPWD) of HTML5.3, based on the current [Editor's Draft](https://w3c.github.io/html).
Substantive changes since HTML5.2 are noted in the [Changes section](http://w3c.github.io/html/changes.html#changes). Note that discussions about the `autocapitalize` attribute are ongoing. Details of all commits can be found in the [Github commit log](https://github.com/w3c/html/commits).
Please respond to this CFC by the end of day on Sunday 10th December. To support the proposal, add a "thumbs up" to this comment. If you don't support
the proposal, add a "thumbs down" and post your reasons in a comment.
If you choose not to respond it will be taken as silent support for the proposal. Actual responses are preferred however.
|
process
|
cfc publish fpwd of this is a call for consensus cfc to publish the first public working draft fpwd of based on the current substantive changes since are noted in the note that discussions about the autocapitalize attribute are ongoing details of all commits can be found in the please respond to this cfc by the end of day on sunday december to support the proposal add a thumbs up to this comment if you don t support the proposal add a thumbs down and post your reasons in a comment if you choose not to respond it will be taken as silent support for the proposal actual responses are preferred however
| 1
|
167,091
| 20,725,782,436
|
IssuesEvent
|
2022-03-14 01:33:46
|
rzr/rzr-example
|
https://api.github.com/repos/rzr/rzr-example
|
opened
|
CVE-2020-28500 (Medium) detected in multiple libraries
|
security vulnerability
|
## CVE-2020-28500 - Medium Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Libraries - <b>lodash-1.0.2.tgz</b>, <b>lodash-3.10.1.tgz</b>, <b>lodash-3.7.0.tgz</b>, <b>lodash-2.4.2.tgz</b>, <b>lodash-0.9.2.tgz</b></p></summary>
<p>
<details><summary><b>lodash-1.0.2.tgz</b></p></summary>
<p>A utility library delivering consistency, customization, performance, and extras.</p>
<p>Library home page: <a href="https://registry.npmjs.org/lodash/-/lodash-1.0.2.tgz">https://registry.npmjs.org/lodash/-/lodash-1.0.2.tgz</a></p>
<p>Path to dependency file: /reveal.js-master/package.json</p>
<p>Path to vulnerable library: /reveal.js-master/node_modules/globule/node_modules/lodash/package.json</p>
<p>
Dependency Hierarchy:
- grunt-contrib-watch-0.6.1.tgz (Root Library)
- gaze-0.5.2.tgz
- globule-0.1.0.tgz
- :x: **lodash-1.0.2.tgz** (Vulnerable Library)
</details>
<details><summary><b>lodash-3.10.1.tgz</b></p></summary>
<p>The modern build of lodash modular utilities.</p>
<p>Library home page: <a href="https://registry.npmjs.org/lodash/-/lodash-3.10.1.tgz">https://registry.npmjs.org/lodash/-/lodash-3.10.1.tgz</a></p>
<p>Path to dependency file: /reveal.js-master/package.json</p>
<p>Path to vulnerable library: /reveal.js-master/node_modules/grunt-contrib-uglify/node_modules/lodash/package.json</p>
<p>
Dependency Hierarchy:
- grunt-contrib-uglify-0.9.2.tgz (Root Library)
- :x: **lodash-3.10.1.tgz** (Vulnerable Library)
</details>
<details><summary><b>lodash-3.7.0.tgz</b></p></summary>
<p>The modern build of lodash modular utilities.</p>
<p>Library home page: <a href="https://registry.npmjs.org/lodash/-/lodash-3.7.0.tgz">https://registry.npmjs.org/lodash/-/lodash-3.7.0.tgz</a></p>
<p>Path to dependency file: /reveal.js-master/package.json</p>
<p>Path to vulnerable library: /reveal.js-master/node_modules/jshint/node_modules/lodash/package.json</p>
<p>
Dependency Hierarchy:
- grunt-contrib-jshint-0.11.3.tgz (Root Library)
- jshint-2.8.0.tgz
- :x: **lodash-3.7.0.tgz** (Vulnerable Library)
</details>
<details><summary><b>lodash-2.4.2.tgz</b></p></summary>
<p>A utility library delivering consistency, customization, performance, & extras.</p>
<p>Library home page: <a href="https://registry.npmjs.org/lodash/-/lodash-2.4.2.tgz">https://registry.npmjs.org/lodash/-/lodash-2.4.2.tgz</a></p>
<p>Path to dependency file: /reveal.js-master/plugin/multiplex/package.json</p>
<p>Path to vulnerable library: /reveal.js-master/node_modules/lodash/package.json,/reveal.js-master/node_modules/lodash/package.json</p>
<p>
Dependency Hierarchy:
- grunt-contrib-watch-0.6.1.tgz (Root Library)
- :x: **lodash-2.4.2.tgz** (Vulnerable Library)
</details>
<details><summary><b>lodash-0.9.2.tgz</b></p></summary>
<p>A utility library delivering consistency, customization, performance, and extras.</p>
<p>Library home page: <a href="https://registry.npmjs.org/lodash/-/lodash-0.9.2.tgz">https://registry.npmjs.org/lodash/-/lodash-0.9.2.tgz</a></p>
<p>Path to dependency file: /reveal.js-master/package.json</p>
<p>Path to vulnerable library: /reveal.js-master/node_modules/grunt-legacy-util/node_modules/lodash/package.json</p>
<p>
Dependency Hierarchy:
- grunt-0.4.5.tgz (Root Library)
- :x: **lodash-0.9.2.tgz** (Vulnerable Library)
</details>
<p>Found in base branch: <b>master</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
Lodash versions prior to 4.17.21 are vulnerable to Regular Expression Denial of Service (ReDoS) via the toNumber, trim and trimEnd functions.
WhiteSource Note: After conducting further research, WhiteSource has determined that CVE-2020-28500 only affects environments with versions 4.0.0 to 4.17.20 of Lodash.
<p>Publish Date: 2021-02-15
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-28500>CVE-2020-28500</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>5.3</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: None
- Availability Impact: Low
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2020-28500">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2020-28500</a></p>
<p>Release Date: 2021-02-15</p>
<p>Fix Resolution (lodash): 4.17.21</p>
<p>Direct dependency fix Resolution (grunt-contrib-watch): 1.0.1</p><p>Fix Resolution (lodash): 4.17.21</p>
<p>Direct dependency fix Resolution (grunt-contrib-uglify): 0.10.1</p><p>Fix Resolution (lodash): 4.17.21</p>
<p>Direct dependency fix Resolution (grunt-contrib-jshint): 0.12.0</p><p>Fix Resolution (lodash): 4.17.21</p>
<p>Direct dependency fix Resolution (grunt-contrib-watch): 1.0.1</p><p>Fix Resolution (lodash): 4.17.21</p>
<p>Direct dependency fix Resolution (grunt): 1.0.3</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
|
True
|
CVE-2020-28500 (Medium) detected in multiple libraries - ## CVE-2020-28500 - Medium Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Libraries - <b>lodash-1.0.2.tgz</b>, <b>lodash-3.10.1.tgz</b>, <b>lodash-3.7.0.tgz</b>, <b>lodash-2.4.2.tgz</b>, <b>lodash-0.9.2.tgz</b></p></summary>
<p>
<details><summary><b>lodash-1.0.2.tgz</b></p></summary>
<p>A utility library delivering consistency, customization, performance, and extras.</p>
<p>Library home page: <a href="https://registry.npmjs.org/lodash/-/lodash-1.0.2.tgz">https://registry.npmjs.org/lodash/-/lodash-1.0.2.tgz</a></p>
<p>Path to dependency file: /reveal.js-master/package.json</p>
<p>Path to vulnerable library: /reveal.js-master/node_modules/globule/node_modules/lodash/package.json</p>
<p>
Dependency Hierarchy:
- grunt-contrib-watch-0.6.1.tgz (Root Library)
- gaze-0.5.2.tgz
- globule-0.1.0.tgz
- :x: **lodash-1.0.2.tgz** (Vulnerable Library)
</details>
<details><summary><b>lodash-3.10.1.tgz</b></p></summary>
<p>The modern build of lodash modular utilities.</p>
<p>Library home page: <a href="https://registry.npmjs.org/lodash/-/lodash-3.10.1.tgz">https://registry.npmjs.org/lodash/-/lodash-3.10.1.tgz</a></p>
<p>Path to dependency file: /reveal.js-master/package.json</p>
<p>Path to vulnerable library: /reveal.js-master/node_modules/grunt-contrib-uglify/node_modules/lodash/package.json</p>
<p>
Dependency Hierarchy:
- grunt-contrib-uglify-0.9.2.tgz (Root Library)
- :x: **lodash-3.10.1.tgz** (Vulnerable Library)
</details>
<details><summary><b>lodash-3.7.0.tgz</b></p></summary>
<p>The modern build of lodash modular utilities.</p>
<p>Library home page: <a href="https://registry.npmjs.org/lodash/-/lodash-3.7.0.tgz">https://registry.npmjs.org/lodash/-/lodash-3.7.0.tgz</a></p>
<p>Path to dependency file: /reveal.js-master/package.json</p>
<p>Path to vulnerable library: /reveal.js-master/node_modules/jshint/node_modules/lodash/package.json</p>
<p>
Dependency Hierarchy:
- grunt-contrib-jshint-0.11.3.tgz (Root Library)
- jshint-2.8.0.tgz
- :x: **lodash-3.7.0.tgz** (Vulnerable Library)
</details>
<details><summary><b>lodash-2.4.2.tgz</b></p></summary>
<p>A utility library delivering consistency, customization, performance, & extras.</p>
<p>Library home page: <a href="https://registry.npmjs.org/lodash/-/lodash-2.4.2.tgz">https://registry.npmjs.org/lodash/-/lodash-2.4.2.tgz</a></p>
<p>Path to dependency file: /reveal.js-master/plugin/multiplex/package.json</p>
<p>Path to vulnerable library: /reveal.js-master/node_modules/lodash/package.json,/reveal.js-master/node_modules/lodash/package.json</p>
<p>
Dependency Hierarchy:
- grunt-contrib-watch-0.6.1.tgz (Root Library)
- :x: **lodash-2.4.2.tgz** (Vulnerable Library)
</details>
<details><summary><b>lodash-0.9.2.tgz</b></p></summary>
<p>A utility library delivering consistency, customization, performance, and extras.</p>
<p>Library home page: <a href="https://registry.npmjs.org/lodash/-/lodash-0.9.2.tgz">https://registry.npmjs.org/lodash/-/lodash-0.9.2.tgz</a></p>
<p>Path to dependency file: /reveal.js-master/package.json</p>
<p>Path to vulnerable library: /reveal.js-master/node_modules/grunt-legacy-util/node_modules/lodash/package.json</p>
<p>
Dependency Hierarchy:
- grunt-0.4.5.tgz (Root Library)
- :x: **lodash-0.9.2.tgz** (Vulnerable Library)
</details>
<p>Found in base branch: <b>master</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
Lodash versions prior to 4.17.21 are vulnerable to Regular Expression Denial of Service (ReDoS) via the toNumber, trim and trimEnd functions.
WhiteSource Note: After conducting further research, WhiteSource has determined that CVE-2020-28500 only affects environments with versions 4.0.0 to 4.17.20 of Lodash.
<p>Publish Date: 2021-02-15
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-28500>CVE-2020-28500</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>5.3</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: None
- Availability Impact: Low
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2020-28500">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2020-28500</a></p>
<p>Release Date: 2021-02-15</p>
<p>Fix Resolution (lodash): 4.17.21</p>
<p>Direct dependency fix Resolution (grunt-contrib-watch): 1.0.1</p><p>Fix Resolution (lodash): 4.17.21</p>
<p>Direct dependency fix Resolution (grunt-contrib-uglify): 0.10.1</p><p>Fix Resolution (lodash): 4.17.21</p>
<p>Direct dependency fix Resolution (grunt-contrib-jshint): 0.12.0</p><p>Fix Resolution (lodash): 4.17.21</p>
<p>Direct dependency fix Resolution (grunt-contrib-watch): 1.0.1</p><p>Fix Resolution (lodash): 4.17.21</p>
<p>Direct dependency fix Resolution (grunt): 1.0.3</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
|
non_process
|
cve medium detected in multiple libraries cve medium severity vulnerability vulnerable libraries lodash tgz lodash tgz lodash tgz lodash tgz lodash tgz lodash tgz a utility library delivering consistency customization performance and extras library home page a href path to dependency file reveal js master package json path to vulnerable library reveal js master node modules globule node modules lodash package json dependency hierarchy grunt contrib watch tgz root library gaze tgz globule tgz x lodash tgz vulnerable library lodash tgz the modern build of lodash modular utilities library home page a href path to dependency file reveal js master package json path to vulnerable library reveal js master node modules grunt contrib uglify node modules lodash package json dependency hierarchy grunt contrib uglify tgz root library x lodash tgz vulnerable library lodash tgz the modern build of lodash modular utilities library home page a href path to dependency file reveal js master package json path to vulnerable library reveal js master node modules jshint node modules lodash package json dependency hierarchy grunt contrib jshint tgz root library jshint tgz x lodash tgz vulnerable library lodash tgz a utility library delivering consistency customization performance extras library home page a href path to dependency file reveal js master plugin multiplex package json path to vulnerable library reveal js master node modules lodash package json reveal js master node modules lodash package json dependency hierarchy grunt contrib watch tgz root library x lodash tgz vulnerable library lodash tgz a utility library delivering consistency customization performance and extras library home page a href path to dependency file reveal js master package json path to vulnerable library reveal js master node modules grunt legacy util node modules lodash package json dependency hierarchy grunt tgz root library x lodash tgz vulnerable library found in base branch master vulnerability details 
lodash versions prior to are vulnerable to regular expression denial of service redos via the tonumber trim and trimend functions whitesource note after conducting further research whitesource has determined that cve only affects environments with versions to of lodash publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction none scope unchanged impact metrics confidentiality impact none integrity impact none availability impact low for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution lodash direct dependency fix resolution grunt contrib watch fix resolution lodash direct dependency fix resolution grunt contrib uglify fix resolution lodash direct dependency fix resolution grunt contrib jshint fix resolution lodash direct dependency fix resolution grunt contrib watch fix resolution lodash direct dependency fix resolution grunt step up your open source security game with whitesource
| 0
|
18,956
| 6,660,893,760
|
IssuesEvent
|
2017-10-02 04:48:53
|
Amber-MD/cmake-buildscripts
|
https://api.github.com/repos/Amber-MD/cmake-buildscripts
|
reopened
|
Error: Symbol ‘atompositions_fce’ at (1) has no IMPLICIT type
|
build error
|
I swear that I am not able compile cmake stuff successfully on my Mac and casegroup recently.
Only travis is working and you know it's painful to debug.
btw, please check below message and also check attached file (need to rename it to `report.tar`)
```
Error: Symbol ‘atompositions_fce’ at (1) has no IMPLICIT type
/Users/haichit/amber_git/amber/AmberTools/src/rism/amber_rism_interface.F90:1234:26:
atomPositions_fce => safemem_realloc(atomPositions_fce, 3, rism_3d%solute%numAtoms)
1
Error: Symbol ‘atompositions_fce’ at (1) has no IMPLICIT type
make[2]: *** [AmberTools/src/rism/CMakeFiles/sander_rism_interface.dir/amber_rism_interface.F90.o] Error 1
make[1]: *** [AmberTools/src/rism/CMakeFiles/sander_rism_interface.dir/all] Error 2
make: *** [all] Error 2
```
[report.txt](https://github.com/Amber-MD/cmake-buildscripts/files/1331319/report.txt)
|
1.0
|
Error: Symbol ‘atompositions_fce’ at (1) has no IMPLICIT type - I swear that I am not able compile cmake stuff successfully on my Mac and casegroup recently.
Only travis is working and you know it's painful to debug.
btw, please check below message and also check attached file (need to rename it to `report.tar`)
```
Error: Symbol ‘atompositions_fce’ at (1) has no IMPLICIT type
/Users/haichit/amber_git/amber/AmberTools/src/rism/amber_rism_interface.F90:1234:26:
atomPositions_fce => safemem_realloc(atomPositions_fce, 3, rism_3d%solute%numAtoms)
1
Error: Symbol ‘atompositions_fce’ at (1) has no IMPLICIT type
make[2]: *** [AmberTools/src/rism/CMakeFiles/sander_rism_interface.dir/amber_rism_interface.F90.o] Error 1
make[1]: *** [AmberTools/src/rism/CMakeFiles/sander_rism_interface.dir/all] Error 2
make: *** [all] Error 2
```
[report.txt](https://github.com/Amber-MD/cmake-buildscripts/files/1331319/report.txt)
|
non_process
|
error symbol ‘atompositions fce’ at has no implicit type i swear that i am not able compile cmake stuff successfully on my mac and casegroup recently only travis is working and you know it s painful to debug btw please check below message and also check attached file need to rename it to report tar error symbol ‘atompositions fce’ at has no implicit type users haichit amber git amber ambertools src rism amber rism interface atompositions fce safemem realloc atompositions fce rism solute numatoms error symbol ‘atompositions fce’ at has no implicit type make error make error make error
| 0
|
12,001
| 14,738,137,144
|
IssuesEvent
|
2021-01-07 03:52:00
|
kdjstudios/SABillingGitlab
|
https://api.github.com/repos/kdjstudios/SABillingGitlab
|
closed
|
Staged fees & billing cycle for SAHosted
|
anc-core anc-ops anc-process anp-important ant-bug ant-support
|
In GitLab by @kdjstudios on May 9, 2018, 09:30
**Submitted by:** "Michelle McKee" <michelle.mckee@answernet.com>
**Helpdesk:** http://www.servicedesk.answernet.com/profiles/ticket/2018-05-09-98864/conversation
**Server:** Internal (Both)
**Client/Site:** SA Hosted Clients
**Account:** NA
**Issue:**
I am trying to do my 5/1/18 billing cycle for SAHosted clients but when I click on the Staged Fees button the screen says “Staged fees and payments will be applied in 06/01/2018 Master billing cycle”. This should say “will be applied in 5/1/18 Master billing cycle.
The only thing I have done is to create a draft invoice for one of the clients and that draft has a 6/1/18 invoice date on it. It should be a 5/1/18 invoice date but I didn’t notice it until I had printed a copy of the draft. Can this draft invoice be deleted so that I can create one with the correct date?
|
1.0
|
Staged fees & billing cycle for SAHosted - In GitLab by @kdjstudios on May 9, 2018, 09:30
**Submitted by:** "Michelle McKee" <michelle.mckee@answernet.com>
**Helpdesk:** http://www.servicedesk.answernet.com/profiles/ticket/2018-05-09-98864/conversation
**Server:** Internal (Both)
**Client/Site:** SA Hosted Clients
**Account:** NA
**Issue:**
I am trying to do my 5/1/18 billing cycle for SAHosted clients but when I click on the Staged Fees button the screen says “Staged fees and payments will be applied in 06/01/2018 Master billing cycle”. This should say “will be applied in 5/1/18 Master billing cycle.
The only thing I have done is to create a draft invoice for one of the clients and that draft has a 6/1/18 invoice date on it. It should be a 5/1/18 invoice date but I didn’t notice it until I had printed a copy of the draft. Can this draft invoice be deleted so that I can create one with the correct date?
|
process
|
staged fees billing cycle for sahosted in gitlab by kdjstudios on may submitted by michelle mckee helpdesk server internal both client site sa hosted clients account na issue i am trying to do my billing cycle for sahosted clients but when i click on the staged fees button the screen says “staged fees and payments will be applied in master billing cycle” this should say “will be applied in master billing cycle the only thing i have done is to create a draft invoice for one of the clients and that draft has a invoice date on it it should be a invoice date but i didn’t notice it until i had printed a copy of the draft can this draft invoice be deleted so that i can create one with the correct date
| 1
|
19,168
| 11,162,468,668
|
IssuesEvent
|
2019-12-26 18:01:18
|
cityofaustin/atd-data-tech
|
https://api.github.com/repos/cityofaustin/atd-data-tech
|
closed
|
Compile custom CSS requirements for VZA Application
|
Impact: 2-Major Need: 1-Must Have Project: VZA App Service: Apps Type: Enhancement Workgroup: VZ
|
Document requirements with https://github.com/cityofaustin/atd-data-tech/issues/966.
- [ ] Need a big button to direct officers back to earlier page after recording violations for that contact.

|
1.0
|
Compile custom CSS requirements for VZA Application - Document requirements with https://github.com/cityofaustin/atd-data-tech/issues/966.
- [ ] Need a big button to direct officers back to earlier page after recording violations for that contact.

|
non_process
|
compile custom css requirements for vza application document requirements with need a big button to direct officers back to earlier page after recording violations for that contact
| 0
|
59,909
| 14,670,391,554
|
IssuesEvent
|
2020-12-30 04:42:12
|
fink/fink-distributions
|
https://api.github.com/repos/fink/fink-distributions
|
closed
|
Dependency Error in coot
|
missing (build) dependency
|
Have an error pop up when running the command "fink install coot"
Error: Can't resolve dependency "gcc5-shlibs (>= 5.1.0-2)" for package "coot-0.8.9-2"
(no matching packages/versions found)
Exiting with failure.
I'm on OS X 10.15.4
|
1.0
|
Dependency Error in coot - Have an error pop up when running the command "fink install coot"
Error: Can't resolve dependency "gcc5-shlibs (>= 5.1.0-2)" for package "coot-0.8.9-2"
(no matching packages/versions found)
Exiting with failure.
I'm on OS X 10.15.4
|
non_process
|
dependency error in coot have an error pop up when running the command fink install coot error can t resolve dependency shlibs for package coot no matching packages versions found exiting with failure i m on os x
| 0
|
8,526
| 11,704,673,351
|
IssuesEvent
|
2020-03-07 11:03:04
|
darktable-org/darktable
|
https://api.github.com/repos/darktable-org/darktable
|
closed
|
Color/Spot picker can be used outside picture area, with different unwanted results.
|
bug: pending difficulty: average priority: medium scope: image processing
|
darktable 3.0.0rc0+82~g4c6df92a0
**Describe the bug**
When using the color/spot picker you are able to use it outside the image area. This produces (severe) unwanted results.
Depending on the module you try this with it (three examples):
- freezes/crashes dt (exposure module)
- produces a black (or blue) image (filmic rgb module)
- produces a black screen (WB module with spot)
The last two can be recovered by resetting the module.
**To Reproduce**
Use a 'picker', any picker, and click/select outside the image area.
**Expected behavior**
I expect nothing to happen. Click/select outside the image area should be ignored. As it is in version 2.6
**Platform (please complete the following information):**
- OS: Linux Debian 10.1 (and also confirmed on up-to-date Gentoo)
- Version darktable 3.0.0rc0+82~g4c6df92a0
EDIT: words
|
1.0
|
Color/Spot picker can be used outside picture area, with different unwanted results. - darktable 3.0.0rc0+82~g4c6df92a0
**Describe the bug**
When using the color/spot picker you are able to use it outside the image area. This produces (severe) unwanted results.
Depending on the module you try this with it (three examples):
- freezes/crashes dt (exposure module)
- produces a black (or blue) image (filmic rgb module)
- produces a black screen (WB module with spot)
The last two can be recovered by resetting the module.
**To Reproduce**
Use a 'picker', any picker, and click/select outside the image area.
**Expected behavior**
I expect nothing to happen. Click/select outside the image area should be ignored. As it is in version 2.6
**Platform (please complete the following information):**
- OS: Linux Debian 10.1 (and also confirmed on up-to-date Gentoo)
- Version darktable 3.0.0rc0+82~g4c6df92a0
EDIT: words
|
process
|
color spot picker can be used outside picture area with different unwanted results darktable describe the bug when using the color spot picker you are able to use it outside the image area this produces severe unwanted results depending on the module you try this with it three examples freezes crashes dt exposure module produces a black or blue image filmic rgb module produces a black screen wb module with spot the last two can be recovered by resetting the module to reproduce use a picker any picker and click select outside the image area expected behavior i expect nothing to happen click select outside the image area should be ignored as it is in version platform please complete the following information os linux debian and also confirmed on up to date gentoo version darktable edit words
| 1
|
1,751
| 4,445,773,729
|
IssuesEvent
|
2016-08-20 08:01:06
|
acmcarther/space_coop
|
https://api.github.com/repos/acmcarther/space_coop
|
opened
|
Add a post-test jenkins hook for dropping a comment indicating the current dependency order
|
medium P1 process question
|
The title says most of it. It might be useful to peer reviewers to see the diff between the dependency ordering in [master] and the dependency order in [head].
Perhaps we could broadcast to all open prs a message asking for a sync if their dependencies are now in a different order?
|
1.0
|
Add a post-test jenkins hook for dropping a comment indicating the current dependency order - The title says most of it. It might be useful to peer reviewers to see the diff between the dependency ordering in [master] and the dependency order in [head].
Perhaps we could broadcast to all open prs a message asking for a sync if their dependencies are now in a different order?
|
process
|
add a post test jenkins hook for dropping a comment indicating the current dependency order the title says most of it it might be useful to peer reviewers to see the diff between the dependency ordering in and the dependency order in perhaps we could broadcast to all open prs a message asking for a sync if their dependencies are now in a different order
| 1
|
2,079
| 4,893,326,540
|
IssuesEvent
|
2016-11-18 22:43:54
|
nodejs/node
|
https://api.github.com/repos/nodejs/node
|
closed
|
child_process documentation for stdio
|
child_process doc
|
The [child_process documentation](https://github.com/nodejs/node/blob/master/doc/api/child_process.md) for `execFileSync`, `execSync` and `spawnSync` notes that `options.stdio` is an array.
But it seems it can also be a string, similar to the `stdio` options given to `child_process.spawn`.
Actually `execFileSync`, `execSync` and `spawnSync` all ends [up here](https://github.com/nodejs/node/blob/master/lib/child_process.js#L428), where the options are accepted as string or array. See https://github.com/nodejs/node/blob/master/lib/internal/child_process.js#L758
|
1.0
|
child_process documentation for stdio - The [child_process documentation](https://github.com/nodejs/node/blob/master/doc/api/child_process.md) for `execFileSync`, `execSync` and `spawnSync` notes that `options.stdio` is an array.
But it seems it can also be a string, similar to the `stdio` options given to `child_process.spawn`.
Actually `execFileSync`, `execSync` and `spawnSync` all ends [up here](https://github.com/nodejs/node/blob/master/lib/child_process.js#L428), where the options are accepted as string or array. See https://github.com/nodejs/node/blob/master/lib/internal/child_process.js#L758
|
process
|
child process documentation for stdio the for execfilesync execsync and spawnsync notes that options stdio is an array but it seems it can also be a string similar to the stdio options given to child process spawn actually execfilesync execsync and spawnsync all ends where the options are accepted as string or array see
| 1
|
1,006
| 3,471,163,317
|
IssuesEvent
|
2015-12-23 13:44:21
|
refugeetech/platform
|
https://api.github.com/repos/refugeetech/platform
|
closed
|
Set up Git development branch and train team on Git Flow
|
in progress Open Process
|
All development should take place in sub branches of the development branch. The project will use a [git-flow](https://danielkummer.github.io/git-flow-cheatsheet/) convention to name branches and collaborate.
# User story
```
As a developer
I would like to work on the project code in a distributed fashion
so that I can easily maintain my own working branch
```
```
As a developer/project manager
I would like to follow a conventional Git workflow
so that I know we are following common practices
```
# Task
* [x] Set up development branch
* [ ] Train team members on Git Flow
|
1.0
|
Set up Git development branch and train team on Git Flow - All development should take place in sub branches of the development branch. The project will use a [git-flow](https://danielkummer.github.io/git-flow-cheatsheet/) convention to name branches and collaborate.
# User story
```
As a developer
I would like to work on the project code in a distributed fashion
so that I can easily maintain my own working branch
```
```
As a developer/project manager
I would like to follow a conventional Git workflow
so that I know we are following common practices
```
# Task
* [x] Set up development branch
* [ ] Train team members on Git Flow
|
process
|
set up git development branch and train team on git flow all development should take place in sub branches of the development branch the project will use a convention to name branches and collaborate user story as a developer i would like to work on the project code in a distributed fashion so that i can easily maintain my own working branch as a developer project manager i would like to follow a conventional git workflow so that i know we are following common practices task set up development branch train team members on git flow
| 1
|
294,560
| 9,036,941,907
|
IssuesEvent
|
2019-02-09 04:47:59
|
sethballantyne/Game-Demos
|
https://api.github.com/repos/sethballantyne/Game-Demos
|
closed
|
Add the missing horizontal line on the editor grid
|
Plexis-Editor bug priority-high
|

There's supposed to be a line at the bottom there. :P
|
1.0
|
Add the missing horizontal line on the editor grid - 
There's supposed to be a line at the bottom there. :P
|
non_process
|
add the missing horizontal line on the editor grid there s supposed to be a line at the bottom there p
| 0
|
19,932
| 26,399,782,743
|
IssuesEvent
|
2023-01-12 23:31:30
|
joeynmt/joeynmt
|
https://api.github.com/repos/joeynmt/joeynmt
|
closed
|
"AutocastCPU only supports Bfloat16" error when following rnn_reverse tutorial
|
work in process
|
**Describe the issue**
Using a CPU only version of Pytorch.
Attempting to follow the joeynmt tutorial (the rnn_reverse task).
Executing the step:
```python3 -m joeynmt train configs/reverse.yaml```
produces a warning about autocasting only support bfloat16 and then the following error (full warning and error trace bellow):
```RuntimeError: Currently, AutocastCPU only support Bfloat16 as the autocast_cpu_dtype```
**System :**
- OS: Ubuntu 22.04.1 (Running in VirtualBox)
- Python 3.10.6
- Pytorch 1.12.1 (CPU only)
**Steps to reproduce the behavior:**
1. Follow install instructions [https://joeynmt.readthedocs.io/en/latest/install.html](https://joeynmt.readthedocs.io/en/latest/install.html)
2. Follow steps of tutorial [https://joeynmt.readthedocs.io/en/latest/tutorial.html](https://joeynmt.readthedocs.io/en/latest/tutorial.html)
**Logged output**
```
2022-10-27 15:33:46,845 - INFO - joeynmt.training - EPOCH 1
/home/dev/Documents/dev/jmnt/lib.python3.10/site-packages/torch/amp/autocast_mode.py:210: UserWarning: In CPU autocast, but the target dtype is not supported. Disabling autocast.
CPU Autocast only supports dtype of torch.bfloat16 currently.
warnings.warn(error_message)
Traceback (most recent call last):
File "/usr/lib/python3.10/runpy.py", line 197, in _run_module_as_main
return _run_code(code, main_globals, None,
File "/usr/lib/python3.10/runpy.py", line 87, in _run_code
exec(code, run_globals)
File "/home/dev/Documents/dev/joeynmt/__main__.py", line 64, in <module>
main()
File "/home/dev/Documents/dev/joeynmt/__main__.py", line 44, in main
train(cfg_file=args.config_path, skip_test=args.skip_test)
File "C:\Users\m6urn\Documents\school\abbreviations-generation\jnmt\lib\site-packages\joeynmt\training.py", line 829, in train
trainer.train_and_validate(train_data=train_data, valid_data=dev_data)
File "/home/dev/Documents/dev/joeynmt/joeynmt/training.py", line 460, in train_and_validate
norm_batch_loss = self._train_step(batch)
File "/home/dev/Documents/dev/joeynmt/joeynmt/training.py", line 573, in _train_step
with torch.autocast(device_type=self.device.type,
File "/home/dev/Documents/dev/jmnt/lib/python3.10/site-packages/torch/amp//autocast_mode.py", line 234, in __enter__
torch.set_autocast_cpu_dtype(self.fast_dtype) # type: ignore[arg-type]
RuntimeError: Currently, AutocastCPU only support Bfloat16 as the autocast_cpu_dtype
```
|
1.0
|
"AutocastCPU only supports Bfloat16" error when following rnn_reverse tutorial - **Describe the issue**
Using a CPU only version of Pytorch.
Attempting to follow the joeynmt tutorial (the rnn_reverse task).
Executing the step:
```python3 -m joeynmt train configs/reverse.yaml```
produces a warning about autocasting only support bfloat16 and then the following error (full warning and error trace bellow):
```RuntimeError: Currently, AutocastCPU only support Bfloat16 as the autocast_cpu_dtype```
**System :**
- OS: Ubuntu 22.04.1 (Running in VirtualBox)
- Python 3.10.6
- Pytorch 1.12.1 (CPU only)
**Steps to reproduce the behavior:**
1. Follow install instructions [https://joeynmt.readthedocs.io/en/latest/install.html](https://joeynmt.readthedocs.io/en/latest/install.html)
2. Follow steps of tutorial [https://joeynmt.readthedocs.io/en/latest/tutorial.html](https://joeynmt.readthedocs.io/en/latest/tutorial.html)
**Logged output**
```
2022-10-27 15:33:46,845 - INFO - joeynmt.training - EPOCH 1
/home/dev/Documents/dev/jmnt/lib.python3.10/site-packages/torch/amp/autocast_mode.py:210: UserWarning: In CPU autocast, but the target dtype is not supported. Disabling autocast.
CPU Autocast only supports dtype of torch.bfloat16 currently.
warnings.warn(error_message)
Traceback (most recent call last):
File "/usr/lib/python3.10/runpy.py", line 197, in _run_module_as_main
return _run_code(code, main_globals, None,
File "/usr/lib/python3.10/runpy.py", line 87, in _run_code
exec(code, run_globals)
File "/home/dev/Documents/dev/joeynmt/__main__.py", line 64, in <module>
main()
File "/home/dev/Documents/dev/joeynmt/__main__.py", line 44, in main
train(cfg_file=args.config_path, skip_test=args.skip_test)
File "C:\Users\m6urn\Documents\school\abbreviations-generation\jnmt\lib\site-packages\joeynmt\training.py", line 829, in train
trainer.train_and_validate(train_data=train_data, valid_data=dev_data)
File "/home/dev/Documents/dev/joeynmt/joeynmt/training.py", line 460, in train_and_validate
norm_batch_loss = self._train_step(batch)
File "/home/dev/Documents/dev/joeynmt/joeynmt/training.py", line 573, in _train_step
with torch.autocast(device_type=self.device.type,
File "/home/dev/Documents/dev/jmnt/lib/python3.10/site-packages/torch/amp//autocast_mode.py", line 234, in __enter__
torch.set_autocast_cpu_dtype(self.fast_dtype) # type: ignore[arg-type]
RuntimeError: Currently, AutocastCPU only support Bfloat16 as the autocast_cpu_dtype
```
|
process
|
autocastcpu only supports error when following rnn reverse tutorial describe the issue using a cpu only version of pytorch attempting to follow the joeynmt tutorial the rnn reverse task executing the step m joeynmt train configs reverse yaml produces a warning about autocasting only support and then the following error full warning and error trace bellow runtimeerror currently autocastcpu only support as the autocast cpu dtype system os ubuntu running in virtualbox python pytorch cpu only steps to reproduce the behavior follow install instructions follow steps of tutorial logged output info joeynmt training epoch home dev documents dev jmnt lib site packages torch amp autocast mode py userwarning in cpu autocast but the target dtype is not supported disabling autocast cpu autocast only supports dtype of torch currently warnings warn error message traceback most recent call last file usr lib runpy py line in run module as main return run code code main globals none file usr lib runpy py line in run code exec code run globals file home dev documents dev joeynmt main py line in main file home dev documents dev joeynmt main py line in main train cfg file args config path skip test args skip test file c users documents school abbreviations generation jnmt lib site packages joeynmt training py line in train trainer train and validate train data train data valid data dev data file home dev documents dev joeynmt joeynmt training py line in train and validate norm batch loss self train step batch file home dev documents dev joeynmt joeynmt training py line in train step with torch autocast device type self device type file home dev documents dev jmnt lib site packages torch amp autocast mode py line in enter torch set autocast cpu dtype self fast dtype type ignore runtimeerror currently autocastcpu only support as the autocast cpu dtype
| 1
|
4,215
| 7,177,069,559
|
IssuesEvent
|
2018-01-31 12:19:46
|
twsswt/bug_buddy_jira_plugin
|
https://api.github.com/repos/twsswt/bug_buddy_jira_plugin
|
opened
|
Automate .jar generation for data-gatherer
|
priority:high processs enhancement
|
Generating .jars (binaries) is currently a manual process. It would be nice if it was automated
|
1.0
|
Automate .jar generation for data-gatherer - Generating .jars (binaries) is currently a manual process. It would be nice if it was automated
|
process
|
automate jar generation for data gatherer generating jars binaries is currently a manual process it would be nice if it was automated
| 1
|
34,414
| 6,331,756,372
|
IssuesEvent
|
2017-07-26 10:47:28
|
opensistemas-hub/osbrain
|
https://api.github.com/repos/opensistemas-hub/osbrain
|
closed
|
Closing connections
|
documentation enhancement
|
It is not docummented (we should document it).
@Flood1993 should we add a method (i.e.: `agent.close(self, alias)`) to ease things for the end-user?
|
1.0
|
Closing connections - It is not docummented (we should document it).
@Flood1993 should we add a method (i.e.: `agent.close(self, alias)`) to ease things for the end-user?
|
non_process
|
closing connections it is not docummented we should document it should we add a method i e agent close self alias to ease things for the end user
| 0
|
139,211
| 18,846,340,178
|
IssuesEvent
|
2021-11-11 15:20:31
|
finos/plexus-interop
|
https://api.github.com/repos/finos/plexus-interop
|
opened
|
CVE-2021-23807 (High) detected in jsonpointer-4.0.1.tgz
|
security vulnerability
|
## CVE-2021-23807 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>jsonpointer-4.0.1.tgz</b></p></summary>
<p>Simple JSON Addressing.</p>
<p>Library home page: <a href="https://registry.npmjs.org/jsonpointer/-/jsonpointer-4.0.1.tgz">https://registry.npmjs.org/jsonpointer/-/jsonpointer-4.0.1.tgz</a></p>
<p>Path to dependency file: plexus-interop/dsl/interop-lang-vscode/package.json</p>
<p>Path to vulnerable library: plexus-interop/dsl/interop-lang-vscode/node_modules/jsonpointer/package.json</p>
<p>
Dependency Hierarchy:
- vscode-1.1.14.tgz (Root Library)
- gulp-remote-src-0.4.3.tgz
- request-2.79.0.tgz
- har-validator-2.0.6.tgz
- is-my-json-valid-2.17.2.tgz
- :x: **jsonpointer-4.0.1.tgz** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/finos/plexus-interop/commit/9bb34b17909843691b3584fd99303ec8b5dc6983">9bb34b17909843691b3584fd99303ec8b5dc6983</a></p>
<p>Found in base branch: <b>master</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
This affects the package jsonpointer before 5.0.0. A type confusion vulnerability can lead to a bypass of a previous Prototype Pollution fix when the pointer components are arrays.
<p>Publish Date: 2021-11-03
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2021-23807>CVE-2021-23807</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>9.8</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: High
- Integrity Impact: High
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2021-23807">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2021-23807</a></p>
<p>Release Date: 2021-11-03</p>
<p>Fix Resolution: jsonpointer - 5.0.0</p>
</p>
</details>
<p></p>
<!-- <REMEDIATE>{"isOpenPROnVulnerability":false,"isPackageBased":true,"isDefaultBranch":true,"packages":[{"packageType":"javascript/Node.js","packageName":"jsonpointer","packageVersion":"4.0.1","packageFilePaths":["/dsl/interop-lang-vscode/package.json"],"isTransitiveDependency":true,"dependencyTree":"vscode:1.1.14;gulp-remote-src:0.4.3;request:2.79.0;har-validator:2.0.6;is-my-json-valid:2.17.2;jsonpointer:4.0.1","isMinimumFixVersionAvailable":true,"minimumFixVersion":"jsonpointer - 5.0.0"}],"baseBranches":["master"],"vulnerabilityIdentifier":"CVE-2021-23807","vulnerabilityDetails":"This affects the package jsonpointer before 5.0.0. A type confusion vulnerability can lead to a bypass of a previous Prototype Pollution fix when the pointer components are arrays.","vulnerabilityUrl":"https://vuln.whitesourcesoftware.com/vulnerability/CVE-2021-23807","cvss3Severity":"high","cvss3Score":"9.8","cvss3Metrics":{"A":"High","AC":"Low","PR":"None","S":"Unchanged","C":"High","UI":"None","AV":"Network","I":"High"},"extraData":{}}</REMEDIATE> -->
|
True
|
CVE-2021-23807 (High) detected in jsonpointer-4.0.1.tgz - ## CVE-2021-23807 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>jsonpointer-4.0.1.tgz</b></p></summary>
<p>Simple JSON Addressing.</p>
<p>Library home page: <a href="https://registry.npmjs.org/jsonpointer/-/jsonpointer-4.0.1.tgz">https://registry.npmjs.org/jsonpointer/-/jsonpointer-4.0.1.tgz</a></p>
<p>Path to dependency file: plexus-interop/dsl/interop-lang-vscode/package.json</p>
<p>Path to vulnerable library: plexus-interop/dsl/interop-lang-vscode/node_modules/jsonpointer/package.json</p>
<p>
Dependency Hierarchy:
- vscode-1.1.14.tgz (Root Library)
- gulp-remote-src-0.4.3.tgz
- request-2.79.0.tgz
- har-validator-2.0.6.tgz
- is-my-json-valid-2.17.2.tgz
- :x: **jsonpointer-4.0.1.tgz** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/finos/plexus-interop/commit/9bb34b17909843691b3584fd99303ec8b5dc6983">9bb34b17909843691b3584fd99303ec8b5dc6983</a></p>
<p>Found in base branch: <b>master</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
This affects the package jsonpointer before 5.0.0. A type confusion vulnerability can lead to a bypass of a previous Prototype Pollution fix when the pointer components are arrays.
<p>Publish Date: 2021-11-03
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2021-23807>CVE-2021-23807</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>9.8</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: High
- Integrity Impact: High
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2021-23807">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2021-23807</a></p>
<p>Release Date: 2021-11-03</p>
<p>Fix Resolution: jsonpointer - 5.0.0</p>
</p>
</details>
<p></p>
<!-- <REMEDIATE>{"isOpenPROnVulnerability":false,"isPackageBased":true,"isDefaultBranch":true,"packages":[{"packageType":"javascript/Node.js","packageName":"jsonpointer","packageVersion":"4.0.1","packageFilePaths":["/dsl/interop-lang-vscode/package.json"],"isTransitiveDependency":true,"dependencyTree":"vscode:1.1.14;gulp-remote-src:0.4.3;request:2.79.0;har-validator:2.0.6;is-my-json-valid:2.17.2;jsonpointer:4.0.1","isMinimumFixVersionAvailable":true,"minimumFixVersion":"jsonpointer - 5.0.0"}],"baseBranches":["master"],"vulnerabilityIdentifier":"CVE-2021-23807","vulnerabilityDetails":"This affects the package jsonpointer before 5.0.0. A type confusion vulnerability can lead to a bypass of a previous Prototype Pollution fix when the pointer components are arrays.","vulnerabilityUrl":"https://vuln.whitesourcesoftware.com/vulnerability/CVE-2021-23807","cvss3Severity":"high","cvss3Score":"9.8","cvss3Metrics":{"A":"High","AC":"Low","PR":"None","S":"Unchanged","C":"High","UI":"None","AV":"Network","I":"High"},"extraData":{}}</REMEDIATE> -->
|
non_process
|
cve high detected in jsonpointer tgz cve high severity vulnerability vulnerable library jsonpointer tgz simple json addressing library home page a href path to dependency file plexus interop dsl interop lang vscode package json path to vulnerable library plexus interop dsl interop lang vscode node modules jsonpointer package json dependency hierarchy vscode tgz root library gulp remote src tgz request tgz har validator tgz is my json valid tgz x jsonpointer tgz vulnerable library found in head commit a href found in base branch master vulnerability details this affects the package jsonpointer before a type confusion vulnerability can lead to a bypass of a previous prototype pollution fix when the pointer components are arrays publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction none scope unchanged impact metrics confidentiality impact high integrity impact high availability impact high for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution jsonpointer isopenpronvulnerability false ispackagebased true isdefaultbranch true packages istransitivedependency true dependencytree vscode gulp remote src request har validator is my json valid jsonpointer isminimumfixversionavailable true minimumfixversion jsonpointer basebranches vulnerabilityidentifier cve vulnerabilitydetails this affects the package jsonpointer before a type confusion vulnerability can lead to a bypass of a previous prototype pollution fix when the pointer components are arrays vulnerabilityurl
| 0
|
197,826
| 14,944,412,033
|
IssuesEvent
|
2021-01-26 01:24:02
|
nodebots/nodebots-interchange
|
https://api.github.com/repos/nodebots/nodebots-interchange
|
closed
|
Refactor meta instructions from istanbul to jest
|
bug test
|
Remove the istanbul directives and put jest ones in instead.
|
1.0
|
Refactor meta instructions from istanbul to jest - Remove the istanbul directives and put jest ones in instead.
|
non_process
|
refactor meta instructions from istanbul to jest remove the istanbul directives and put jest ones in instead
| 0
|
202,781
| 23,103,665,370
|
IssuesEvent
|
2022-07-27 06:47:02
|
elastic/integrations
|
https://api.github.com/repos/elastic/integrations
|
closed
|
[Cisco Umbrella] Proxy Support
|
enhancement Team:Security-External Integrations Integration:Cisco Umbrella
|
Users who want to run a proxy with our Umbrella currently are currently blocked from doing so, as we don't expose the proxy settings within the integrations config. Can we add proxy_url to the settings [here](https://github.com/elastic/integrations/blob/main/packages/cisco_umbrella/data_stream/log/agent/stream/aws-s3.yml.hbs), and ensure the config is expose within the UI also.
|
True
|
[Cisco Umbrella] Proxy Support - Users who want to run a proxy with our Umbrella currently are currently blocked from doing so, as we don't expose the proxy settings within the integrations config. Can we add proxy_url to the settings [here](https://github.com/elastic/integrations/blob/main/packages/cisco_umbrella/data_stream/log/agent/stream/aws-s3.yml.hbs), and ensure the config is expose within the UI also.
|
non_process
|
proxy support users who want to run a proxy with our umbrella currently are currently blocked from doing so as we don t expose the proxy settings within the integrations config can we add proxy url to the settings and ensure the config is expose within the ui also
| 0
|
5,092
| 7,878,563,122
|
IssuesEvent
|
2018-06-26 10:38:22
|
RustyPanda/zoobot
|
https://api.github.com/repos/RustyPanda/zoobot
|
opened
|
Add Gaussian or Poisson noise
|
Preprocessing enhancement
|
Estimate background and add a little fake noise.
See:
https://www.tensorflow.org/api_docs/python/tf/contrib/distributions/percentile (quick background estimate)
https://www.tensorflow.org/api_docs/python/tf/contrib/distributions/Poisson (inject appropriate noise)
|
1.0
|
Add Gaussian or Poisson noise - Estimate background and add a little fake noise.
See:
https://www.tensorflow.org/api_docs/python/tf/contrib/distributions/percentile (quick background estimate)
https://www.tensorflow.org/api_docs/python/tf/contrib/distributions/Poisson (inject appropriate noise)
|
process
|
add gaussian or poisson noise estimate background and add a little fake noise see quick background estimate inject appropriate noise
| 1
|
6,747
| 9,873,452,457
|
IssuesEvent
|
2019-06-22 14:37:30
|
aiidateam/aiida_core
|
https://api.github.com/repos/aiidateam/aiida_core
|
opened
|
Remove process function docstring as default for node description
|
aiida-core 1.x priority/nice-to-have topic/processes type/enhancement
|
The idea was nice, but if one writes a decently sized docstring, this content will be stored each time the function is run. Besides this is already stored in the repository, which is a better place for it as it does not need to be queryable
|
1.0
|
Remove process function docstring as default for node description - The idea was nice, but if one writes a decently sized docstring, this content will be stored each time the function is run. Besides this is already stored in the repository, which is a better place for it as it does not need to be queryable
|
process
|
remove process function docstring as default for node description the idea was nice but if one writes a decently sized docstring this content will be stored each time the function is run besides this is already stored in the repository which is a better place for it as it does not need to be queryable
| 1
|
2,668
| 5,447,092,502
|
IssuesEvent
|
2017-03-07 12:35:27
|
DynareTeam/dynare
|
https://api.github.com/repos/DynareTeam/dynare
|
opened
|
replace equation cross references with those done for json
|
preprocessor
|
Replace original version done in 4976b2b3359688e9b1d000bdf05f95d5b6071ac3, f546368252 with those done in 0ccc82300c3046d159554c24027ded09a93b687e
The changes in 0ccc82300c3046d159554c24027ded09a93b687e contain information on lags whereas the original ones don't
|
1.0
|
replace equation cross references with those done for json - Replace original version done in 4976b2b3359688e9b1d000bdf05f95d5b6071ac3, f546368252 with those done in 0ccc82300c3046d159554c24027ded09a93b687e
The changes in 0ccc82300c3046d159554c24027ded09a93b687e contain information on lags whereas the original ones don't
|
process
|
replace equation cross references with those done for json replace original version done in with those done in the changes in contain information on lags whereas the original ones don t
| 1
|
22,100
| 30,630,307,471
|
IssuesEvent
|
2023-07-24 14:10:01
|
h4sh5/pypi-auto-scanner
|
https://api.github.com/repos/h4sh5/pypi-auto-scanner
|
opened
|
hpcflow-new2 0.2.0a66 has 2 GuardDog issues
|
guarddog exec-base64 silent-process-execution
|
https://pypi.org/project/hpcflow-new2
https://inspector.pypi.io/project/hpcflow-new2
```{
"dependency": "hpcflow-new2",
"version": "0.2.0a66",
"result": {
"issues": 2,
"errors": {},
"results": {
"exec-base64": [
{
"location": "hpcflow_new2-0.2.0a66/hpcflow/sdk/submission/jobscript.py:998",
"code": " init_proc = subprocess.Popen(\n args=args,\n cwd=str(self.workflow.path),\n creationflags=subprocess.CREATE_NO_WINDOW,\n )",
"message": "This package contains a call to the `eval` function with a `base64` encoded string as argument.\nThis is a common method used to hide a malicious payload in a module as static analysis will not decode the\nstring.\n"
}
],
"silent-process-execution": [
{
"location": "hpcflow_new2-0.2.0a66/hpcflow/sdk/helper/helper.py:112",
"code": " proc = subprocess.Popen(\n args=args,\n stdin=subprocess.DEVNULL,\n stdout=subprocess.DEVNULL,\n stderr=subprocess.DEVNULL,\n **kwargs,\n )",
"message": "This package is silently executing an external binary, redirecting stdout, stderr and stdin to /dev/null"
}
]
},
"path": "/tmp/tmp6docl_oh/hpcflow-new2"
}
}```
|
1.0
|
hpcflow-new2 0.2.0a66 has 2 GuardDog issues - https://pypi.org/project/hpcflow-new2
https://inspector.pypi.io/project/hpcflow-new2
```{
"dependency": "hpcflow-new2",
"version": "0.2.0a66",
"result": {
"issues": 2,
"errors": {},
"results": {
"exec-base64": [
{
"location": "hpcflow_new2-0.2.0a66/hpcflow/sdk/submission/jobscript.py:998",
"code": " init_proc = subprocess.Popen(\n args=args,\n cwd=str(self.workflow.path),\n creationflags=subprocess.CREATE_NO_WINDOW,\n )",
"message": "This package contains a call to the `eval` function with a `base64` encoded string as argument.\nThis is a common method used to hide a malicious payload in a module as static analysis will not decode the\nstring.\n"
}
],
"silent-process-execution": [
{
"location": "hpcflow_new2-0.2.0a66/hpcflow/sdk/helper/helper.py:112",
"code": " proc = subprocess.Popen(\n args=args,\n stdin=subprocess.DEVNULL,\n stdout=subprocess.DEVNULL,\n stderr=subprocess.DEVNULL,\n **kwargs,\n )",
"message": "This package is silently executing an external binary, redirecting stdout, stderr and stdin to /dev/null"
}
]
},
"path": "/tmp/tmp6docl_oh/hpcflow-new2"
}
}```
|
process
|
hpcflow has guarddog issues dependency hpcflow version result issues errors results exec location hpcflow hpcflow sdk submission jobscript py code init proc subprocess popen n args args n cwd str self workflow path n creationflags subprocess create no window n message this package contains a call to the eval function with a encoded string as argument nthis is a common method used to hide a malicious payload in a module as static analysis will not decode the nstring n silent process execution location hpcflow hpcflow sdk helper helper py code proc subprocess popen n args args n stdin subprocess devnull n stdout subprocess devnull n stderr subprocess devnull n kwargs n message this package is silently executing an external binary redirecting stdout stderr and stdin to dev null path tmp oh hpcflow
| 1
|
14,359
| 17,380,772,403
|
IssuesEvent
|
2021-07-31 17:06:35
|
darktable-org/darktable
|
https://api.github.com/repos/darktable-org/darktable
|
closed
|
Crash in dt 3.7.0 when moving mouse in tone equalizer
|
bug: pending reproduce: confirmed scope: image processing
|
Recently I started getting consistent crashes with the latest dev versions of dt.
To reproduce:
- open an image in darkroom
- activate tone equalizer
- move mouse over the image (pointer turns into crosshair)
- move wheel to make a random adjustment (this step may not be necessary)
- move pointer towards the top edge of the image
- when pointer crosses top border dt freezes and dumps core
I get this behavior consistently with most images, both raw and jpg.
Also I tried with a fresh configdir and it still happens.
dt: 3.7.0~git586.ededff1666 provided by openSUSE
distro: openSUSE Tumbleweed
graphic: Integrated Intel
openCL: no
desktop: Plasma/KDE
[darktable_bt_PVFI70.txt](https://github.com/darktable-org/darktable/files/6911092/darktable_bt_PVFI70.txt)
|
1.0
|
Crash in dt 3.7.0 when moving mouse in tone equalizer - Recently I started getting consistent crashes with the latest dev versions of dt.
To reproduce:
- open an image in darkroom
- activate tone equalizer
- move mouse over the image (pointer turns into crosshair)
- move wheel to make a random adjustment (this step may not be necessary)
- move pointer towards the top edge of the image
- when pointer crosses top border dt freezes and dumps core
I get this behavior consistently with most images, both raw and jpg.
Also I tried with a fresh configdir and it still happens.
dt: 3.7.0~git586.ededff1666 provided by openSUSE
distro: openSUSE Tumbleweed
graphic: Integrated Intel
openCL: no
desktop: Plasma/KDE
[darktable_bt_PVFI70.txt](https://github.com/darktable-org/darktable/files/6911092/darktable_bt_PVFI70.txt)
|
process
|
crash in dt when moving mouse in tone equalizer recently i started getting consistent crashes with the latest dev versions of dt to reproduce open an image in darkroom activate tone equalizer move mouse over the image pointer turns into crosshair move wheel to make a random adjustment this step may not be necessary move pointer towards the top edge of the image when pointer crosses top border dt freezes and dumps core i get this behavior consistently with most images both raw and jpg also i tried with a fresh configdir and it still happens dt provided by opensuse distro opensuse tumbleweed graphic integrated intel opencl no desktop plasma kde
| 1
|
2,555
| 5,311,494,658
|
IssuesEvent
|
2017-02-13 04:03:34
|
spootTheLousy/saguaro
|
https://api.github.com/repos/spootTheLousy/saguaro
|
closed
|
Regist doesn't parse newlines
|
Post/text processing Regist
|
With latest master build, line breaks do not appear in comments.
Demo: https://saguaro.w4ch.xyz/s/
|
1.0
|
Regist doesn't parse newlines - With latest master build, line breaks do not appear in comments.
Demo: https://saguaro.w4ch.xyz/s/
|
process
|
regist doesn t parse newlines with latest master build line breaks do not appear in comments demo
| 1
|
16,032
| 20,188,245,402
|
IssuesEvent
|
2022-02-11 01:21:16
|
savitamittalmsft/WAS-SEC-TEST
|
https://api.github.com/repos/savitamittalmsft/WAS-SEC-TEST
|
opened
|
Continuously assess and monitor compliance
|
WARP-Import WAF FEB 2021 Security Performance and Scalability Capacity Management Processes Security & Compliance Compliance
|
<a href="https://docs.microsoft.com/azure/security-center/security-center-compliance-dashboard#assess-your-regulatory-compliance">Continuously assess and monitor compliance</a>
<p><b>Why Consider This?</b></p>
Continuous monitoring and assessing workload increases the overall security and compliance of your workload in Azure
<p><b>Context</b></p>
<p><b>Suggested Actions</b></p>
<p><span>Use Azure Defender (Azure Security Center) to"nbsp; continuously assess and monitor your compliance score. </span></p>
<p><b>Learn More</b></p>
<p><a href="https://docs.microsoft.com/en-us/azure/security-center/security-center-compliance-dashboard#assess-your-regulatory-compliance" target="_blank"><span>https://docs.microsoft.com/en-us/azure/security-center/security-center-compliance-dashboard#assess-your-regulatory-compliance</span></a><span /></p>
|
1.0
|
Continuously assess and monitor compliance - <a href="https://docs.microsoft.com/azure/security-center/security-center-compliance-dashboard#assess-your-regulatory-compliance">Continuously assess and monitor compliance</a>
<p><b>Why Consider This?</b></p>
Continuous monitoring and assessing workload increases the overall security and compliance of your workload in Azure
<p><b>Context</b></p>
<p><b>Suggested Actions</b></p>
<p><span>Use Azure Defender (Azure Security Center) to"nbsp; continuously assess and monitor your compliance score. </span></p>
<p><b>Learn More</b></p>
<p><a href="https://docs.microsoft.com/en-us/azure/security-center/security-center-compliance-dashboard#assess-your-regulatory-compliance" target="_blank"><span>https://docs.microsoft.com/en-us/azure/security-center/security-center-compliance-dashboard#assess-your-regulatory-compliance</span></a><span /></p>
|
process
|
continuously assess and monitor compliance why consider this continuous monitoring and assessing workload increases the overall security and compliance of your workload in azure context suggested actions use azure defender azure security center to nbsp continuously assess and monitor your compliance score learn more
| 1
|
30,513
| 7,208,190,604
|
IssuesEvent
|
2018-02-07 01:40:46
|
WayofTime/BloodMagic
|
https://api.github.com/repos/WayofTime/BloodMagic
|
closed
|
(1.12.2) Crash After Trying to Use Teleposer
|
1.12 bug code complete
|
Spawned in a teleposer sigil in creative mode, bound it to myself, placed a teleposer block, right clicked the sigil on the block to link the two, moved a small distance away, used right-click to teleport, game crashed.
Crashlog once again: https://pastebin.com/hhQ7W6D2
I suppose this is to be expected from an item that has a big 'WIP' in the guide book page about it, and I did see a mention of a rewrite for the teleposer in the roadmap anyway.
- BloodMagic: 1.12.2-2.2.0-82
- Minecraft: 1.12.2
- Forge: 14.23.1.2611
|
1.0
|
(1.12.2) Crash After Trying to Use Teleposer - Spawned in a teleposer sigil in creative mode, bound it to myself, placed a teleposer block, right clicked the sigil on the block to link the two, moved a small distance away, used right-click to teleport, game crashed.
Crashlog once again: https://pastebin.com/hhQ7W6D2
I suppose this is to be expected from an item that has a big 'WIP' in the guide book page about it, and I did see a mention of a rewrite for the teleposer in the roadmap anyway.
- BloodMagic: 1.12.2-2.2.0-82
- Minecraft: 1.12.2
- Forge: 14.23.1.2611
|
non_process
|
crash after trying to use teleposer spawned in a teleposer sigil in creative mode bound it to myself placed a teleposer block right clicked the sigil on the block to link the two moved a small distance away used right click to teleport game crashed crashlog once again i suppose this is to be expected from an item that has a big wip in the guide book page about it and i did see a mention of a rewrite for the teleposer in the roadmap anyway bloodmagic minecraft forge
| 0
|
5,826
| 8,664,259,485
|
IssuesEvent
|
2018-11-28 19:39:11
|
bitshares/bitshares-community-ui
|
https://api.github.com/repos/bitshares/bitshares-community-ui
|
closed
|
Dropdown Component
|
P1 process ui
|
Create a `Dropdown` and place it into `Header` that should look like this:

No sub-menus (just one list). Should receive list a menu-items as props and $emit item's name on click. On mobile should take all available width.
|
1.0
|
Dropdown Component - Create a `Dropdown` and place it into `Header` that should look like this:

No sub-menus (just one list). Should receive list a menu-items as props and $emit item's name on click. On mobile should take all available width.
|
process
|
dropdown component create a dropdown and place it into header that should look like this no sub menus just one list should receive list a menu items as props and emit item s name on click on mobile should take all available width
| 1
|
353
| 2,793,534,889
|
IssuesEvent
|
2015-05-11 11:44:50
|
ecodistrict/IDSSDashboard
|
https://api.github.com/repos/ecodistrict/IDSSDashboard
|
closed
|
Qualitative KPI's
|
form feedback 09102014 process step: analyse problem question
|
KPI’s are only quantitative? What about unquantifiable KPI’s? How to bring in qualitative information?
|
1.0
|
Qualitative KPI's - KPI’s are only quantitative? What about unquantifiable KPI’s? How to bring in qualitative information?
|
process
|
qualitative kpi s kpi’s are only quantitative what about unquantifiable kpi’s how to bring in qualitative information
| 1
|
16,641
| 21,707,264,497
|
IssuesEvent
|
2022-05-10 10:46:09
|
sjmog/smartflix
|
https://api.github.com/repos/sjmog/smartflix
|
opened
|
Scheduled Enrichment
|
Rails/Rake 04-background-processing Rails/Scheduled tasks cron
|
We've just built a tested show route for shows, that enriches show data in the background in a secure way.
Now we're going to schedule these enrichment requests, so they run in the background periodically and keep our database up-to-date.
In this challenge, we’ll run a worker every day to go through all the records and update them. To define the frequency we need to use a **scheduler**.
> There are many ways to define a schedule, but we recommend you get familiar with the near-ubiquitous **cron**.
## To complete this challenge, you will need to
- [ ] Create a new Sidekiq worker that updates all show records using data from the OMDb API.
- [ ] Install a library of your choice to schedule periodic jobs.
- [ ] Configure a periodic job to update all show records at 7am each day.
- [ ] Deploy the application to production – including the scheduled task.
## Tips
- Don't forget to write tests!
- `cron` is an old tool, so it's a bit obscure. [crontab guru](https://crontab.guru/) is a great tool to generate cron schedules.
- Consider using [Sidekiq scheduler](https://github.com/moove-it/sidekiq-scheduler), [Sidekiq cron](https://github.com/ondrejbartas/sidekiq-cron), [Simple Scheduler](https://github.com/simplymadeapps/simple_scheduler) and/or [Whenever](https://github.com/javan/whenever). Or check out the [Ruby Toolbox](https://www.ruby-toolbox.com/categories/scheduling) for more information!
|
1.0
|
Scheduled Enrichment - We've just built a tested show route for shows, that enriches show data in the background in a secure way.
Now we're going to schedule these enrichment requests, so they run in the background periodically and keep our database up-to-date.
In this challenge, we’ll run a worker every day to go through all the records and update them. To define the frequency we need to use a **scheduler**.
> There are many ways to define a schedule, but we recommend you get familiar with the near-ubiquitous **cron**.
## To complete this challenge, you will need to
- [ ] Create a new Sidekiq worker that updates all show records using data from the OMDb API.
- [ ] Install a library of your choice to schedule periodic jobs.
- [ ] Configure a periodic job to update all show records at 7am each day.
- [ ] Deploy the application to production – including the scheduled task.
## Tips
- Don't forget to write tests!
- `cron` is an old tool, so it's a bit obscure. [crontab guru](https://crontab.guru/) is a great tool to generate cron schedules.
- Consider using [Sidekiq scheduler](https://github.com/moove-it/sidekiq-scheduler), [Sidekiq cron](https://github.com/ondrejbartas/sidekiq-cron), [Simple Scheduler](https://github.com/simplymadeapps/simple_scheduler) and/or [Whenever](https://github.com/javan/whenever). Or check out the [Ruby Toolbox](https://www.ruby-toolbox.com/categories/scheduling) for more information!
|
process
|
scheduled enrichment we ve just built a tested show route for shows that enriches show data in the background in a secure way now we re going to schedule these enrichment requests so they run in the background periodically and keep our database up to date in this challenge we’ll run a worker every day to go through all the records and update them to define the frequency we need to use a scheduler there are many ways to define a schedule but we recommend you get familiar with the near ubiquitous cron to complete this challenge you will need to create a new sidekiq worker that updates all show records using data from the omdb api install a library of your choice to schedule periodic jobs configure a periodic job to update all show records at each day deploy the application to production – including the scheduled task tips don t forget to write tests cron is an old tool so it s a bit obscure is a great tool to generate cron schedules consider using and or or check out the for more information
| 1
|
271,733
| 29,659,368,853
|
IssuesEvent
|
2023-06-10 01:22:29
|
Trinadh465/device_renesas_kernel_AOSP10_r33_CVE-2021-33034
|
https://api.github.com/repos/Trinadh465/device_renesas_kernel_AOSP10_r33_CVE-2021-33034
|
closed
|
CVE-2020-35508 (Medium) detected in linuxlinux-4.19.239 - autoclosed
|
Mend: dependency security vulnerability
|
## CVE-2020-35508 - Medium Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>linuxlinux-4.19.239</b></p></summary>
<p>
<p>The Linux Kernel</p>
<p>Library home page: <a href=https://mirrors.edge.kernel.org/pub/linux/kernel/v4.x/?wsslib=linux>https://mirrors.edge.kernel.org/pub/linux/kernel/v4.x/?wsslib=linux</a></p>
<p>Found in HEAD commit: <a href="https://github.com/Trinadh465/device_renesas_kernel_AOSP10_r33_CVE-2021-33034/commit/19525e8c58fe9ba0d7cb0f7a1a87d31d30380de6">19525e8c58fe9ba0d7cb0f7a1a87d31d30380de6</a></p>
<p>Found in base branch: <b>master</b></p></p>
</details>
</p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Source Files (1)</summary>
<p></p>
<p>
</p>
</details>
<p></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png?' width=19 height=20> Vulnerability Details</summary>
<p>
A flaw possibility of race condition and incorrect initialization of the process id was found in the Linux kernel child/parent process identification handling while filtering signal handlers. A local attacker is able to abuse this flaw to bypass checks to send any signal to a privileged process.
<p>Publish Date: 2021-03-26
<p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2020-35508>CVE-2020-35508</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>4.5</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Local
- Attack Complexity: High
- Privileges Required: Low
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: Low
- Integrity Impact: Low
- Availability Impact: Low
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://www.linuxkernelcves.com/cves/CVE-2020-35508">https://www.linuxkernelcves.com/cves/CVE-2020-35508</a></p>
<p>Release Date: 2021-03-26</p>
<p>Fix Resolution: v4.4.242, v4.9.242, v4.14.205, v4.19.156, v5.4.76, v5.9.7, v5.10-rc3</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with Mend [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
|
True
|
CVE-2020-35508 (Medium) detected in linuxlinux-4.19.239 - autoclosed - ## CVE-2020-35508 - Medium Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>linuxlinux-4.19.239</b></p></summary>
<p>
<p>The Linux Kernel</p>
<p>Library home page: <a href=https://mirrors.edge.kernel.org/pub/linux/kernel/v4.x/?wsslib=linux>https://mirrors.edge.kernel.org/pub/linux/kernel/v4.x/?wsslib=linux</a></p>
<p>Found in HEAD commit: <a href="https://github.com/Trinadh465/device_renesas_kernel_AOSP10_r33_CVE-2021-33034/commit/19525e8c58fe9ba0d7cb0f7a1a87d31d30380de6">19525e8c58fe9ba0d7cb0f7a1a87d31d30380de6</a></p>
<p>Found in base branch: <b>master</b></p></p>
</details>
</p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Source Files (1)</summary>
<p></p>
<p>
</p>
</details>
<p></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png?' width=19 height=20> Vulnerability Details</summary>
<p>
A flaw possibility of race condition and incorrect initialization of the process id was found in the Linux kernel child/parent process identification handling while filtering signal handlers. A local attacker is able to abuse this flaw to bypass checks to send any signal to a privileged process.
<p>Publish Date: 2021-03-26
<p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2020-35508>CVE-2020-35508</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>4.5</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Local
- Attack Complexity: High
- Privileges Required: Low
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: Low
- Integrity Impact: Low
- Availability Impact: Low
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://www.linuxkernelcves.com/cves/CVE-2020-35508">https://www.linuxkernelcves.com/cves/CVE-2020-35508</a></p>
<p>Release Date: 2021-03-26</p>
<p>Fix Resolution: v4.4.242, v4.9.242, v4.14.205, v4.19.156, v5.4.76, v5.9.7, v5.10-rc3</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with Mend [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
|
non_process
|
cve medium detected in linuxlinux autoclosed cve medium severity vulnerability vulnerable library linuxlinux the linux kernel library home page a href found in head commit a href found in base branch master vulnerable source files vulnerability details a flaw possibility of race condition and incorrect initialization of the process id was found in the linux kernel child parent process identification handling while filtering signal handlers a local attacker is able to abuse this flaw to bypass checks to send any signal to a privileged process publish date url a href cvss score details base score metrics exploitability metrics attack vector local attack complexity high privileges required low user interaction none scope unchanged impact metrics confidentiality impact low integrity impact low availability impact low for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution step up your open source security game with mend
| 0
|
303,454
| 26,208,419,897
|
IssuesEvent
|
2023-01-04 02:24:11
|
SETI/pds-oops
|
https://api.github.com/repos/SETI/pds-oops
|
opened
|
polymath/polynomial.py is missing unit tests
|
Priority 4 Useful B-Polymath A-Enhancement B-Tests Effort 2 Medium
|
The unit test file `unit_tests/test_polynomial.py` does not have any tests in it. This is the only major part of polymath that is not covered by unit tests.
|
1.0
|
polymath/polynomial.py is missing unit tests - The unit test file `unit_tests/test_polynomial.py` does not have any tests in it. This is the only major part of polymath that is not covered by unit tests.
|
non_process
|
polymath polynomial py is missing unit tests the unit test file unit tests test polynomial py does not have any tests in it this is the only major part of polymath that is not covered by unit tests
| 0
|
340,363
| 24,651,397,031
|
IssuesEvent
|
2022-10-17 18:58:31
|
appliedAI-Initiative/pyDVL
|
https://api.github.com/repos/appliedAI-Initiative/pyDVL
|
opened
|
Use nbpshinx gallery
|
documentation enhancement good first issue
|
To show a nice gallery of examples instead of a boring list.
There seems to be a problem with `sphinx-math-dollar` though. If we add something like
```rst
.. nbgallery::
:caption: Examples
:name: examples-gallery
:glob:
*
```
to `examples/index.rst`, then the build fails with:
```
Exception occurred:
File "(...)/pydvl/.tox/docs-dev/lib/python3.8/site-packages/docutils/nodes.py", line 2044, in unknown_visit
raise NotImplementedError(
NotImplementedError: <class 'sphinx_math_dollar.extension.MathDollarReplacer'> visiting unknown node type: GalleryToc
```
|
1.0
|
Use nbpshinx gallery - To show a nice gallery of examples instead of a boring list.
There seems to be a problem with `sphinx-math-dollar` though. If we add something like
```rst
.. nbgallery::
:caption: Examples
:name: examples-gallery
:glob:
*
```
to `examples/index.rst`, then the build fails with:
```
Exception occurred:
File "(...)/pydvl/.tox/docs-dev/lib/python3.8/site-packages/docutils/nodes.py", line 2044, in unknown_visit
raise NotImplementedError(
NotImplementedError: <class 'sphinx_math_dollar.extension.MathDollarReplacer'> visiting unknown node type: GalleryToc
```
|
non_process
|
use nbpshinx gallery to show a nice gallery of examples instead of a boring list there seems to be a problem with sphinx math dollar though if we add something like rst nbgallery caption examples name examples gallery glob to examples index rst then the build fails with exception occurred file pydvl tox docs dev lib site packages docutils nodes py line in unknown visit raise notimplementederror notimplementederror visiting unknown node type gallerytoc
| 0
|
2,223
| 5,072,300,652
|
IssuesEvent
|
2016-12-26 21:32:38
|
metabase/metabase
|
https://api.github.com/repos/metabase/metabase
|
opened
|
CumulativeSum and CumulativeCount do not work with post-aggregation math expressions
|
Query Processor
|
`CumulativeSum(Total) + 1` gives the error `No matching clause: :cumulative-sum`
`CumultaiveCount + 1` gives the error `Assert failed: Aggregations of type ':cumulative-count' must specify a field. (= aggregation-type :count)`
|
1.0
|
CumulativeSum and CumulativeCount do not work with post-aggregation math expressions - `CumulativeSum(Total) + 1` gives the error `No matching clause: :cumulative-sum`
`CumultaiveCount + 1` gives the error `Assert failed: Aggregations of type ':cumulative-count' must specify a field. (= aggregation-type :count)`
|
process
|
cumulativesum and cumulativecount do not work with post aggregation math expressions cumulativesum total gives the error no matching clause cumulative sum cumultaivecount gives the error assert failed aggregations of type cumulative count must specify a field aggregation type count
| 1
|
21,928
| 30,446,559,002
|
IssuesEvent
|
2023-07-15 18:48:28
|
h4sh5/pypi-auto-scanner
|
https://api.github.com/repos/h4sh5/pypi-auto-scanner
|
opened
|
pyutils 0.0.1b8 has 2 GuardDog issues
|
guarddog typosquatting silent-process-execution
|
https://pypi.org/project/pyutils
https://inspector.pypi.io/project/pyutils
```{
"dependency": "pyutils",
"version": "0.0.1b8",
"result": {
"issues": 2,
"errors": {},
"results": {
"typosquatting": "This package closely ressembles the following package names, and might be a typosquatting attempt: pytils, python-utils",
"silent-process-execution": [
{
"location": "pyutils/exec_utils.py/pyutils/exec_utils.py:204",
"code": " subproc = subprocess.Popen(\n args,\n stdin=subprocess.DEVNULL,\n stdout=subprocess.DEVNULL,\n stderr=subprocess.DEVNULL,\n )",
"message": "This package is silently executing an external binary, redirecting stdout, stderr and stdin to /dev/null"
}
]
},
"path": "/tmp/tmp_et5qdoe/pyutils"
}
}```
|
1.0
|
pyutils 0.0.1b8 has 2 GuardDog issues - https://pypi.org/project/pyutils
https://inspector.pypi.io/project/pyutils
```{
"dependency": "pyutils",
"version": "0.0.1b8",
"result": {
"issues": 2,
"errors": {},
"results": {
"typosquatting": "This package closely ressembles the following package names, and might be a typosquatting attempt: pytils, python-utils",
"silent-process-execution": [
{
"location": "pyutils/exec_utils.py/pyutils/exec_utils.py:204",
"code": " subproc = subprocess.Popen(\n args,\n stdin=subprocess.DEVNULL,\n stdout=subprocess.DEVNULL,\n stderr=subprocess.DEVNULL,\n )",
"message": "This package is silently executing an external binary, redirecting stdout, stderr and stdin to /dev/null"
}
]
},
"path": "/tmp/tmp_et5qdoe/pyutils"
}
}```
|
process
|
pyutils has guarddog issues dependency pyutils version result issues errors results typosquatting this package closely ressembles the following package names and might be a typosquatting attempt pytils python utils silent process execution location pyutils exec utils py pyutils exec utils py code subproc subprocess popen n args n stdin subprocess devnull n stdout subprocess devnull n stderr subprocess devnull n message this package is silently executing an external binary redirecting stdout stderr and stdin to dev null path tmp tmp pyutils
| 1
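The `silent-process-execution` finding in this row flags a specific shape: `subprocess.Popen` with stdin, stdout, and stderr all redirected to `/dev/null`, so the child runs with no visible I/O. A minimal sketch of that pattern (running a harmless no-op command rather than an external binary):

```python
# Sketch of the pattern GuardDog's silent-process-execution rule flags:
# a process spawned with all three standard streams sent to DEVNULL.
# Benign here (the child just runs `pass`), but the same shape inside a
# package's install hook is what the scanner treats as suspicious.
import subprocess
import sys

proc = subprocess.Popen(
    [sys.executable, '-c', 'pass'],  # harmless stand-in command
    stdin=subprocess.DEVNULL,
    stdout=subprocess.DEVNULL,
    stderr=subprocess.DEVNULL,
)
code = proc.wait()
print(code)  # 0 on success
```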
|
23,700
| 3,869,003,630
|
IssuesEvent
|
2016-04-10 10:22:16
|
ubuntudesign/ubuntu-vanilla-theme
|
https://api.github.com/repos/ubuntudesign/ubuntu-vanilla-theme
|
closed
|
IoT > Partners: in quote row the quote is missing space before closing quotation mark
|
design-review
|
This also happens further down the page in the quotes in what our partners are saying
|
1.0
|
IoT > Partners: in quote row the quote is missing space before closing quotation mark - This also happens further down the page in the quotes in what our partners are saying
|
non_process
|
iot partners in quote row the quote is missing space before closing quotation mark this also happens further down the page in the quotes in what our partners are saying
| 0
|
14,346
| 17,371,648,529
|
IssuesEvent
|
2021-07-30 14:44:36
|
MicrosoftDocs/azure-devops-docs
|
https://api.github.com/repos/MicrosoftDocs/azure-devops-docs
|
closed
|
Invalid YAML code
|
devops-cicd-process/tech devops/prod doc-bug
|
The example code provided isn't even valid. Everything below `- deployment` should be indented one level. The provided example gives 3 errors:
- /azure-pipelines.yml (Line: 9, Col: 1): Unexpected value 'displayName'
- /azure-pipelines.yml (Line: 10, Col: 1): Unexpected value 'environment'
- /azure-pipelines.yml (Line: 13, Col: 1): Unexpected value 'strategy'
https://docs.microsoft.com/en-us/azure/devops/pipelines/process/environments-virtual-machines?view=azure-devops#use-virtual-machine-in-pipelines
```yaml
trigger:
- main
pool:
vmImage: ubuntu-latest
jobs:
- deployment: VMDeploy
displayName: Deploy to VM
environment:
name: ContosoDeploy
resourceType: VirtualMachine
strategy:
runOnce:
deploy:
steps:
- script: echo "Hello world $(date)"
```
---
#### Document Details
⚠ *Do not edit this section. It is required for docs.microsoft.com ➟ GitHub issue linking.*
* ID: 91d0d31f-81ee-c024-db7e-daddbf525f71
* Version Independent ID: 330f1649-386c-d0aa-5f96-b8343a1480d3
* Content: [Environment - Virtual machine resource - Azure Pipelines](https://docs.microsoft.com/en-us/azure/devops/pipelines/process/environments-virtual-machines?view=azure-devops)
* Content Source: [docs/pipelines/process/environments-virtual-machines.md](https://github.com/MicrosoftDocs/azure-devops-docs/blob/master/docs/pipelines/process/environments-virtual-machines.md)
* Product: **devops**
* Technology: **devops-cicd-process**
* GitHub Login: @juliakm
* Microsoft Alias: **jukullam**
|
1.0
|
Invalid YAML code -
The example code provided isn't even valid. Everything below `- deployment` should be indented one level. The provided example gives 3 errors:
- /azure-pipelines.yml (Line: 9, Col: 1): Unexpected value 'displayName'
- /azure-pipelines.yml (Line: 10, Col: 1): Unexpected value 'environment'
- /azure-pipelines.yml (Line: 13, Col: 1): Unexpected value 'strategy'
https://docs.microsoft.com/en-us/azure/devops/pipelines/process/environments-virtual-machines?view=azure-devops#use-virtual-machine-in-pipelines
```yaml
trigger:
- main
pool:
vmImage: ubuntu-latest
jobs:
- deployment: VMDeploy
displayName: Deploy to VM
environment:
name: ContosoDeploy
resourceType: VirtualMachine
strategy:
runOnce:
deploy:
steps:
- script: echo "Hello world $(date)"
```
---
#### Document Details
⚠ *Do not edit this section. It is required for docs.microsoft.com ➟ GitHub issue linking.*
* ID: 91d0d31f-81ee-c024-db7e-daddbf525f71
* Version Independent ID: 330f1649-386c-d0aa-5f96-b8343a1480d3
* Content: [Environment - Virtual machine resource - Azure Pipelines](https://docs.microsoft.com/en-us/azure/devops/pipelines/process/environments-virtual-machines?view=azure-devops)
* Content Source: [docs/pipelines/process/environments-virtual-machines.md](https://github.com/MicrosoftDocs/azure-devops-docs/blob/master/docs/pipelines/process/environments-virtual-machines.md)
* Product: **devops**
* Technology: **devops-cicd-process**
* GitHub Login: @juliakm
* Microsoft Alias: **jukullam**
|
process
|
invalid yaml code the example code provided isn t even valid everything below deployment should be indented one level the provided example gives errors azure pipelines yml line col unexpected value displayname azure pipelines yml line col unexpected value environment azure pipelines yml line col unexpected value strategy yaml trigger main pool vmimage ubuntu latest jobs deployment vmdeploy displayname deploy to vm environment name contosodeploy resourcetype virtualmachine strategy runonce deploy steps script echo hello world date document details ⚠ do not edit this section it is required for docs microsoft com ➟ github issue linking id version independent id content content source product devops technology devops cicd process github login juliakm microsoft alias jukullam
| 1
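Following the report's own note that everything below `- deployment` should be indented one level, the corrected pipeline would look like this (indentation inferred from standard Azure Pipelines YAML syntax, where `displayName`, `environment`, and `strategy` are properties of the deployment job):

```yaml
trigger:
- main
pool:
  vmImage: ubuntu-latest
jobs:
- deployment: VMDeploy
  displayName: Deploy to VM
  environment:
    name: ContosoDeploy
    resourceType: VirtualMachine
  strategy:
    runOnce:
      deploy:
        steps:
        - script: echo "Hello world $(date)"
```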
|