Unnamed: 0 int64 1 832k | id float64 2.49B 32.1B | type stringclasses 1 value | created_at stringlengths 19 19 | repo stringlengths 7 112 | repo_url stringlengths 36 141 | action stringclasses 3 values | title stringlengths 3 438 | labels stringlengths 4 308 | body stringlengths 7 254k | index stringclasses 7 values | text_combine stringlengths 96 254k | label stringclasses 2 values | text stringlengths 96 246k | binary_label int64 0 1 |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
177,259 | 6,576,202,431 | IssuesEvent | 2017-09-11 18:54:36 | vmware/vic-product | https://api.github.com/repos/vmware/vic-product | opened | Version Harbor/Admiral scripts | component/ova priority/medium | **User Statement:**
As a person I want things the OVA build depends on to be versioned
**Details:**
https://github.com/vmware/harbor/issues/3191
https://github.com/vmware/admiral/issues/201
https://github.com/vmware/admiral/issues/202
**Acceptance Criteria:**
- [ ] Update build to use versioned artifacts | 1.0 | Version Harbor/Admiral scripts - **User Statement:**
As a person I want things the OVA build depends on to be versioned
**Details:**
https://github.com/vmware/harbor/issues/3191
https://github.com/vmware/admiral/issues/201
https://github.com/vmware/admiral/issues/202
**Acceptance Criteria:**
- [ ] Update build to use versioned artifacts | non_main | version harbor admiral scripts user statement as a person i want things the ova build depends on to be versioned details acceptance criteria update build to use versioned artifacts | 0 |
153,828 | 13,528,429,539 | IssuesEvent | 2020-09-15 16:42:34 | BuildForSDGCohort2/Team-180-a-Backend | https://api.github.com/repos/BuildForSDGCohort2/Team-180-a-Backend | closed | Student Registration | documentation enhancement | **What Functionality / Service / Endpoint would you like to add?**
Enabling student registration
**Describe this Functionality / Service / Endpoint**
To create a secure pathway for students to register on the E-learning platform using jsonwebtoken and other authentication technologies.
**Duplicate prevention**
No duplicates. | 1.0 | Student Registration - **What Functionality / Service / Endpoint would you like to add?**
Enabling student registration
**Describe this Functionality / Service / Endpoint**
To create a secure pathway for students to register on the E-learning platform using jsonwebtoken and other authentication technologies.
**Duplicate prevention**
No duplicates. | non_main | student registration what functionality service endpoint would you like to add enabling student registration describe this functionality service endpoint to create a secure pathway for students to register on the e learning platform using jsonwebtoken and other authentication technologies duplicate prevention no duplicates | 0 |
273,549 | 20,797,056,414 | IssuesEvent | 2022-03-17 10:18:55 | microcks/microcks | https://api.github.com/repos/microcks/microcks | closed | Fix API documentation on operation additional headers when testing | component/tests component/documentation | It appears that the API documentation of Microcks has an error on the `POST /api/test` operation.
As it is described, you may want to invoke the API using request body like:
```json
{
"serviceId":"My APIs:v0.0.1",
"testEndpoint":"http://app:9090/api",
"runnerType":"OPEN_API_SCHEMA",
"timeout":10000,
"filteredOperations":[],
"operationsHeaders":{
"globals":[
{"name":"LoginId","values":["LoginUser","UserAlias"]}
]
}
}
```
but the `values` of additional headers are in fact simple strings where multiple values should be comma separated.
Here's the correct payload:
```json
{
"serviceId":"My APIs:v0.0.1",
"testEndpoint":"http://app:9090/api",
"runnerType":"OPEN_API_SCHEMA",
"timeout":10000,
"filteredOperations":[],
"operationsHeaders":{
"globals":[
{"name":"LoginId","values": "LoginUser,UserAlias"}
]
}
}
``` | 1.0 | Fix API documentation on operation additional headers when testing - It appears that the API documentation of Microcks has an error on the `POST /api/test` operation.
As it is described, you may want to invoke the API using request body like:
```json
{
"serviceId":"My APIs:v0.0.1",
"testEndpoint":"http://app:9090/api",
"runnerType":"OPEN_API_SCHEMA",
"timeout":10000,
"filteredOperations":[],
"operationsHeaders":{
"globals":[
{"name":"LoginId","values":["LoginUser","UserAlias"]}
]
}
}
```
but the `values` of additional headers are in fact simple strings where multiple values should be comma separated.
Here's the correct payload:
```json
{
"serviceId":"My APIs:v0.0.1",
"testEndpoint":"http://app:9090/api",
"runnerType":"OPEN_API_SCHEMA",
"timeout":10000,
"filteredOperations":[],
"operationsHeaders":{
"globals":[
{"name":"LoginId","values": "LoginUser,UserAlias"}
]
}
}
``` | non_main | fix api documentation on operation additional headers when testing it appears that the api documentation of microcks has an error on the post api test operation as it is described you may want to invoke the api using request body like json serviceid my apis testendpoint runnertype open api schema timeout filteredoperations operationsheaders globals name loginid values but the values of additional headers are in fact simple strings where multiple values should be comma separated here s the correct payload json serviceid my apis testendpoint runnertype open api schema timeout filteredoperations operationsheaders globals name loginid values loginuser useralias | 0 |
4,077 | 19,268,883,886 | IssuesEvent | 2021-12-10 01:23:20 | aws/aws-sam-cli | https://api.github.com/repos/aws/aws-sam-cli | closed | Working examples of using CORS with Lambdas that respond to HttpAPI events | type/question maintainer/need-response | ### Describe your idea/feature/enhancement
I wish the documentation had a working example of a Lambda function with CORS. Seeing as a Lambda responding to an HttpAPI event _CANNOT_ function without CORS being enabled, this is kind of essential to get a working example.
### Proposal
There should be two examples:
1. One where the `Type: Api` Event for a function supports CORS.
2. Another example of the resources/configuration required to explicitly declare an working HttpAPI with the appropriate CORS/permissions to invoke the Lambda.
Things to consider:
1. Not sure this will require updates to the [SAM Spec](https://github.com/awslabs/serverless-application-model), but given the challenges I encountered trying to get this to work, most likely.
### Additional Details
How about having the Sessions with SAM guys talk about this?
| True | Working examples of using CORS with Lambdas that respond to HttpAPI events - ### Describe your idea/feature/enhancement
I wish the documentation had a working example of a Lambda function with CORS. Seeing as a Lambda responding to an HttpAPI event _CANNOT_ function without CORS being enabled, this is kind of essential to get a working example.
### Proposal
There should be two examples:
1. One where the `Type: Api` Event for a function supports CORS.
2. Another example of the resources/configuration required to explicitly declare an working HttpAPI with the appropriate CORS/permissions to invoke the Lambda.
Things to consider:
1. Not sure this will require updates to the [SAM Spec](https://github.com/awslabs/serverless-application-model), but given the challenges I encountered trying to get this to work, most likely.
### Additional Details
How about having the Sessions with SAM guys talk about this?
| main | working examples of using cors with lambdas that respond to httpapi events describe your idea feature enhancement i wish the documentation had a working example of a lambda function with cors seeing as a lambda responding to an httpapi event cannot function without cors being enabled this is kind of essential to get a working example proposal there should be two examples one where the type api event for a function supports cors another example of the resources configuration required to explicitly declare an working httpapi with the appropriate cors permissions to invoke the lambda things to consider not sure this will require updates to the but given the challenges i encountered trying to get this to work most likely additional details how about having the sessions with sam guys talk about this | 1 |
56,947 | 13,956,715,256 | IssuesEvent | 2020-10-24 02:36:14 | GoogleCloudPlatform/java-docs-samples | https://api.github.com/repos/GoogleCloudPlatform/java-docs-samples | opened | The build failed | buildcop: issue priority: p1 type: bug | This test failed!
To configure my behavior, see [the Build Cop Bot documentation](https://github.com/googleapis/repo-automation-bots/tree/master/packages/buildcop).
If I'm commenting on this issue too often, add the `buildcop: quiet` label and
I will stop commenting.
---
commit: b58810a3ce99213ec8278a97a494d3d4a7a69460
buildURL: [Build Status](https://source.cloud.google.com/results/invocations/e37285c8-c6ab-4eaa-b76d-75fe0d0dfbce), [Sponge](http://sponge2/e37285c8-c6ab-4eaa-b76d-75fe0d0dfbce)
status: failed
<details><summary>Test output</summary><br><pre>com.google.cloud.spanner.SpannerException:
RESOURCE_EXHAUSTED: com.google.api.gax.rpc.ResourceExhaustedException: io.grpc.StatusRuntimeException: RESOURCE_EXHAUSTED: Quota exceeded for quota metric 'Administrative requests' and limit 'Administrative requests per minute' of service 'spanner.googleapis.com' for consumer 'project_number:779844219229'.
links {
description: "Google developer console API key"
url: "https://console.developers.google.com/project/779844219229/apiui/credential"
}
at com.example.spanner.AsyncExamplesIT.dropTestDatabase(AsyncExamplesIT.java:111)
Caused by: java.util.concurrent.ExecutionException:
com.google.api.gax.rpc.ResourceExhaustedException: io.grpc.StatusRuntimeException: RESOURCE_EXHAUSTED: Quota exceeded for quota metric 'Administrative requests' and limit 'Administrative requests per minute' of service 'spanner.googleapis.com' for consumer 'project_number:779844219229'.
links {
description: "Google developer console API key"
url: "https://console.developers.google.com/project/779844219229/apiui/credential"
}
at com.example.spanner.AsyncExamplesIT.dropTestDatabase(AsyncExamplesIT.java:111)
Caused by: com.google.api.gax.rpc.ResourceExhaustedException:
io.grpc.StatusRuntimeException: RESOURCE_EXHAUSTED: Quota exceeded for quota metric 'Administrative requests' and limit 'Administrative requests per minute' of service 'spanner.googleapis.com' for consumer 'project_number:779844219229'.
links {
description: "Google developer console API key"
url: "https://console.developers.google.com/project/779844219229/apiui/credential"
}
Caused by: io.grpc.StatusRuntimeException:
RESOURCE_EXHAUSTED: Quota exceeded for quota metric 'Administrative requests' and limit 'Administrative requests per minute' of service 'spanner.googleapis.com' for consumer 'project_number:779844219229'.
links {
description: "Google developer console API key"
url: "https://console.developers.google.com/project/779844219229/apiui/credential"
}
</pre></details> | 1.0 | The build failed - This test failed!
To configure my behavior, see [the Build Cop Bot documentation](https://github.com/googleapis/repo-automation-bots/tree/master/packages/buildcop).
If I'm commenting on this issue too often, add the `buildcop: quiet` label and
I will stop commenting.
---
commit: b58810a3ce99213ec8278a97a494d3d4a7a69460
buildURL: [Build Status](https://source.cloud.google.com/results/invocations/e37285c8-c6ab-4eaa-b76d-75fe0d0dfbce), [Sponge](http://sponge2/e37285c8-c6ab-4eaa-b76d-75fe0d0dfbce)
status: failed
<details><summary>Test output</summary><br><pre>com.google.cloud.spanner.SpannerException:
RESOURCE_EXHAUSTED: com.google.api.gax.rpc.ResourceExhaustedException: io.grpc.StatusRuntimeException: RESOURCE_EXHAUSTED: Quota exceeded for quota metric 'Administrative requests' and limit 'Administrative requests per minute' of service 'spanner.googleapis.com' for consumer 'project_number:779844219229'.
links {
description: "Google developer console API key"
url: "https://console.developers.google.com/project/779844219229/apiui/credential"
}
at com.example.spanner.AsyncExamplesIT.dropTestDatabase(AsyncExamplesIT.java:111)
Caused by: java.util.concurrent.ExecutionException:
com.google.api.gax.rpc.ResourceExhaustedException: io.grpc.StatusRuntimeException: RESOURCE_EXHAUSTED: Quota exceeded for quota metric 'Administrative requests' and limit 'Administrative requests per minute' of service 'spanner.googleapis.com' for consumer 'project_number:779844219229'.
links {
description: "Google developer console API key"
url: "https://console.developers.google.com/project/779844219229/apiui/credential"
}
at com.example.spanner.AsyncExamplesIT.dropTestDatabase(AsyncExamplesIT.java:111)
Caused by: com.google.api.gax.rpc.ResourceExhaustedException:
io.grpc.StatusRuntimeException: RESOURCE_EXHAUSTED: Quota exceeded for quota metric 'Administrative requests' and limit 'Administrative requests per minute' of service 'spanner.googleapis.com' for consumer 'project_number:779844219229'.
links {
description: "Google developer console API key"
url: "https://console.developers.google.com/project/779844219229/apiui/credential"
}
Caused by: io.grpc.StatusRuntimeException:
RESOURCE_EXHAUSTED: Quota exceeded for quota metric 'Administrative requests' and limit 'Administrative requests per minute' of service 'spanner.googleapis.com' for consumer 'project_number:779844219229'.
links {
description: "Google developer console API key"
url: "https://console.developers.google.com/project/779844219229/apiui/credential"
}
</pre></details> | non_main | the build failed this test failed to configure my behavior see if i m commenting on this issue too often add the buildcop quiet label and i will stop commenting commit buildurl status failed test output com google cloud spanner spannerexception resource exhausted com google api gax rpc resourceexhaustedexception io grpc statusruntimeexception resource exhausted quota exceeded for quota metric administrative requests and limit administrative requests per minute of service spanner googleapis com for consumer project number links description google developer console api key url at com example spanner asyncexamplesit droptestdatabase asyncexamplesit java caused by java util concurrent executionexception com google api gax rpc resourceexhaustedexception io grpc statusruntimeexception resource exhausted quota exceeded for quota metric administrative requests and limit administrative requests per minute of service spanner googleapis com for consumer project number links description google developer console api key url at com example spanner asyncexamplesit droptestdatabase asyncexamplesit java caused by com google api gax rpc resourceexhaustedexception io grpc statusruntimeexception resource exhausted quota exceeded for quota metric administrative requests and limit administrative requests per minute of service spanner googleapis com for consumer project number links description google developer console api key url caused by io grpc statusruntimeexception resource exhausted quota exceeded for quota metric administrative requests and limit administrative requests per minute of service spanner googleapis com for consumer project number links description google developer console api key url | 0 |
2,883 | 10,319,584,302 | IssuesEvent | 2019-08-30 17:56:32 | backdrop-ops/contrib | https://api.github.com/repos/backdrop-ops/contrib | closed | Requesting permission to contribute! | Maintainer application | I am writing a new book for Apress on Backdrop and have ported the Honeypot module and will be porting other D7 modules to support my clients' move to Backdrop.
| True | Requesting permission to contribute! - I am writing a new book for Apress on Backdrop and have ported the Honeypot module and will be porting other D7 modules to support my clients' move to Backdrop.
| main | requesting permission to contribute i am writing a new book for apress on backdrop and have ported the honeypot module and will be porting other modules to support my clients move to backdrop | 1 |
203,761 | 7,077,950,277 | IssuesEvent | 2018-01-10 00:38:40 | google/google-api-java-client | https://api.github.com/repos/google/google-api-java-client | closed | Request Youtube API v3 for android sample | 2–5 stars Type-Sample imported priority: p2 | _From [heoun...@gmail.com](https://code.google.com/u/104507084071449316345/) on December 12, 2012 03:11:44_
Which Google API and version (e.g. Google Calendar API version 3)? Google Youtube API android version 3 Java environment (e.g. Java 6, Android 2.3, App Engine)? Any Android External references, such as API reference guide? API reference guide Please provide any additional information below. I want to implement Youtube v3 in my android application
_Original issue: http://code.google.com/p/google-api-java-client/issues/detail?id=682_
| 1.0 | Request Youtube API v3 for android sample - _From [heoun...@gmail.com](https://code.google.com/u/104507084071449316345/) on December 12, 2012 03:11:44_
Which Google API and version (e.g. Google Calendar API version 3)? Google Youtube API android version 3 Java environment (e.g. Java 6, Android 2.3, App Engine)? Any Android External references, such as API reference guide? API reference guide Please provide any additional information below. I want to implement Youtube v3 in my android application
_Original issue: http://code.google.com/p/google-api-java-client/issues/detail?id=682_
| non_main | request youtube api for android sample from on december which google api and version e g google calendar api version google youtube api android version java environment e g java android app engine any android external references such as api reference guide api reference guide please provide any additional information below i want to implement youtube in my android application original issue | 0 |
40,605 | 8,815,733,383 | IssuesEvent | 2018-12-29 22:40:25 | dwevins/character-tracker | https://api.github.com/repos/dwevins/character-tracker | opened | Class "field-group--alt" is unnecessary | code improvement | "field-group--alt" only ends up targeting fields directly, and modifies no styles for the field group. It should be removed from the markup and stylesheets, which will also enable the field width modifiers to work in any field group. | 1.0 | Class "field-group--alt" is unnecessary - "field-group--alt" only ends up targeting fields directly, and modifies no styles for the field group. It should be removed from the markup and stylesheets, which will also enable the field width modifiers to work in any field group. | non_main | class field group alt is unnecessary field group alt only ends up targeting fields directly and modifies no styles for the field group it should be removed from the markup and stylesheets which will also enable the field width modifiers to work in any field group | 0 |
5,705 | 30,082,976,025 | IssuesEvent | 2023-06-29 06:10:11 | toolbx-images/images | https://api.github.com/repos/toolbx-images/images | opened | Add distribution: Debian 12 (bookworm) | new-image-request maintainers-wanted | ### Distribution name and versions requested
Debian 12 (Bookworm)
### Where are the official container images from the distribution published?
https://hub.docker.com/_/debian
### Will you be interested in maintaining this image?
Yes | True | Add distribution: Debian 12 (bookworm) - ### Distribution name and versions requested
Debian 12 (Bookworm)
### Where are the official container images from the distribution published?
https://hub.docker.com/_/debian
### Will you be interested in maintaining this image?
Yes | main | add distribution debian bookworm distribution name and versions requested debian bookworm where are the official container images from the distribution published will you be interested in maintaining this image yes | 1 |
11,922 | 7,740,174,967 | IssuesEvent | 2018-05-28 19:58:25 | agda/agda | https://api.github.com/repos/agda/agda | closed | Possible performance problem | meta performance | Running the attached file with the definition
```
∨-cong {x₁} {x₂} {y₁} {y₂} (mk x₁→x₂ , mk x₂→x₁) (mk y₁→y₂ , mk y₂→y₁) = mk lemma₁ , mk lemma₂ where
lemma₁ : x₁ ∨ y₁ → x₂ ∨ y₂
lemma₁ (∨-inl x₁) = ∨-inl (x₁→x₂ x₁)
lemma₁ (∨-inr y₁) = ∨-inr (y₁→y₂ y₁)
lemma₂ : x₂ ∨ y₂ → x₁ ∨ y₁
lemma₂ (∨-inl x₂) = ∨-inl (x₂→x₁ x₂)
lemma₂ (∨-inr y₂) = ∨-inr (y₂→y₁ y₂)
```
results in
Total 34,211ms
Miscellaneous 15ms
Parsing.Operators 156ms
Deserialization 0ms
Scoping 46ms
Scoping.InverseScopeLookup 18,657ms
Typing 3,088ms
Typing.OccursCheck 468ms
Typing.CheckLHS 5,662ms
Typing.CheckLHS.UnifyIndices 436ms
Termination 0ms
Termination.RecCheck 265ms
Positivity 1,326ms
Injectivity 0ms
ProjectionLikeness 31ms
Coverage 327ms
Highlighting 15ms
Serialization 1,528ms
Serialization.Sort 15ms
Serialization.BinaryEncode 187ms
Serialization.Compress 0ms
DeadCode 1,981ms
Changing the definition to
```
∨-cong {x₁} {x₂} {y₁} {y₂} (x₁⇒x₂ , x₂⇒x₁) (y₁⇒y₂ , y₂⇒y₁) = mk lemma₁ , mk lemma₂ where
lemma₁ : x₁ ∨ y₁ → x₂ ∨ y₂
lemma₁ (∨-inl x₁) = ∨-inl (⇒-fst x₁⇒x₂ x₁)
lemma₁ (∨-inr y₁) = ∨-inr (⇒-fst y₁⇒y₂ y₁)
lemma₂ : x₂ ∨ y₂ → x₁ ∨ y₁
lemma₂ (∨-inl x₂) = ∨-inl (⇒-fst x₂⇒x₁ x₂)
lemma₂ (∨-inr y₂) = ∨-inr (⇒-fst y₂⇒y₁ y₂)
```
changes the profile to
Total 149,963ms
Miscellaneous 15ms
Parsing.Operators 140ms
Deserialization 0ms
Scoping 62ms
Scoping.InverseScopeLookup 18,922ms
Typing 22,058ms
Typing.OccursCheck 17,799ms
Typing.CheckLHS 30,778ms
Typing.CheckLHS.UnifyIndices 2,386ms
Termination 0ms
Termination.RecCheck 4,071ms
Positivity 19,172ms
Injectivity 0ms
ProjectionLikeness 31ms
Coverage 2,215ms
Highlighting 15ms
Serialization 3,010ms
Serialization.Sort 15ms
Serialization.BinaryEncode 218ms
Serialization.Compress 0ms
DeadCode 29,047ms
When using
```
∨-cong {x₁} {x₂} {y₁} {y₂} x₁≈x₂ y₁≈y₂ = mk lemma₁ , mk lemma₂ where
lemma₁ : x₁ ∨ y₁ → x₂ ∨ y₂
lemma₁ (∨-inl x₁) = ∨-inl (⇒-fst (⇔-fst x₁≈x₂) x₁)
lemma₁ (∨-inr y₁) = ∨-inr (⇒-fst (⇔-fst y₁≈y₂) y₁)
lemma₂ : x₂ ∨ y₂ → x₁ ∨ y₁
lemma₂ (∨-inl x₂) = ∨-inl (⇒-fst (⇔-snd x₁≈x₂) x₂)
lemma₂ (∨-inr y₂) = ∨-inr (⇒-fst (⇔-snd y₁≈y₂) y₂)
```
instead the profile changes to
Total 1,143,471ms
Miscellaneous 0ms
Parsing.Operators 140ms
Deserialization 0ms
Scoping 78ms
Scoping.InverseScopeLookup 18,907ms
Typing 261,395ms
Typing.OccursCheck 7,534ms
Typing.CheckLHS 313,421ms
Typing.CheckLHS.UnifyIndices 31ms
Termination 0ms
Termination.RecCheck 66,175ms
Positivity 212,551ms
Injectivity 15ms
ProjectionLikeness 0ms
Coverage 15ms
Highlighting 0ms
Serialization 1,482ms
Serialization.Sort 124ms
Serialization.BinaryEncode 124ms
Serialization.Compress 0ms
DeadCode 261,473ms
Checked with todays master. Is this expected behaviour?
[heyting.agda.txt](https://github.com/agda/agda/files/175645/heyting.agda.txt)
| True | Possible performance problem - Running the attached file with the definition
```
∨-cong {x₁} {x₂} {y₁} {y₂} (mk x₁→x₂ , mk x₂→x₁) (mk y₁→y₂ , mk y₂→y₁) = mk lemma₁ , mk lemma₂ where
lemma₁ : x₁ ∨ y₁ → x₂ ∨ y₂
lemma₁ (∨-inl x₁) = ∨-inl (x₁→x₂ x₁)
lemma₁ (∨-inr y₁) = ∨-inr (y₁→y₂ y₁)
lemma₂ : x₂ ∨ y₂ → x₁ ∨ y₁
lemma₂ (∨-inl x₂) = ∨-inl (x₂→x₁ x₂)
lemma₂ (∨-inr y₂) = ∨-inr (y₂→y₁ y₂)
```
results in
Total 34,211ms
Miscellaneous 15ms
Parsing.Operators 156ms
Deserialization 0ms
Scoping 46ms
Scoping.InverseScopeLookup 18,657ms
Typing 3,088ms
Typing.OccursCheck 468ms
Typing.CheckLHS 5,662ms
Typing.CheckLHS.UnifyIndices 436ms
Termination 0ms
Termination.RecCheck 265ms
Positivity 1,326ms
Injectivity 0ms
ProjectionLikeness 31ms
Coverage 327ms
Highlighting 15ms
Serialization 1,528ms
Serialization.Sort 15ms
Serialization.BinaryEncode 187ms
Serialization.Compress 0ms
DeadCode 1,981ms
Changing the definition to
```
∨-cong {x₁} {x₂} {y₁} {y₂} (x₁⇒x₂ , x₂⇒x₁) (y₁⇒y₂ , y₂⇒y₁) = mk lemma₁ , mk lemma₂ where
lemma₁ : x₁ ∨ y₁ → x₂ ∨ y₂
lemma₁ (∨-inl x₁) = ∨-inl (⇒-fst x₁⇒x₂ x₁)
lemma₁ (∨-inr y₁) = ∨-inr (⇒-fst y₁⇒y₂ y₁)
lemma₂ : x₂ ∨ y₂ → x₁ ∨ y₁
lemma₂ (∨-inl x₂) = ∨-inl (⇒-fst x₂⇒x₁ x₂)
lemma₂ (∨-inr y₂) = ∨-inr (⇒-fst y₂⇒y₁ y₂)
```
changes the profile to
Total 149,963ms
Miscellaneous 15ms
Parsing.Operators 140ms
Deserialization 0ms
Scoping 62ms
Scoping.InverseScopeLookup 18,922ms
Typing 22,058ms
Typing.OccursCheck 17,799ms
Typing.CheckLHS 30,778ms
Typing.CheckLHS.UnifyIndices 2,386ms
Termination 0ms
Termination.RecCheck 4,071ms
Positivity 19,172ms
Injectivity 0ms
ProjectionLikeness 31ms
Coverage 2,215ms
Highlighting 15ms
Serialization 3,010ms
Serialization.Sort 15ms
Serialization.BinaryEncode 218ms
Serialization.Compress 0ms
DeadCode 29,047ms
When using
```
∨-cong {x₁} {x₂} {y₁} {y₂} x₁≈x₂ y₁≈y₂ = mk lemma₁ , mk lemma₂ where
lemma₁ : x₁ ∨ y₁ → x₂ ∨ y₂
lemma₁ (∨-inl x₁) = ∨-inl (⇒-fst (⇔-fst x₁≈x₂) x₁)
lemma₁ (∨-inr y₁) = ∨-inr (⇒-fst (⇔-fst y₁≈y₂) y₁)
lemma₂ : x₂ ∨ y₂ → x₁ ∨ y₁
lemma₂ (∨-inl x₂) = ∨-inl (⇒-fst (⇔-snd x₁≈x₂) x₂)
lemma₂ (∨-inr y₂) = ∨-inr (⇒-fst (⇔-snd y₁≈y₂) y₂)
```
instead the profile changes to
Total 1,143,471ms
Miscellaneous 0ms
Parsing.Operators 140ms
Deserialization 0ms
Scoping 78ms
Scoping.InverseScopeLookup 18,907ms
Typing 261,395ms
Typing.OccursCheck 7,534ms
Typing.CheckLHS 313,421ms
Typing.CheckLHS.UnifyIndices 31ms
Termination 0ms
Termination.RecCheck 66,175ms
Positivity 212,551ms
Injectivity 15ms
ProjectionLikeness 0ms
Coverage 15ms
Highlighting 0ms
Serialization 1,482ms
Serialization.Sort 124ms
Serialization.BinaryEncode 124ms
Serialization.Compress 0ms
DeadCode 261,473ms
Checked with todays master. Is this expected behaviour?
[heyting.agda.txt](https://github.com/agda/agda/files/175645/heyting.agda.txt)
| non_main | possible performance problem running the attached file with the definition ∨ cong x₁ x₂ y₁ y₂ mk x₁→x₂ mk x₂→x₁ mk y₁→y₂ mk y₂→y₁ mk lemma₁ mk lemma₂ where lemma₁ x₁ ∨ y₁ → x₂ ∨ y₂ lemma₁ ∨ inl x₁ ∨ inl x₁→x₂ x₁ lemma₁ ∨ inr y₁ ∨ inr y₁→y₂ y₁ lemma₂ x₂ ∨ y₂ → x₁ ∨ y₁ lemma₂ ∨ inl x₂ ∨ inl x₂→x₁ x₂ lemma₂ ∨ inr y₂ ∨ inr y₂→y₁ y₂ results in total miscellaneous parsing operators deserialization scoping scoping inversescopelookup typing typing occurscheck typing checklhs typing checklhs unifyindices termination termination reccheck positivity injectivity projectionlikeness coverage highlighting serialization serialization sort serialization binaryencode serialization compress deadcode changing the definition to ∨ cong x₁ x₂ y₁ y₂ x₁⇒x₂ x₂⇒x₁ y₁⇒y₂ y₂⇒y₁ mk lemma₁ mk lemma₂ where lemma₁ x₁ ∨ y₁ → x₂ ∨ y₂ lemma₁ ∨ inl x₁ ∨ inl ⇒ fst x₁⇒x₂ x₁ lemma₁ ∨ inr y₁ ∨ inr ⇒ fst y₁⇒y₂ y₁ lemma₂ x₂ ∨ y₂ → x₁ ∨ y₁ lemma₂ ∨ inl x₂ ∨ inl ⇒ fst x₂⇒x₁ x₂ lemma₂ ∨ inr y₂ ∨ inr ⇒ fst y₂⇒y₁ y₂ changes the profile to total miscellaneous parsing operators deserialization scoping scoping inversescopelookup typing typing occurscheck typing checklhs typing checklhs unifyindices termination termination reccheck positivity injectivity projectionlikeness coverage highlighting serialization serialization sort serialization binaryencode serialization compress deadcode when using ∨ cong x₁ x₂ y₁ y₂ x₁≈x₂ y₁≈y₂ mk lemma₁ mk lemma₂ where lemma₁ x₁ ∨ y₁ → x₂ ∨ y₂ lemma₁ ∨ inl x₁ ∨ inl ⇒ fst ⇔ fst x₁≈x₂ x₁ lemma₁ ∨ inr y₁ ∨ inr ⇒ fst ⇔ fst y₁≈y₂ y₁ lemma₂ x₂ ∨ y₂ → x₁ ∨ y₁ lemma₂ ∨ inl x₂ ∨ inl ⇒ fst ⇔ snd x₁≈x₂ x₂ lemma₂ ∨ inr y₂ ∨ inr ⇒ fst ⇔ snd y₁≈y₂ y₂ instead the profile changes to total miscellaneous parsing operators deserialization scoping scoping inversescopelookup typing typing occurscheck typing checklhs typing checklhs unifyindices termination termination reccheck positivity injectivity projectionlikeness coverage highlighting serialization serialization sort serialization binaryencode serialization compress deadcode checked with todays master is this expected behaviour | 0 |
2,061 | 6,980,064,968 | IssuesEvent | 2017-12-12 23:35:46 | tgstation/tgstation-server | https://api.github.com/repos/tgstation/tgstation-server | closed | ControlPanel panels should each override TabPage instead of being a massive partial class | Backlog Maintainability Issue | God I hate myself when I first write things | True | ControlPanel panels should each override TabPage instead of being a massive partial class - God I hate myself when I first write things | main | controlpanel panels should each override tabpage instead of being a massive partial class god i hate myself when i first write things | 1 |
54,872 | 11,339,592,820 | IssuesEvent | 2020-01-23 02:39:01 | flutter/flutter | https://api.github.com/repos/flutter/flutter | closed | Flutter run hangs on "Running XCode build" | severe: crash t: xcode tool ⌺ platform-ios | I am unable to build for iOS.
`flutter run --debug` always hangs on "Running XCode build".
This is the last output when I ran `flutter --verbose run --debug`
```bash
[ +6 ms] Running Xcode build...
[ ] executing: [/Users/aryan/Projects/open_con/ios/] /usr/bin/env xcrun xcodebuild
-configuration Debug VERBOSE_SCRIPT_LOGGING=YES -workspace Runner.xcworkspace -scheme
Runner BUILD_DIR=/Users/aryan/Projects/open_con/build/ios -sdk iphoneos
ONLY_ACTIVE_ARCH=YES ARCHS=arm64
SCRIPT_OUTPUT_STREAM_FILE=/var/folders/vj/3bhykqjd2dz6cg8yqxr5y3rh0000gp/T/flutter_build_l
og_pipe.K5fv11/pipe_to_stdout FLUTTER_SUPPRESS_ANALYTICS=true
COMPILER_INDEX_STORE_ENABLE=NO
```
It seems to get stuck at this step.
| 1.0 | Flutter run hangs on "Running XCode build" - I am unable to build for iOS.
`flutter run --debug` always hangs on "Running XCode build".
This is the last output when I ran `flutter --verbose run --debug`
```bash
[ +6 ms] Running Xcode build...
[ ] executing: [/Users/aryan/Projects/open_con/ios/] /usr/bin/env xcrun xcodebuild
-configuration Debug VERBOSE_SCRIPT_LOGGING=YES -workspace Runner.xcworkspace -scheme
Runner BUILD_DIR=/Users/aryan/Projects/open_con/build/ios -sdk iphoneos
ONLY_ACTIVE_ARCH=YES ARCHS=arm64
SCRIPT_OUTPUT_STREAM_FILE=/var/folders/vj/3bhykqjd2dz6cg8yqxr5y3rh0000gp/T/flutter_build_l
og_pipe.K5fv11/pipe_to_stdout FLUTTER_SUPPRESS_ANALYTICS=true
COMPILER_INDEX_STORE_ENABLE=NO
```
It seems to get stuck at this step.
| non_main | flutter run hangs on running xcode build i am unable to build for ios flutter run debug always hangs on running xcode build this is the last output when i ran flutter verbose run debug bash running xcode build executing usr bin env xcrun xcodebuild configuration debug verbose script logging yes workspace runner xcworkspace scheme runner build dir users aryan projects open con build ios sdk iphoneos only active arch yes archs script output stream file var folders vj t flutter build l og pipe pipe to stdout flutter suppress analytics true compiler index store enable no it seems to get stuck at this step | 0 |
98,727 | 30,087,303,749 | IssuesEvent | 2023-06-29 09:34:55 | ToolJet/ToolJet | https://api.github.com/repos/ToolJet/ToolJet | closed | [link component] `switch page` event redirects to dashboard | bug roadmap appbuilder | ### What is the current behaviour?
When the **link target** is blank and an event handler is added to `switch page` then on clicking it should switch to another page but it is redirecting to tooljet dashboard.
### How to reproduce the issue?
1. Add a link component
2. Leave the `link target` field empty
3. Add a new page on the app
4. Add an event handler on the link component to switch the page on on click
5. Now, when the link component is clicked the app is redirected to the tooljet dashboard
### Screenshots or Screencast


| 1.0 | [link component] `switch page` event redirects to dashboard - ### What is the current behaviour?
When the **link target** is blank and an event handler is added to `switch page` then on clicking it should switch to another page but it is redirecting to tooljet dashboard.
### How to reproduce the issue?
1. Add a link component
2. Leave the `link target` field empty
3. Add a new page on the app
4. Add an event handler on the link component to switch the page on on click
5. Now, when the link component is clicked the app is redirected to the tooljet dashboard
### Screenshots or Screencast


| non_main | switch page event redirects to dashboard what is the current behaviour when the link target is blank and an event handler is added to switch page then on clicking it should switch to another page but it is redirecting to tooljet dashboard how to reproduce the issue add a link component leave the link target field empty add a new page on the app add an event handler on the link component to switch the page on on click now when the link component is clicked the app is redirected to the tooljet dashboard screenshots or screencast | 0 |
3,494 | 13,634,897,025 | IssuesEvent | 2020-09-25 01:15:08 | amyjko/faculty | https://api.github.com/repos/amyjko/faculty | closed | Symbolic publication sources | maintainability | This will:
* Shorten the data file
* Ensure consistent spelling
* Use them for filters
* Link to information about them | True | Symbolic publication sources - This will:
* Shorten the data file
* Ensure consistent spelling
* Use them for filters
* Link to information about them | main | symbolic publication sources this will shorten the data file ensure consistent spelling use them for filters link to information about them | 1 |
469,677 | 13,523,103,224 | IssuesEvent | 2020-09-15 09:28:55 | Domiii/dbux | https://api.github.com/repos/Domiii/dbux | opened | Missing callId on CallExpression that is also an argument | bug instrumentation priority | Again... sigh...
`a.b.f(1)` is missing a `callId` in `a.i(a.g().h(2), a.b.f(1));` in `calls1.js`. | 1.0 | Missing callId on CallExpression that is also an argument - Again... sigh...
`a.b.f(1)` is missing a `callId` in `a.i(a.g().h(2), a.b.f(1));` in `calls1.js`. | non_main | missing callid on callexpression that is also an argument again sigh a b f is missing a callid in a i a g h a b f in js | 0 |
2,973 | 10,699,046,506 | IssuesEvent | 2019-10-23 20:03:56 | 18F/cg-product | https://api.github.com/repos/18F/cg-product | closed | Pipelines need refactoring to use `in_parallel` instead of `aggregate` | contractor-3-maintainability operations | ## Requirements
In order to continue having our pipelines be current and supported, the `aggregate` keyword must be replaced with the `in_parallel` keyword.
## Acceptance Criteria
*GIVEN* an operator runs `fly get-pipeline -p <pipeline>`
*WHEN* they search for the `aggregate` key
*THEN* no `aggregate` YAML keys are found. | True | Pipelines need refactoring to use `in_parallel` instead of `aggregate` - ## Requirements
In order to continue having our pipelines be current and supported, the `aggregate` keyword must be replaced with the `in_parallel` keyword.
## Acceptance Criteria
*GIVEN* an operator runs `fly get-pipeline -p <pipeline>`
*WHEN* they search for the `aggregate` key
*THEN* no `aggregate` YAML keys are found. | main | pipelines need refactoring to use in parallel instead of aggregate requirements in order to continue having our pipelines be current and supported the aggregate keyword must be replaced with the in parallel keyword acceptance criteria given an operator runs fly get pipeline p when they search for the aggregate key then no results are yaml keys | 1 |
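The `aggregate`-to-`in_parallel` migration this record asks for is a mechanical key rename in the pipeline YAML. A minimal sketch of such a migration, using plain Python dicts to stand in for a parsed pipeline (a real script would load and dump YAML, e.g. with PyYAML; note also that Concourse's `in_parallel` additionally accepts an extended `steps:`/`limit:` form not shown here):

```python
def rename_key(node, old="aggregate", new="in_parallel"):
    """Recursively rename `old` keys to `new` in a parsed pipeline tree."""
    if isinstance(node, dict):
        return {(new if k == old else k): rename_key(v, old, new)
                for k, v in node.items()}
    if isinstance(node, list):
        return [rename_key(item, old, new) for item in node]
    return node  # scalars pass through unchanged

# A toy pipeline fragment standing in for yaml.safe_load(...) output.
plan = {"jobs": [{"name": "deploy",
                  "plan": [{"aggregate": [{"get": "repo"},
                                          {"get": "image"}]}]}]}

migrated = rename_key(plan)
assert "in_parallel" in migrated["jobs"][0]["plan"][0]
assert "aggregate" not in str(migrated)
```

After migrating, the acceptance criterion above can be checked by grepping the output of `fly get-pipeline` for the old key.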
4,358 | 22,041,313,401 | IssuesEvent | 2022-05-29 11:58:06 | BioArchLinux/Packages | https://api.github.com/repos/BioArchLinux/Packages | closed | [MAINTAIN] r-interp: compile error | maintain | <!--
Please report the error of one package in one issue! Use multi issues to report multi bugs.
Thanks!
-->
**Log of the bug**
<details>
```
/usr/lib/R/library/RcppEigen/include/Eigen/src/Core/MatrixBase.h:48:34: required from ‘class Eigen::MatrixBase<Eigen::Matrix<double, -1, 1> >’
/usr/lib/R/library/RcppEigen/include/Eigen/src/Core/PlainObjectBase.h:98:7: required from ‘class Eigen::PlainObjectBase<Eigen::Matrix<double, -1, 1> >’
/usr/lib/R/library/RcppEigen/include/Eigen/src/Core/Matrix.h:178:7: required from ‘class Eigen::Matrix<double, -1, 1>’
/usr/lib/R/library/RcppEigen/include/Eigen/src/QR/ColPivHouseholderQR.h:68:124: required from ‘class Eigen::ColPivHouseholderQR<Eigen::Matrix<double, -1, -1> >’
interp.h:25:15: required from here
/usr/lib/R/library/RcppEigen/include/Eigen/src/Core/DenseCoeffsBase.h:55:30: warning: ignoring attributes on template argument ‘Eigen::internal::packet_traits<double>::type’ {aka ‘__m128d’} [-Wignored-attributes]
/usr/lib/R/library/RcppEigen/include/Eigen/src/Core/DenseCoeffsBase.h: In instantiation of ‘class Eigen::DenseCoeffsBase<Eigen::Matrix<double, 1, -1>, 0>’:
/usr/lib/R/library/RcppEigen/include/Eigen/src/Core/DenseCoeffsBase.h:300:7: required from ‘class Eigen::DenseCoeffsBase<Eigen::Matrix<double, 1, -1>, 1>’
/usr/lib/R/library/RcppEigen/include/Eigen/src/Core/DenseCoeffsBase.h:551:7: required from ‘class Eigen::DenseCoeffsBase<Eigen::Matrix<double, 1, -1>, 3>’
/usr/lib/R/library/RcppEigen/include/Eigen/src/Core/DenseBase.h:41:34: required from ‘class Eigen::DenseBase<Eigen::Matrix<double, 1, -1> >’
/usr/lib/R/library/RcppEigen/include/Eigen/src/Core/MatrixBase.h:48:34: required from ‘class Eigen::MatrixBase<Eigen::Matrix<double, 1, -1> >’
/usr/lib/R/library/RcppEigen/include/Eigen/src/Core/PlainObjectBase.h:98:7: required from ‘class Eigen::PlainObjectBase<Eigen::Matrix<double, 1, -1> >’
/usr/lib/R/library/RcppEigen/include/Eigen/src/Core/Matrix.h:178:7: required from ‘class Eigen::Matrix<double, 1, -1>’
/usr/lib/R/library/RcppEigen/include/Eigen/src/QR/ColPivHouseholderQR.h:438:19: required from ‘class Eigen::ColPivHouseholderQR<Eigen::Matrix<double, -1, -1> >’
interp.h:25:15: required from here
/usr/lib/R/library/RcppEigen/include/Eigen/src/Core/DenseCoeffsBase.h:55:30: warning: ignoring attributes on template argument ‘Eigen::internal::packet_traits<double>::type’ {aka ‘__m128d’} [-Wignored-attributes]
ERROR: compilation failed for package ‘interp’
* removing ‘/build/r-interp/src/interp’
* restoring previous ‘/build/r-interp/src/interp’
```
</details>
**Packages (please complete the following information):**
- Package Name: [e.g. iqtree]
**Description**
Add any other context about the problem here.
| True | [MAINTAIN] r-interp: compile error - <!--
Please report the error of one package in one issue! Use multi issues to report multi bugs.
Thanks!
-->
**Log of the bug**
<details>
```
/usr/lib/R/library/RcppEigen/include/Eigen/src/Core/MatrixBase.h:48:34: required from ‘class Eigen::MatrixBase<Eigen::Matrix<double, -1, 1> >’
/usr/lib/R/library/RcppEigen/include/Eigen/src/Core/PlainObjectBase.h:98:7: required from ‘class Eigen::PlainObjectBase<Eigen::Matrix<double, -1, 1> >’
/usr/lib/R/library/RcppEigen/include/Eigen/src/Core/Matrix.h:178:7: required from ‘class Eigen::Matrix<double, -1, 1>’
/usr/lib/R/library/RcppEigen/include/Eigen/src/QR/ColPivHouseholderQR.h:68:124: required from ‘class Eigen::ColPivHouseholderQR<Eigen::Matrix<double, -1, -1> >’
interp.h:25:15: required from here
/usr/lib/R/library/RcppEigen/include/Eigen/src/Core/DenseCoeffsBase.h:55:30: warning: ignoring attributes on template argument ‘Eigen::internal::packet_traits<double>::type’ {aka ‘__m128d’} [-Wignored-attributes]
/usr/lib/R/library/RcppEigen/include/Eigen/src/Core/DenseCoeffsBase.h: In instantiation of ‘class Eigen::DenseCoeffsBase<Eigen::Matrix<double, 1, -1>, 0>’:
/usr/lib/R/library/RcppEigen/include/Eigen/src/Core/DenseCoeffsBase.h:300:7: required from ‘class Eigen::DenseCoeffsBase<Eigen::Matrix<double, 1, -1>, 1>’
/usr/lib/R/library/RcppEigen/include/Eigen/src/Core/DenseCoeffsBase.h:551:7: required from ‘class Eigen::DenseCoeffsBase<Eigen::Matrix<double, 1, -1>, 3>’
/usr/lib/R/library/RcppEigen/include/Eigen/src/Core/DenseBase.h:41:34: required from ‘class Eigen::DenseBase<Eigen::Matrix<double, 1, -1> >’
/usr/lib/R/library/RcppEigen/include/Eigen/src/Core/MatrixBase.h:48:34: required from ‘class Eigen::MatrixBase<Eigen::Matrix<double, 1, -1> >’
/usr/lib/R/library/RcppEigen/include/Eigen/src/Core/PlainObjectBase.h:98:7: required from ‘class Eigen::PlainObjectBase<Eigen::Matrix<double, 1, -1> >’
/usr/lib/R/library/RcppEigen/include/Eigen/src/Core/Matrix.h:178:7: required from ‘class Eigen::Matrix<double, 1, -1>’
/usr/lib/R/library/RcppEigen/include/Eigen/src/QR/ColPivHouseholderQR.h:438:19: required from ‘class Eigen::ColPivHouseholderQR<Eigen::Matrix<double, -1, -1> >’
interp.h:25:15: required from here
/usr/lib/R/library/RcppEigen/include/Eigen/src/Core/DenseCoeffsBase.h:55:30: warning: ignoring attributes on template argument ‘Eigen::internal::packet_traits<double>::type’ {aka ‘__m128d’} [-Wignored-attributes]
ERROR: compilation failed for package ‘interp’
* removing ‘/build/r-interp/src/interp’
* restoring previous ‘/build/r-interp/src/interp’
```
</details>
**Packages (please complete the following information):**
- Package Name: [e.g. iqtree]
**Description**
Add any other context about the problem here.
| main | r interp compile error please report the error of one package in one issue use multi issues to report multi bugs thanks log of the bug usr lib r library rcppeigen include eigen src core matrixbase h required from ‘class eigen matrixbase ’ usr lib r library rcppeigen include eigen src core plainobjectbase h required from ‘class eigen plainobjectbase ’ usr lib r library rcppeigen include eigen src core matrix h required from ‘class eigen matrix ’ usr lib r library rcppeigen include eigen src qr colpivhouseholderqr h required from ‘class eigen colpivhouseholderqr ’ interp h required from here usr lib r library rcppeigen include eigen src core densecoeffsbase h warning ignoring attributes on template argument ‘eigen internal packet traits type’ aka ‘ ’ usr lib r library rcppeigen include eigen src core densecoeffsbase h in instantiation of ‘class eigen densecoeffsbase ’ usr lib r library rcppeigen include eigen src core densecoeffsbase h required from ‘class eigen densecoeffsbase ’ usr lib r library rcppeigen include eigen src core densecoeffsbase h required from ‘class eigen densecoeffsbase ’ usr lib r library rcppeigen include eigen src core densebase h required from ‘class eigen densebase ’ usr lib r library rcppeigen include eigen src core matrixbase h required from ‘class eigen matrixbase ’ usr lib r library rcppeigen include eigen src core plainobjectbase h required from ‘class eigen plainobjectbase ’ usr lib r library rcppeigen include eigen src core matrix h required from ‘class eigen matrix ’ usr lib r library rcppeigen include eigen src qr colpivhouseholderqr h required from ‘class eigen colpivhouseholderqr ’ interp h required from here usr lib r library rcppeigen include eigen src core densecoeffsbase h warning ignoring attributes on template argument ‘eigen internal packet traits type’ aka ‘ ’ error compilation failed for package ‘interp’ removing ‘ build r interp src interp’ restoring previous ‘ build r interp src interp’ packages please complete 
the following information package name description add any other context about the problem here | 1 |
241,065 | 7,808,644,060 | IssuesEvent | 2018-06-11 20:52:16 | otrv4/libotr-ng | https://api.github.com/repos/otrv4/libotr-ng | closed | Check the usage of booleans | C-syntax discuss medium-priority | ## Why
A library should be sufficiently consistent in the ways it is written.
## Reference
21st Century C, Chapter 8, "True and False", page 147.
## Discussion
When should we use bool.h? Are errors behaving the same as booleans?
## Tasks
- [x] Discuss when should we use bool.h
- [x] Discuss how errors are being used and if they should have a similar "boolean" behavior
- [x] Check that if statements are checking by `if (x)` and not `if(x == true)`.
## Open questions
- The issue itself. | 1.0 | Check the usage of booleans - ## Why
A library should be sufficiently consistent in the ways it is written.
## Reference
21st Century C, Chapter 8, "True and False", page 147.
## Discussion
When should we use bool.h? Are errors behaving the same as booleans?
## Tasks
- [x] Discuss when should we use bool.h
- [x] Discuss how errors are being used and if they should have a similar "boolean" behavior
- [x] Check that if statements are checking by `if (x)` and not `if(x == true)`.
## Open questions
- The issue itself. | non_main | check the usage of booleans why a library should be sufficiently consistent in the ways it is written reference century c chapter true and false page discussion when should be use bool h are errors behaving the same as booleans tasks discuss when should we use bool h discuss how errors are been used and if they should have a similar boolean behavior check that if statements are checking by if x and not if x true open questions the issue itself | 0 |
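The third task in this record (writing `if (x)` rather than `if (x == true)`) lends itself to an automated check. A minimal, hypothetical sketch of such a linter pass over C source text; the regex is illustrative only and will miss or over-match edge cases (strings, comments, macros):

```python
import re

# Flags boolean-literal comparisons such as `if (x == true)`,
# which the checklist above says should be written as `if (x)`.
PATTERN = re.compile(r"[=!]=\s*(?:true|false|TRUE|FALSE)\b")

def boolean_literal_comparisons(source: str):
    """Return 1-based line numbers containing a `== true`-style comparison."""
    return [lineno for lineno, line in enumerate(source.splitlines(), 1)
            if PATTERN.search(line)]

c_src = """\
if (ok == true) { run(); }
if (ok) { run(); }
"""
assert boolean_literal_comparisons(c_src) == [1]
```

Such a check could run in CI so the convention stays enforced after the initial cleanup.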
1,110 | 4,981,808,646 | IssuesEvent | 2016-12-07 09:19:48 | tgstation/tgstation | https://api.github.com/repos/tgstation/tgstation | closed | No Atom/Destroy() | Maintainability - Hinders improvements - Not a bug | Simply put, there's no Atom/Destroy(), when there really should be.
For example, any atom can have reagents in it, but yet qdel()ing of reagents is handled on the atom/movable level. While I know of no turf that has reagents in it, it's still best to handle getting rid of something on the level it's defined at....likewise, it means you could push invisibility = 101 up another level so it applied to more than just atom/movable.
| True | No Atom/Destroy() - Simply put, there's no Atom/Destroy(), when there really should be.
For example, any atom can have reagents in it, but yet qdel()ing of reagents is handled on the atom/movable level. While I know of no turf that has reagents in it, it's still best to handle getting rid of something on the level it's defined at....likewise, it means you could push invisibility = 101 up another level so it applied to more than just atom/movable.
| main | no atom destroy simply put there s no atom destroy when there really should be for example any atom can have reagents in it but yet qdel ing of reagents is handled on the atom movable level while i know of no turf that has reagents in it it s still best to handle getting rid of something on the level it s defined at likewise it means you could push invisibility up another level so it applied to more than just atom movable | 1 |
1,698 | 6,574,375,979 | IssuesEvent | 2017-09-11 12:39:37 | ansible/ansible-modules-core | https://api.github.com/repos/ansible/ansible-modules-core | closed | docker_container succeeds with empty result | affects_2.2 bug_report cloud docker P2 waiting_on_maintainer | <!--- Verify first that your issue/request is not already reported in GitHub -->
##### ISSUE TYPE
- Bug Report
##### COMPONENT NAME
docker_container
##### ANSIBLE VERSION
2.2.0, 2.1.3
##### SUMMARY
The single `ansible_docker_container` key is being scrubbed from the ansible_facts result value, due to the fix for [CVE-2016-3096](http://www.cvedetails.com/cve/CVE-2016-3096/) from issue https://github.com/ansible/ansible/pull/15925. This is due to the return value's fact key having a prefix used for docker connection vars (ansible_docker_).
We probably don't want to add exceptions to the CVE fix, so it seems that the only reasonable fix is a breaking change to docker_container's return value (renaming the fact to `docker_container` or something else, or not returning it as a fact).
##### STEPS TO REPRODUCE
Use the docker_container module in any way on affected versions and look at the output- it returns an empty ansible_facts dict instead of the actual module results.
##### EXPECTED RESULTS
Module return value as documented (including the ansible_docker_container subdict and data).
##### ACTUAL RESULTS
Empty ansible_facts dictionary. | True | docker_container succeeds with empty result - <!--- Verify first that your issue/request is not already reported in GitHub -->
##### ISSUE TYPE
- Bug Report
##### COMPONENT NAME
docker_container
##### ANSIBLE VERSION
2.2.0, 2.1.3
##### SUMMARY
The single `ansible_docker_container` key is being scrubbed from the ansible_facts result value, due to the fix for [CVE-2016-3096](http://www.cvedetails.com/cve/CVE-2016-3096/) from issue https://github.com/ansible/ansible/pull/15925. This is due to the return value's fact key having a prefix used for docker connection vars (ansible_docker_).
We probably don't want to add exceptions to the CVE fix, so it seems that the only reasonable fix is a breaking change to docker_container's return value (renaming the fact to `docker_container` or something else, or not returning it as a fact).
##### STEPS TO REPRODUCE
Use the docker_container module in any way on affected versions and look at the output- it returns an empty ansible_facts dict instead of the actual module results.
##### EXPECTED RESULTS
Module return value as documented (including the ansible_docker_container subdict and data).
##### ACTUAL RESULTS
Empty ansible_facts dictionary. | main | docker container succeeds with empty result issue type bug report component name docker container ansible version summary the single ansible docker container key is being scrubbed from the ansible facts result value due to the fix for from issue this is due to the return value s fact key having a prefix used for docker connection vars ansible docker we probably don t want to add exceptions to the cve fix so it seems that the only reasonable fix is a breaking change to docker container s return value renaming the fact to docker container or something else or not returning as a fact steps to reproduce use the docker container module in any way on affected versions and look at the output it returns an empty ansible facts dict instead of the actual module results expected results module return value as documented including the ansible docker container subdict and data actual results empty ansible facts dictionary | 1 |
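The scrubbing behaviour this record describes can be shown with a small sketch. This is not Ansible's actual implementation; it only illustrates the shape of the CVE-2016-3096 fix, where any returned fact whose name carries a connection-variable prefix such as `ansible_docker_` is dropped, which is why the module's single fact key vanished:

```python
def scrub_connection_facts(facts: dict, prefixes=("ansible_docker_",)) -> dict:
    """Drop any fact whose name starts with a connection-var prefix."""
    return {k: v for k, v in facts.items()
            if not k.startswith(tuple(prefixes))}

# The module's documented return value collides with the scrubbed prefix.
returned = {"ansible_docker_container": {"Id": "abc123"}, "changed": True}
scrubbed = scrub_connection_facts(returned)

assert "ansible_docker_container" not in scrubbed
assert scrubbed == {"changed": True}
```

Renaming the fact (e.g. to `docker_container`) moves it out of the scrubbed namespace, which is the breaking change the record proposes.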
301,530 | 22,761,571,243 | IssuesEvent | 2022-07-07 21:49:28 | opdev/opcap | https://api.github.com/repos/opdev/opcap | closed | Add instructions on pre-requisites | documentation good first issue | In order to help users get up and running add detailed information on how to authenticate on Red Hat's registry (or provide an alternative way like using custom catalog/local bundle) and how to configure their kubeconfig secret if needed. | 1.0 | Add instructions on pre-requisites - In order to help users get up and running add detailed information on how to authenticate on Red Hat's registry (or provide an alternative way like using custom catalog/local bundle) and how to configure their kubeconfig secret if needed. | non_main | add instructions on pre requisites in order to help users get up and running add detailed information on how to authenticate on red hat s registry or provide an alternative way like using custom catalog local bundle and how to configure their kubeconfig secret if needed | 0 |
83,728 | 15,715,390,789 | IssuesEvent | 2021-03-28 01:01:08 | kadirselcuk/netdata | https://api.github.com/repos/kadirselcuk/netdata | opened | CVE-2021-27292 (High) detected in ua-parser-js-0.7.21.tgz | security vulnerability | ## CVE-2021-27292 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>ua-parser-js-0.7.21.tgz</b></p></summary>
<p>Lightweight JavaScript-based user-agent string parser</p>
<p>Library home page: <a href="https://registry.npmjs.org/ua-parser-js/-/ua-parser-js-0.7.21.tgz">https://registry.npmjs.org/ua-parser-js/-/ua-parser-js-0.7.21.tgz</a></p>
<p>Path to dependency file: netdata/package.json</p>
<p>Path to vulnerable library: netdata/node_modules/ua-parser-js/package.json</p>
<p>
Dependency Hierarchy:
- karma-5.0.8.tgz (Root Library)
- :x: **ua-parser-js-0.7.21.tgz** (Vulnerable Library)
<p>Found in base branch: <b>master</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
ua-parser-js >= 0.7.14, fixed in 0.7.24, uses a regular expression which is vulnerable to denial of service. If an attacker sends a malicious User-Agent header, ua-parser-js will get stuck processing it for an extended period of time.
<p>Publish Date: 2021-03-17
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2021-27292>CVE-2021-27292</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>7.5</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: None
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://github.com/faisalman/ua-parser-js/releases/tag/0.7.24">https://github.com/faisalman/ua-parser-js/releases/tag/0.7.24</a></p>
<p>Release Date: 2021-03-17</p>
<p>Fix Resolution: ua-parser-js - 0.7.24</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github) | True | CVE-2021-27292 (High) detected in ua-parser-js-0.7.21.tgz - ## CVE-2021-27292 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>ua-parser-js-0.7.21.tgz</b></p></summary>
<p>Lightweight JavaScript-based user-agent string parser</p>
<p>Library home page: <a href="https://registry.npmjs.org/ua-parser-js/-/ua-parser-js-0.7.21.tgz">https://registry.npmjs.org/ua-parser-js/-/ua-parser-js-0.7.21.tgz</a></p>
<p>Path to dependency file: netdata/package.json</p>
<p>Path to vulnerable library: netdata/node_modules/ua-parser-js/package.json</p>
<p>
Dependency Hierarchy:
- karma-5.0.8.tgz (Root Library)
- :x: **ua-parser-js-0.7.21.tgz** (Vulnerable Library)
<p>Found in base branch: <b>master</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
ua-parser-js >= 0.7.14, fixed in 0.7.24, uses a regular expression which is vulnerable to denial of service. If an attacker sends a malicious User-Agent header, ua-parser-js will get stuck processing it for an extended period of time.
<p>Publish Date: 2021-03-17
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2021-27292>CVE-2021-27292</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>7.5</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: None
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://github.com/faisalman/ua-parser-js/releases/tag/0.7.24">https://github.com/faisalman/ua-parser-js/releases/tag/0.7.24</a></p>
<p>Release Date: 2021-03-17</p>
<p>Fix Resolution: ua-parser-js - 0.7.24</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github) | non_main | cve high detected in ua parser js tgz cve high severity vulnerability vulnerable library ua parser js tgz lightweight javascript based user agent string parser library home page a href path to dependency file netdata package json path to vulnerable library netdata node modules ua parser js package json dependency hierarchy karma tgz root library x ua parser js tgz vulnerable library found in base branch master vulnerability details ua parser js fixed in uses a regular expression which is vulnerable to denial of service if an attacker sends a malicious user agent header ua parser js will get stuck processing it for an extended period of time publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction none scope unchanged impact metrics confidentiality impact none integrity impact none availability impact high for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution ua parser js step up your open source security game with whitesource | 0 |
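The denial of service in this CVE record comes from catastrophic regex backtracking. The pattern below is a classic textbook example of that failure mode, not the actual ua-parser-js expression:

```python
import re

# Classic catastrophic-backtracking shape: nested quantifiers followed by a
# required character the input may lack. On a non-matching input of n "a"s
# the engine tries on the order of 2**n ways to split the run, so matching
# time explodes with input length -- the same failure mode the CVE
# describes for a vulnerable user-agent regex.
EVIL = re.compile(r"^(a+)+b$")

assert EVIL.match("aaaab")           # matching input succeeds quickly
assert EVIL.match("aaaac") is None   # non-matching input fails after backtracking
```

Inputs are kept tiny here on purpose; a long `"a" * n + "c"` string sent as a User-Agent header is exactly how an attacker pins the parser, and the fixed release reworks the expressions to avoid the nested quantifiers.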
5,567 | 27,844,550,358 | IssuesEvent | 2023-03-20 14:45:07 | conbench/conbench | https://api.github.com/repos/conbench/conbench | closed | UI: migrate all the way from Bootstrap 3 to Bootstrap 5. And to DataTables 1.13 | UI/UX maintainability | Picking this idea up from https://github.com/conbench/conbench/issues/834: migrate all the way from Bootstrap 3 to Bootstrap 5.
https://github.com/twbs/release
Bootstrap 3 was released August 19, 2013. The 3.4.1 release was on Feb 13, 2019. Bootstrap 3 was EOLd on 2019-07-24.
DataTables 1.10.0 was released on 1st May, 2014. DataTables 1.10.23 (what we use) was released on 18th December, 2020. There is 1.10.25, released on 4th June, 2021, but 1.10.x seems to be EOLd.
From https://blog.getbootstrap.com/2019/07/24/lts-plan/
> Starting today, Bootstrap 3 will move to end of life, and will no longer receive critical security updates.
This is somewhat a necessity to keep things maintainable and changeable.
DataTables is related to Bootstrap, too. It seems to integrate nicely with Bootstrap 5.
From https://datatables.net/forums/discussion/comment/214642/#Comment_214642 (`Release notes for 1.13.0`):
- `Fix: Bootstrap 5 integration - processing indicator wasn't being correctly positioned`
- `Fix: Bootstrap 5 - keep ul.pagination attributes added by events`
- `Fix: Improvements to accessability for Bootstrap 5 paging control`
The DataTables 1.12.0 release said `Bootstrap v5.1.2 and above compatibility`.
Also generally looking forward to the idea of being able to properly maintain JavaScript (ideally TypeScript) in our code base.
Bootstrap 5 says:
> The biggest change to our JavaScript has been the removal of jQuery, but we’ve also made a number of enhancements beyond that as well.
DataTables 1.11 (2021) was all about:
> The headline feature in this release is that it is now possible to initialise and use DataTables without writing an jQuery code. We still use jQuery as a utility library, so it is still a dependency, but if you prefer to write non-jQuery code, you can now do so with DataTables as well.
(https://cdn.datatables.net/1.11.0/)
Overall, I do think that DataTables 1.13.x and Bootstrap 5 will make a good team.
I also noticed that for modern DataTables there is a plugin that might come in handy for our 'uber search bar': https://github.com/conbench/conbench/issues/844
| True | UI: migrate all the way from Bootstrap 3 to Bootstrap 5. And to DataTables 1.13 - Picking this idea up from https://github.com/conbench/conbench/issues/834: migrate all the way from Bootstrap 3 to Bootstrap 5.
https://github.com/twbs/release
Bootstrap 3 was released August 19, 2013. The 3.4.1 release was on Feb 13, 2019. Bootstrap 3 was EOLd on 2019-07-24.
DataTables 1.10.0 was released on 1st May, 2014. DataTables 1.10.23 (what we use) was released on 18th December, 2020. There is 1.10.25, released on 4th June, 2021, but 1.10.x seems to be EOLd.
From https://blog.getbootstrap.com/2019/07/24/lts-plan/
> Starting today, Bootstrap 3 will move to end of life, and will no longer receive critical security updates.
This is somewhat a necessity to keep things maintainable and changeable.
DataTables is related to Bootstrap, too. It seems to integrate nicely with Bootstrap 5.
From https://datatables.net/forums/discussion/comment/214642/#Comment_214642 (`Release notes for 1.13.0`):
- `Fix: Bootstrap 5 integration - processing indicator wasn't being correctly positioned`
- `Fix: Bootstrap 5 - keep ul.pagination attributes added by events`
- `Fix: Improvements to accessability for Bootstrap 5 paging control`
The DataTables 1.12.0 release said `Bootstrap v5.1.2 and above compatibility`.
Also generally looking forward to the idea of being able to properly maintain JavaScript (ideally TypeScript) in our code base.
Bootstrap 5 says:
> The biggest change to our JavaScript has been the removal of jQuery, but we’ve also made a number of enhancements beyond that as well.
DataTables 1.11 (2021) was all about:
> The headline feature in this release is that it is now possible to initialise and use DataTables without writing an jQuery code. We still use jQuery as a utility library, so it is still a dependency, but if you prefer to write non-jQuery code, you can now do so with DataTables as well.
(https://cdn.datatables.net/1.11.0/)
Overall, I do think that DataTables 1.13.x and Bootstrap 5 will make a good team.
I also noticed that for modern DataTables there is a plugin that might come in handy for our 'uber search bar': https://github.com/conbench/conbench/issues/844
| main | ui migrate all the way from bootstrap to bootstrap and to datatables picking this idea up from migrate all the way from bootstrap to bootstrap bootstrap was released august the release was on feb bootstrap was eold on datatables was released on may datatables what we use was released on december datatables there is released june but x seems to be eold from starting today bootstrap will move to end of life and will no longer receive critical security updates this is somewhat a necessity to keep things maintainable changable datatables related to bootstrap too it seems to integrate nicely with bootstrap from release notes for fix bootstrap integration processing indicator wasn t being correctly positioned fix bootstrap keep ul pagination attributes added by events fix improvements to accessability for bootstrap paging control the datatables release said bootstrap and above compatibility also generally looking forward to the idea of being able to properly maintain javascript ideally typescript in our code base bootstrap says the biggest change to our javascript has been the removal of jquery but we’ve also made a number of enhancements beyond that as well datatables was all about the headline feature in this release is that it is now possible to initialise and use datatables without writing an jquery code we still use jquery as a utility library so it is still a dependency but if you prefer to write non jquery code you can now do so with datatables as well overall i do think that datatables x and bootstrap will make a good team i also noticed that for modern datatables there is a plugin that might come in handy for our uber search bar | 1 |
1,725 | 6,574,506,483 | IssuesEvent | 2017-09-11 13:08:43 | ansible/ansible-modules-core | https://api.github.com/repos/ansible/ansible-modules-core | closed | ec2_ami state:absent wait:yes should wait for AMI to be removed | affects_2.1 aws cloud feature_idea waiting_on_maintainer | ##### ISSUE TYPE
- Feature Idea
##### COMPONENT NAME
ec2_ami
##### ANSIBLE VERSION
```
ansible 2.1.2.0
```
##### OS / ENVIRONMENT
N/A
##### SUMMARY
`ec2_ami` module with args:
1. `state: absent`
2. `wait: yes`
should wait for AMI to be fully deregistered before continuing.
589,257 | 17,693,310,885 | IssuesEvent | 2021-08-24 12:46:28 | o3de/o3de | https://api.github.com/repos/o3de/o3de | opened | [MacOS] Cmake fails to build on a machine with MacOS Catalina 10.15.5 and Xcode 11.2 | kind/bug needs-sig needs-triage priority/minor | **Describe the bug**
It has been observed that cmake will fail to build when the _cmake -E time cmake --build . --target ALL_BUILD --config profile -j 2_
command is used after a successful configure on a MacOS machine with macOS Catalina version 10.15.5 and Xcode version 11.2 (11B52).
If these software specs are below the o3de MacOS requirements, the corresponding information should be added to the o3de documentation.
Please note that o3de was successfully building on an identical MacBook Pro (15-inch, 2017) machine with MacOS Big Sur 11.3.1 and Xcode 12.5.
For further information please refer to the attached build logs.
[Mac building fail on 10.15.5 log.txt](https://github.com/o3de/o3de/files/7039123/Mac.building.fail.on.10.15.5.log.txt)
[CMakeOutput.log](https://github.com/o3de/o3de/files/7039124/CMakeOutput.log)
**To Reproduce**
Steps to reproduce the behavior:
1. Use Github to obtain the latest O3DE build on your Mac.
2. Create a new 3rdParty folder inside the o3de folder, or move your preexisting one there.
3. Obtain Cmake.app and move it to the created 3rdParty/CMake directory (create the CMake folder if necessary).
4. In the o3de directory, right-click on the python folder and select New Terminal at Folder.
5. Run the _export PATH=$PATH:/Users/<home folder name>/o3de/3rdParty/CMake/CMake.app/Contents/bin_ command.
6. Run the _get_python.sh_ command.
7. Create osx_build in the o3de folder.
8. Open a new Terminal at the created osx_build folder and set the following temporary environment variables:
_export PATH=$PATH:/Users/<home folder name>/o3de/3rdParty/CMake/CMake.app/Contents/bin_
_export LY_3RDPARTY_PATH=/Users/<home folder name>/o3de/3rdParty_
_export LY_PACKAGE_SERVER_URLS=https://d2c171ws20a1rv.cloudfront.net_
9. Configure and generate Xcode solution:
_cmake .. -G "Xcode" -DLY_3RDPARTY_PATH=$LY_3RDPARTY_PATH -DLY_PROJECTS=AutomatedTesting -DLY_UNITY_BUILD=1_
10. Build with Cmake by using:
_cmake -E time cmake --build . --target ALL_BUILD --config profile -j 2_ command.
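For reference, steps 5 and 8 can be collected into a single environment fragment (using $HOME in place of /Users/<home folder name>; the cmake invocations from steps 9 and 10 are shown as comments, since they assume an o3de checkout):

```shell
# Environment from steps 5 and 8 of the reproduction above
export PATH=$PATH:$HOME/o3de/3rdParty/CMake/CMake.app/Contents/bin
export LY_3RDPARTY_PATH=$HOME/o3de/3rdParty
export LY_PACKAGE_SERVER_URLS=https://d2c171ws20a1rv.cloudfront.net

# With these set, steps 9-10 are run from the osx_build folder:
#   cmake .. -G "Xcode" -DLY_3RDPARTY_PATH=$LY_3RDPARTY_PATH -DLY_PROJECTS=AutomatedTesting -DLY_UNITY_BUILD=1
#   cmake -E time cmake --build . --target ALL_BUILD --config profile -j 2
echo "LY_3RDPARTY_PATH=$LY_3RDPARTY_PATH"
```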
**Expected behavior**
The MacOS machine finishes the building process successfully.
**Desktop/Device (please complete the following information):**
* Device: Mac
* OS: MacOS
* Version: Catalina 10.15.5
* CPU: Intel Core i7 2,8 GHz
* GPU: Radeon Pro 555
* Memory: 16GB
3,063 | 11,464,380,410 | IssuesEvent | 2020-02-07 17:56:40 | DynamoRIO/dynamorio | https://api.github.com/repos/DynamoRIO/dynamorio | closed | Adopt a consistent class member variable naming style | Maintainability | We hadn't codified a style for C++ class member variable naming, and as a result we now have mixed styles: some have underscores and some do not. I arbitrarily decided to go with underscores. I updated the style conventions in https://github.com/DynamoRIO/dynamorio/wiki/Code-Style and will update the code.
| True | Adopt a consistent class member variable naming style - We hadn't codified a style for C++ class member variable naming, and as a result we now have mixed styles: some have underscores and some do not. I arbitrarily decided to go with underscores. I updated the style conventions in https://github.com/DynamoRIO/dynamorio/wiki/Code-Style and will update the code.
| main | adopt a consistent class member variable naming style we hadn t codified a style for c class member variable naming and as a result we now have mixed styles some have underscores and some do not i arbitrarily decided to go with underscores i updated the style conventions in and will update the code | 1 |
28,175 | 4,367,042,567 | IssuesEvent | 2016-08-03 15:58:47 | ngageoint/hootenanny | https://api.github.com/repos/ngageoint/hootenanny | closed | Update Review Bookmarks Cucumber Test | Category: Test | Based on https://github.com/ngageoint/hootenanny-ui/issues/484, the changes to the Review Bookmarks will require changes to review bookmarks cucumber test. | 1.0 | Update Review Bookmarks Cucumber Test - Based on https://github.com/ngageoint/hootenanny-ui/issues/484, the changes to the Review Bookmarks will require changes to review bookmarks cucumber test. | non_main | update review bookmarks cucumber test based on the changes to the review bookmarks will require changes to review bookmarks cucumber test | 0 |
2,611 | 8,853,380,366 | IssuesEvent | 2019-01-08 21:11:21 | dgets/lasttime | https://api.github.com/repos/dgets/lasttime | closed | Move as much javascript to the external file as possible | maintainability | The _data_summary.html_ template is getting a little nasty; relocate any named function definitions to the external JS file that's loaded. | True | Move as much javascript to the external file as possible - The _data_summary.html_ template is getting a little nasty; relocate any named function definitions to the external JS file that's loaded. | main | move as much javascript to the external file as possible the data summary html template is getting a little nasty relocate any named function definitions to the external js file that s loaded | 1 |
404,573 | 11,859,454,652 | IssuesEvent | 2020-03-25 13:24:27 | FAIRsharing/fairsharing.github.io | https://api.github.com/repos/FAIRsharing/fairsharing.github.io | opened | Add order to records.vue | Normal priority enhancement | Allow the records search to order by using graphql query getRecords.json | 1.0 | Add order to records.vue - Allow the records search to order by using graphql query getRecords.json | non_main | add order to records vue allow the records search to order by using graphql query getrecords json | 0 |
1,441 | 6,262,198,536 | IssuesEvent | 2017-07-15 08:01:36 | ocaml/opam-repository | https://api.github.com/repos/ocaml/opam-repository | closed | opam fails to install ocaml-top | bug needs maintainer action | Mac version 10.12.5
Hello, I fail to install:
zarith
ocaml-top
----------------------------------------------------------------------------
I was told at https://github.com/ocaml/opam/issues/2992:
...Regarding the ocaml-top error please follow the advice given in the error message:
=-=- conf-gtksourceview.2 troubleshooting -=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=
=> This package requires gtksourceview 2.0 development packages installed on your system
=> On OSX, install the `gtksourceview' and `libxml2' packages and define PKG_CONFIG_PATH:
export PKG_CONFIG_PATH=/usr/X11/lib/pkgconfig/:/usr/local/lib/pkgconfig:/usr/local/opt/libxml2/lib/pkgconfig/
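As a sketch, that advice amounts to the following shell fragment (the Homebrew formula names in the comment are assumptions; the hint only names the packages):

```shell
# Install the system dependencies first, e.g. via Homebrew (formula names assumed):
#   brew install gtksourceview libxml2
# Then expose their .pc files exactly as the hint says:
export PKG_CONFIG_PATH=/usr/X11/lib/pkgconfig/:/usr/local/lib/pkgconfig:/usr/local/opt/libxml2/lib/pkgconfig/
# Re-check with the same probe opam runs:
#   pkg-config --short-errors --print-errors gtksourceview-2.0
echo "$PKG_CONFIG_PATH"
```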
-------------------
OK, I have tried:
```
Opam install gtksourceview
[ERROR] No package named gtksourceview found.
opam list -a | grep -i "GtkSourceView"
conf-gtksourceview -- Virtual package relying on a G
******************$ sudo Opam install conf-gtksourceview
Password:
[WARNING] Running as root is not recommended
The following actions will be performed:
∗ install conf-gtksourceview 2
=-=- Gathering sources =-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=
=-=- Processing actions -=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=
[ERROR] The compilation of conf-gtksourceview failed at "pkg-config --short-errors --print-errors gtksourceview-2.0".
#=== ERROR while installing conf-gtksourceview.2 ==============================#
# opam-version 1.2.2
# os darwin
# command pkg-config --short-errors --print-errors gtksourceview-2.0
# path /Users/****/.opam/system/build/conf-gtksourceview.2
# compiler system (4.04.2)
# exit-code 127
# env-file /Users/****/.opam/system/build/conf-gtksourceview.2/conf-gtksourceview-800-7afd23.env
# stdout-file /Users/****/.opam/system/build/conf-gtksourceview.2/conf-gtksourceview-800-7afd23.out
# stderr-file /Users/****/.opam/system/build/conf-gtksourceview.2/conf-gtksourceview-800-7afd23.err
=-=- Error report -=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=
The following actions failed
∗ install conf-gtksourceview 2
No changes have been performed
=-=- conf-gtksourceview.2 troubleshooting -=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=
=> This package requires gtksourceview 2.0 development packages installed on your system
=> On OSX, install the `gtksourceview' and `libxml2' packages and define PKG_CONFIG_PATH:
export PKG_CONFIG_PATH=/usr/X11/lib/pkgconfig/:/usr/local/lib/pkgconfig:/usr/local/opt/libxml2/lib/pkgconfig/
----------------------------------------------------------------------------
Before, I had done:
brew install opam
opam init
opam install zarith
opam install ocaml-top
Here is the report.
Does somebody know the issue?
Thanks
*******:ocaml ****$ opam init
Checking for available remotes: rsync and local, git.
you won't be able to use mercurial repositories unless you install the hg command on your system.
you won't be able to use darcs repositories unless you install the darcs command on your system.
=-=- Fetching repository information =-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=
[default] synchronized from https://opam.ocaml.org
=-=- Gathering sources =-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=
=-=- Processing actions -=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=
∗ installed base-bigarray.base
∗ installed base-threads.base
∗ installed base-unix.base
Done.
In normal operation, OPAM only alters files within ~/.opam.
During this initialisation, you can allow OPAM to add information to two
other files for best results. You can also make these additions manually
if you wish.
If you agree, OPAM will modify:
~/.bash_profile (or a file you specify) to set the right environment
variables and to load the auto-completion scripts for your shell (bash)
on startup. Specifically, it checks for and appends the following line:
. /Users/****/.opam/opam-init/init.sh > /dev/null 2> /dev/null || true
~/.ocamlinit to ensure that non-system installations of ocamlfind
(i.e. those installed by OPAM) will work correctly when running the
OCaml toplevel. It does this by adding $OCAML_TOPLEVEL_PATH to the list
of include directories.
If you choose to not configure your system now, you can either configure
OPAM manually (instructions will be displayed) or launch the automatic setup
later by running:
opam config setup -a
Do you want OPAM to modify ~/.bash_profile and ~/.ocamlinit?
(default is 'no', use 'f' to name a file other than ~/.bash_profile)
[N/y/f]
Global configuration:
Updating ~/.opam/opam-init/init.sh
Updating ~/.opam/opam-init/init.zsh
Updating ~/.opam/opam-init/init.csh
Updating ~/.opam/opam-init/init.fish
=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=
To configure OPAM in the current shell session, you need to run:
eval `opam config env`
To correctly configure OPAM for subsequent use, add the following
line to your profile file (for instance ~/.bash_profile):
. /Users/****/.opam/opam-init/init.sh > /dev/null 2> /dev/null || true
To avoid issues related to non-system installations of ocamlfind
add the following lines to ~/.ocamlinit (create it if necessary):
let () =
try Topdirs.dir_directory (Sys.getenv "OCAML_TOPLEVEL_PATH")
with Not_found -> ()
;;
=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=
*******:ocaml ****$ opam --version
1.2.2
*******:ocaml ****$ opam install zarith
The following actions will be performed:
∗ install conf-perl 1 [required by zarith]
∗ install conf-m4 1 [required by ocamlfind]
∗ install conf-gmp 1 [required by zarith]
∗ install ocamlfind 1.7.3 [required by zarith]
∗ install zarith 1.5
===== ∗ 5 =====
Do you want to continue ? [Y/n] y
=-=- Gathering sources =-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=
[default] https://opam.ocaml.org/archives/zarith.1.5+opam.tar.gz downloaded
[default] https://opam.ocaml.org/archives/ocamlfind.1.7.3+opam.tar.gz downloaded
=-=- Processing actions -=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=
∗ installed conf-perl.1
[ERROR] The compilation of conf-gmp failed at "sh -exc cc -c $CFLAGS -I/usr/local/include test.c".
∗ installed conf-m4.1
∗ installed ocamlfind.1.7.3
#=== ERROR while installing conf-gmp.1 ========================================#
opam-version 1.2.2
os darwin
command sh -exc cc -c $CFLAGS -I/usr/local/include test.c
path /Users/****/.opam/system/build/conf-gmp.1
compiler system (4.04.2)
exit-code 1
env-file /Users/****/.opam/system/build/conf-gmp.1/conf-gmp-3884-d2d111.env
stdout-file /Users/****/.opam/system/build/conf-gmp.1/conf-gmp-3884-d2d111.out
stderr-file /Users/****/.opam/system/build/conf-gmp.1/conf-gmp-3884-d2d111.err
stderr
+ cc -c -I/usr/local/include test.c
test.c:1:10: fatal error: 'gmp.h' file not found
#include <gmp.h>
^
1 error generated.
=-=- Error report -=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=
The following actions were aborted
∗ install zarith 1.5
The following actions failed
∗ install conf-gmp 1
The following changes have been performed
∗ install conf-m4 1
∗ install conf-perl 1
∗ install ocamlfind 1.7.3
=-=- conf-gmp.1 troubleshooting -=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=
=> This package relies on external (system) dependencies that may be missing. `opam depext conf-gmp.1' may help you find the correct installation for your system.
The former state can be restored with:
opam switch import "~/.opam/system/backup/state-20170604184735.export"
Last login: Wed Jul 5 11:29:19 on ttys000
*******:~ ****$ opam install ocaml-top
The following actions will be performed:
∗ install conf-gtksourceview 2 [required by ocaml-top]
∗ install ocp-build 1.99.19-beta [required by ocaml-top]
∗ install result 1.2 [required by cmdliner]
∗ install base-bytes base [required by re]
∗ install ocamlbuild 0.11.0 [required by cmdliner, re]
∗ install lablgtk 2.18.5 [required by ocaml-top]
∗ install topkg 0.9.0 [required by cmdliner]
∗ install re 1.7.1 [required by ocp-index]
∗ install cmdliner 1.0.0 [required by ocp-indent, ocp-index]
∗ install ocp-indent 1.6.0 [required by ocaml-top]
∗ install ocp-index 1.1.5 [required by ocaml-top]
For ocp-browser, please also install package lambda-term
∗ install ocaml-top 1.1.3
===== ∗ 12 =====
Do you want to continue ? [Y/n] y
=-=- Gathering sources =-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=
[default] https://opam.ocaml.org/archives/cmdliner.1.0.0+opam.tar.gz downloaded
[default] https://opam.ocaml.org/archives/ocaml-top.1.1.3+opam.tar.gz downloaded
[default] https://opam.ocaml.org/archives/ocamlbuild.0.11.0+opam.tar.gz downloaded
[default] https://opam.ocaml.org/archives/ocp-build.1.99.19-beta+opam.tar.gz downloaded
Processing 9/12: [lablgtk: from default] [ocp-indent: from default] [ocp-index: from default[default] https://opam.ocaml.org/archives/ocp-indent.1.6.0+opam.tar.gz downloaded
[default] https://opam.ocaml.org/archives/lablgtk.2.18.5+opam.tar.gz downloaded
[default] https://opam.ocaml.org/archives/result.1.2+opam.tar.gz downloaded
[default] https://opam.ocaml.org/archives/re.1.7.1+opam.tar.gz downloaded
[default] https://opam.ocaml.org/archives/topkg.0.9.0+opam.tar.gz downloaded
[default] https://opam.ocaml.org/archives/ocp-index.1.1.5+opam.tar.gz downloaded
=-=- Processing actions -=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=
∗ installed base-bytes.base
[ERROR] The compilation of conf-gtksourceview failed at "pkg-config --short-errors --print-errors gtksourceview-2.0".
∗ installed result.1.2
∗ installed ocamlbuild.0.11.0
∗ installed topkg.0.9.0
∗ installed cmdliner.1.0.0
∗ installed re.1.7.1
∗ installed ocp-build.1.99.19-beta
∗ installed ocp-indent.1.6.0
∗ installed ocp-index.1.1.5
#=== ERROR while installing conf-gtksourceview.2 ==============================#
opam-version 1.2.2
os darwin
command pkg-config --short-errors --print-errors gtksourceview-2.0
path /Users/****/.opam/system/build/conf-gtksourceview.2
compiler system (4.04.2)
exit-code 127
env-file /Users/****/.opam/system/build/conf-gtksourceview.2/conf-gtksourceview-615-48c407.env
stdout-file /Users/****/.opam/system/build/conf-gtksourceview.2/conf-gtksourceview-615-48c407.out
stderr-file /Users/****/.opam/system/build/conf-gtksourceview.2/conf-gtksourceview-615-48c407.err
=-=- Error report -=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=
The following actions were aborted
∗ install lablgtk 2.18.5
∗ install ocaml-top 1.1.3
The following actions failed
∗ install conf-gtksourceview 2
The following changes have been performed
∗ install base-bytes base
∗ install cmdliner 1.0.0
∗ install ocamlbuild 0.11.0
∗ install ocp-build 1.99.19-beta
∗ install ocp-indent 1.6.0
∗ install ocp-index 1.1.5
∗ install re 1.7.1
∗ install result 1.2
∗ install topkg 0.9.0
=-=- ocp-indent.1.6.0 installed successfully =-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=
=> This package requires additional configuration for use in editors. Install package 'user-setup', or manually:
for Emacs, add these lines to ~/.emacs:
(add-to-list 'load-path "/Users/****/.opam/system/share/emacs/site-lisp")
(require 'ocp-indent)
for Vim, add this line to ~/.vimrc:
set rtp^="/Users/****/.opam/system/share/ocp-indent/vim"
=-=- ocp-index.1.1.5 installed successfully -=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=
=> This package requires additional configuration for use in editors. Either install package 'user-setup', or manually:
for Emacs, add these lines to ~/.emacs:
(add-to-list 'load-path "/Users/****/.opam/system/share/emacs/site-lisp")
(require 'ocp-index)
for Vim, add the following line to ~/.vimrc:
set rtp+=/Users/****/.opam/system/share/ocp-index/vim
=-=- conf-gtksourceview.2 troubleshooting -=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=
=> This package requires gtksourceview 2.0 development packages installed on your system
=> On OSX, install the `gtksourceview' and `libxml2' packages and define PKG_CONFIG_PATH:
export PKG_CONFIG_PATH=/usr/X11/lib/pkgconfig/:/usr/local/lib/pkgconfig:/usr/local/opt/libxml2/lib/pkgconfig/
The former state can be restored with:
opam switch import "~/.opam/system/backup/state-20170605100428.export"
```
=> On OSX, install the 'gtksourceview' and 'libxml2' packages and define PKG_CONFIG_PATH:
export PKG_CONFIG_PATH=/usr/X11/lib/pkgconfig/:/usr/local/lib/pkgconfig:/usr/local/opt/libxml2/lib/pkgconfig/
The former state can be restored with:
opam switch import "~/.opam/system/backup/state-20170605100428.export"
``` | main | opam fails to install ocaml top mac version hello fail to install zarith ocaml top i was said at regarding the ocaml top error please follow the advice given in the error message conf gtksourceview troubleshooting this package requires gtksourceview development packages installed on your system on osx install the gtksourceview packages and define pkg config path export pkg config path usr lib pkgconfig usr local lib pkgconfig usr local opt lib pkgconfig ok i have tried opam install gtksourceview no package named gtksourceview found opam list a grep i gtksourceview conf gtksourceview virtual package relying on a g sudo opam install conf gtksourceview password running as root is not recommended the following actions will be performed ∗ install conf gtksourceview gathering sources processing actions the compilation of conf gtksourceview failed at pkg config short errors print errors gtksourceview error while installing conf gtksourceview opam version os darwin command pkg config short errors print errors gtksourceview path users opam system build conf gtksourceview compiler system exit code env file users opam system build conf gtksourceview conf gtksourceview env stdout file users opam system build conf gtksourceview conf gtksourceview out stderr file users opam system build conf gtksourceview conf gtksourceview err error report the following actions failed ∗ install conf gtksourceview no changes have been performed conf gtksourceview troubleshooting this package requires gtksourceview development packages installed on your system on osx install the gtksourceview and packages and define pkg config path export pkg config path usr lib pkgconfig usr local lib pkgconfig usr local opt lib pkgconfig before i ihave done brew install opam opam init opam install zarith opam install ocaml top here the report somebody knows the issue thanks ocaml opam init checking for available remotes rsync and local git you won t be able to use mercurial repositories unless you 
install the hg command on your system you won t be able to use darcs repositories unless you install the darcs command on your system fetching repository information synchronized from gathering sources processing actions ∗ installed base bigarray base ∗ installed base threads base ∗ installed base unix base done in normal operation opam only alters files within opam during this initialisation you can allow opam to add information to two other files for best results you can also make these additions manually if you wish if you agree opam will modify bash profile or a file you specify to set the right environment variables and to load the auto completion scripts for your shell bash on startup specifically it checks for and appends the following line users opam opam init init sh dev null dev null true ocamlinit to ensure that non system installations of ocamlfind i e those installed by opam will work correctly when running the ocaml toplevel it does this by adding ocaml toplevel path to the list of include directories if you choose to not configure your system now you can either configure opam manually instructions will be displayed or launch the automatic setup later by running opam config setup a do you want opam to modify bash profile and ocamlinit default is no use f to name a file other than bash profile global configuration updating opam opam init init sh updating opam opam init init zsh updating opam opam init init csh updating opam opam init init fish to configure opam in the current shell session you need to run eval opam config env to correctly configure opam for subsequent use add the following line to your profile file for instance bash profile users opam opam init init sh dev null dev null true to avoid issues related to non system installations of ocamlfind add the following lines to ocamlinit create it if necessary let try topdirs dir directory sys getenv ocaml toplevel path with not found ocaml opam version ocaml opam install zarith the following 
actions will be performed ∗ install conf perl ∗ install conf ∗ install conf gmp ∗ install ocamlfind ∗ install zarith ∗ do you want to continue y gathering sources downloaded downloaded processing actions ∗ installed conf perl the compilation of conf gmp failed at sh exc cc c cflags i usr local include test c ∗ installed conf ∗ installed ocamlfind error while installing conf gmp opam version os darwin command sh exc cc c cflags i usr local include test c path users opam system build conf gmp compiler system exit code env file users opam system build conf gmp conf gmp env stdout file users opam system build conf gmp conf gmp out stderr file users opam system build conf gmp conf gmp err stderr cc c i usr local include test c test c fatal error gmp h file not found include error generated error report the following actions were aborted ∗ install zarith the following actions failed ∗ install conf gmp the following changes have been performed ∗ install conf ∗ install conf perl ∗ install ocamlfind conf gmp troobleshooting this package relies on external system dependencies that may be missing opam depext conf gmp may help you find the correct installation for your system the former state can be restored with opam switch import opam system backup state export last login wed jul on opam install ocaml top the following actions will be performed ∗ install conf gtksourceview ∗ install ocp build beta ∗ install result ∗ install base bytes base ∗ install ocamlbuild ∗ install lablgtk ∗ install topkg ∗ install re ∗ install cmdliner ∗ install ocp indent ∗ install ocp index for ocp browser please also install package lambda term ∗ install ocaml top ∗ do you want to continue y gathering sources downloaded downloaded downloaded downloaded processing downloaded downloaded downloaded downloaded downloaded downloaded processing actions ∗ installed base bytes base the compilation of conf gtksourceview failed at pkg config short errors print errors gtksourceview ∗ installed result ∗ 
installed ocamlbuild ∗ installed topkg ∗ installed cmdliner ∗ installed re ∗ installed ocp build beta ∗ installed ocp indent ∗ installed ocp index error while installing conf gtksourceview opam version os darwin command pkg config short errors print errors gtksourceview path users opam system build conf gtksourceview compiler system exit code env file users opam system build conf gtksourceview conf gtksourceview env stdout file users opam system build conf gtksourceview conf gtksourceview out stderr file users opam system build conf gtksourceview conf gtksourceview err error report the following actions were aborted ∗ install lablgtk ∗ install ocaml top the following actions failed ∗ install conf gtksourceview the following changes have been performed ∗ install base bytes base ∗ install cmdliner ∗ install ocamlbuild ∗ install ocp build beta ∗ install ocp indent ∗ install ocp index ∗ install re ∗ install result ∗ install topkg ocp indent installed successfully this package requires additional configuration for use in editors install package user setup or manually for emacs add these lines to emacs add to list load path users opam system share emacs site lisp require ocp indent for vim add this line to vimrc set rtp users opam system share ocp indent vim ocp index installed successfully this package requires additional configuration for use in editors either install package user setup or manually for emacs add these lines to emacs add to list load path users opam system share emacs site lisp require ocp index for vim add the following line to vimrc set rtp users opam system share ocp index vim conf gtksourceview troubleshooting this package requires gtksourceview development packages installed on your system on osx install the gtksourceview packages and define pkg config path export pkg config path usr lib pkgconfig usr local lib pkgconfig usr local opt lib pkgconfig the former state can be restored with opam switch import opam system backup state export | 1 |
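The conf-gtksourceview troubleshooting above boils down to extending `PKG_CONFIG_PATH` before retrying `opam install`. A minimal sketch of that environment change follows; the directories are the ones quoted in the opam hint and are assumptions about a typical X11/Homebrew layout, so they may differ per machine:

```python
import os

# Directories quoted in the opam troubleshooting message (assumed layout;
# adjust to wherever gtksourceview/libxml2 actually live on your machine).
pkg_dirs = [
    "/usr/X11/lib/pkgconfig",
    "/usr/local/lib/pkgconfig",
    "/usr/local/opt/libxml2/lib/pkgconfig",
]

# Prepend them to any existing PKG_CONFIG_PATH instead of clobbering it,
# so pkg-config still finds everything it found before.
existing = os.environ.get("PKG_CONFIG_PATH", "")
parts = pkg_dirs + ([existing] if existing else [])
os.environ["PKG_CONFIG_PATH"] = ":".join(parts)

print(os.environ["PKG_CONFIG_PATH"])
```

With the variable exported this way (or via the equivalent `export PKG_CONFIG_PATH=...` in the shell), `pkg-config --short-errors --print-errors gtksourceview-2.0` — the exact command that failed above — can locate the `.pc` files, provided the development packages themselves are installed.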
4,746 | 24,487,561,558 | IssuesEvent | 2022-10-09 16:41:57 | Vivelin/SMZ3Randomizer | https://api.github.com/repos/Vivelin/SMZ3Randomizer | closed | Merge Tracker and Randomizer classes | :wrench: maintainability | Currently it can be a bit confusing as there are basically two separate sets of objects (for the most part). There are randomizer objects such as Region, Item, Location, etc., and then there are tracker objects such as RegionInfo, ItemInfo, LocationInfo, etc. This ends up with a lot of duplicate and conflicting sets of data, such as what is considered progression.
My thought would be that once we pull over the tracker classes and configs into their own project, we merge all properties in the randomizer classes into the tracker classes and use them instead for all things except for Logic. Logic could be split out into its own project or section and basically just be classes and properties with the logic for each location that is referenced by the tracker objects. | True | Merge Tracker and Randomizer classes - Currently it can be a bit confusing as there are basically two separate sets of objects (for the most part). There are randomizer objects such as Region, Item, Location, etc., and then there are tracker objects such as RegionInfo, ItemInfo, LocationInfo, etc. This ends up with a lot of duplicate and conflicting sets of data, such as what is considered progression.
My thought would be once we pull over the tracker classes and configs into their own project, we merge all properties in the randomizer classes into the tracker classes and use them instead for all things except for Logic. Logic could be split out into its own project or section and basically just be classes and properties with the logic for each location that is referenced by the tracker objects. | main | merge tracker and randomizer classes currently it can be a bit confusing as there are basically two separates sets of objects for the most part there are randomizer objects such as region item location etc and then there are tracker objects such as regioninfo iteminfo locationinfo etc this ends up with a lot of duplicate and conflicting sets of data such as what is considered progression my thought would be once we pull over the tracker classes and configs into their own project we merge all properties in the randomizer classes into the tracker classes and use them instead for all things except for logic logic could be split out into its own project or section and basically just be classes and properties with the logic for each location that is referenced by the tracker objects | 1 |
352 | 3,255,609,058 | IssuesEvent | 2015-10-20 09:33:33 | embox/embox | https://api.github.com/repos/embox/embox | closed | Remove non C-library headers from root include | accepted bug fixed imported maintainability module:lib prio:normal | C standard library headers are listed here: http://en.wikipedia.org/wiki/Category:C_standard_library_headers
These headers should be moved or dropped altogether:
header | destination
--------------- | -----------
bitops.h | util/
crc32.h | util/crypt
err.h | drop
getopt.h | compat/posix/include/ or rewrite into cmd/util.h
md5.h | util/crypt
netutils.h | net/misc.h or net/util.h
open_prom.h | ???
queue.h | util/queue.h or util/list.h
shell_command.h | embox/cmd.h
shell_utils.h | drop
termios.h | compat/posix/include/
time.h | compat/posix/include/
types.h | drop or move into embox/types.h
unistd.h | compat/posix/include/ | True | Remove non C-library headers from root include - C standard library headers are listed here: http://en.wikipedia.org/wiki/Category:C_standard_library_headers
These headers should be moved or dropped altogether:
header | destination
--------------- | -----------
bitops.h | util/
crc32.h | util/crypt
err.h | drop
getopt.h | compat/posix/include/ or rewrite into cmd/util.h
md5.h | util/crypt
netutils.h | net/misc.h or net/util.h
open_prom.h | ???
queue.h | util/queue.h or util/list.h
shell_command.h | embox/cmd.h
shell_utils.h | drop
termios.h | compat/posix/include/
time.h | compat/posix/include/
types.h | drop or move into embox/types.h
unistd.h | compat/posix/include/ | main | remove non c library headers from root include c standard library headers are listed here these headers should be moved or dropped at all header destination bitops h util h util crypt err h drop getopt h compat posix include or rewrite into cmd util h h util crypt netutils h net misc h or net util h open prom h queue h util queue h or util list h shell command h embox cmd h shell utils h drop termios h compat posix include time h compat posix include types h drop or move into embox types h unistd h compat posix include | 1 |
150,062 | 5,735,812,619 | IssuesEvent | 2017-04-22 01:48:54 | zeit/next.js | https://api.github.com/repos/zeit/next.js | closed | Resolve Webpack deprecation warning for loaderUtils.parseQuery() | Priority: Low | Seeing the following deprecation warning after updating to the latest version of Next beta. Can anyone else confirm?
```
DeprecationWarning: loaderUtils.parseQuery() received a non-string value which can be problematic, see https://github.com/webpack/loader-utils/issues/56 parseQuery() will be replaced with getOptions() in the next major version of loader-utils.
```` | 1.0 | Resolve Webpack deprecation warning for loaderUtils.parseQuery() - Seeing the following deprecation warning after updating to the latest version of Next beta. Can anyone else confirm?
```
DeprecationWarning: loaderUtils.parseQuery() received a non-string value which can be problematic, see https://github.com/webpack/loader-utils/issues/56 parseQuery() will be replaced with getOptions() in the next major version of loader-utils.
```` | non_main | resolve webpack deprecation warning for loaderutils parsequery seeing the following deprecation warning after updating to the latest version of next beta can anyone else confirm deprecationwarning loaderutils parsequery received a non string value which can be problematic see parsequery will be replaced with getoptions in the next major version of loader utils | 0 |
3,405 | 13,181,832,970 | IssuesEvent | 2020-08-12 14:53:41 | duo-labs/cloudmapper | https://api.github.com/repos/duo-labs/cloudmapper | closed | Add MSK support for network map | map unmaintained_functionality | I had grabbed MSK data for the audit, but am not using that anymore. I should use this for the network map. | True | Add MSK support for network map - I had grabbed MSK data for the audit, but am not using that anymore. I should use this for the network map. | main | add msk support for network map i had grabbed msk data for the audit but am not using that anymore i should use this for the network map | 1 |
572,688 | 17,023,519,648 | IssuesEvent | 2021-07-03 02:26:44 | tomhughes/trac-tickets | https://api.github.com/repos/tomhughes/trac-tickets | closed | Search for more types of nodes | Component: nominatim Priority: minor Resolution: fixed Type: enhancement | **[Submitted to the original trac issue database at 7.20am, Sunday, 29th November 2009]**
This is about OpenSatNav again. We have a POI search that will find the nearest restaurant, pub, etc. If I search for "pubs near adelaide", Nominatim seems to do the right thing. However, some places such as hotels and ATMs don't work. It would be awesome if Nominatim could find everything that counts as an amenity in OSM.
This is about OpenSatNav again. We have a POI search that will find the nearest restaurant, pub, etc. If I search for "pubs near adelaide", Nominatim seems to do the right thing. However, some places such as hotels and ATMs don't work. It would be awesome if Nominatim could find everything that counts as an amenity in OSM.
174,531 | 14,485,617,298 | IssuesEvent | 2020-12-10 17:49:00 | MyFormworks/Formworks | https://api.github.com/repos/MyFormworks/Formworks | closed | Make an release branch | dev-task documentation tools | Our framework currently supports only text components so, for our first release, we should only include file that are being used. | 1.0 | Make an release branch - Our framework currently supports only text components so, for our first release, we should only include file that are being used. | non_main | make an release branch our framework currently supports only text components so for our first release we should only include file that are being used | 0 |
67,228 | 8,114,402,704 | IssuesEvent | 2018-08-15 00:07:29 | elastic/kibana | https://api.github.com/repos/elastic/kibana | closed | Index Pattern page to display the error response | :Design release_note:enhancement | Using 5.1
The UI does not display any error message when it can't find the index pattern. The UI should display an error message with useful information. In this case, Kibana is unable to find a matching index pattern because the logged in user does not have sufficient permissions to search for indices.

cc: @spalger | 1.0 | Index Pattern page to display the error response - Using 5.1
The UI does not display any error message when it can't find the index pattern. The UI should display an error message with useful information. In this case, Kibana is unable to find a matching index pattern because the logged in user does not have sufficient permissions to search for indices.

cc: @spalger | non_main | index pattern page to display the error response using the ui does not display any error message when it can t find the index pattern the ui should display an error message with useful information in this case kibana is unable to find a matching index pattern because the logged in user does not have sufficient permissions to search for indices cc spalger | 0 |
5,419 | 27,207,910,619 | IssuesEvent | 2023-02-20 14:25:50 | warengonzaga/thirdweb-support-discord-bot | https://api.github.com/repos/warengonzaga/thirdweb-support-discord-bot | opened | refactor code | chore maintainers only tweak | ## Notes
- Use `for...of` and `Collection.find()` to check the user IDs of messages within the threads.
- Modularize each feature.
- Structure the bot files. | True | refactor code - ## Notes
- Use `for...of` and `Collection.find()` to check the user IDs of messages within the threads.
- Modularize each feature.
- Structure the bot files. | main | refactor code notes use and collection find to check user ids of messages within the threads modular each feature structure the bot files | 1 |
3,450 | 13,214,035,387 | IssuesEvent | 2020-08-16 15:45:36 | ansible/ansible | https://api.github.com/repos/ansible/ansible | closed | Terraform documentation should have more examples | affects_2.8 bot_closed bug cloud collection collection:community.general has_pr module needs_collection_redirect needs_maintainer support:community | ##### SUMMARY
The Terraform documentation currently only has two examples, both of which relate to `init`. Additional examples showing proper usage of `plan` and `apply` could be useful, including refresh operations, since Terraform may get out of sync with production due to Ansible.
##### ISSUE TYPE
- Bug Report
##### COMPONENT NAME
Terraform
##### ANSIBLE VERSION
```paste below
2.8.0
``` | True | Terraform documentation should have more examples - ##### SUMMARY
The Terraform documentation currently only has two examples, both of which relate to `init`. Additional examples showing proper usage of `plan` and `apply` could be useful, including refresh operations, since Terraform may get out of sync with production due to Ansible.
##### ISSUE TYPE
- Bug Report
##### COMPONENT NAME
Terraform
##### ANSIBLE VERSION
```paste below
2.8.0
``` | main | terraform documentation should have more examples summary the terraform documentation currently only has two examples both relate to init related terraform examples additional examples showing proper usage of plan and apply could be useful including refresh operations since terraform may get out of sync with production due to ansible issue type bug report component name terraform ansible version paste below | 1 |
5,858 | 31,388,243,560 | IssuesEvent | 2023-08-26 02:42:27 | backdrop-contrib/rules | https://api.github.com/repos/backdrop-contrib/rules | closed | Error: Class 'RulesLog' not found in rules_update_1000() | type - bug pr - maintainer review requested | I'm trying to Migrate a Drupal 7.92 site using Rules 7.x-2.13 to a fresh Backdrop 1.23.0 site using Rules 1.x-2.3.2. When I get to the core/update.php step I'm getting an error:
```Internal Server Error ResponseText: Error: Class 'RulesLog' not found in rules_update_1000() (line 205 of /var/www/html/modules/contrib/rules/rules.install).```
Feels like I'm missing something basic here? Anyone have ideas? | True | Error: Class 'RulesLog' not found in rules_update_1000() - I'm trying to Migrate a Drupal 7.92 site using Rules 7.x-2.13 to a fresh Backdrop 1.23.0 site using Rules 1.x-2.3.2. When I get to the core/update.php step I'm getting an error:
```Internal Server Error ResponseText: Error: Class 'RulesLog' not found in rules_update_1000() (line 205 of /var/www/html/modules/contrib/rules/rules.install).```
Feels like I'm missing something basic here? Anyone have ideas? | main | error class ruleslog not found in rules update i m trying to migrate a drupal site using rules x to a fresh backdrop site using rules x when i get to the core update php step i m getting an error internal server error responsetext error class ruleslog not found in rules update line of var www html modules contrib rules rules install feels like i m missing something basic here anyone have ideas | 1 |
892 | 4,553,457,459 | IssuesEvent | 2016-09-13 04:57:46 | ansible/ansible-modules-core | https://api.github.com/repos/ansible/ansible-modules-core | closed | Bad value substitution in ini_file Module with percent placeholder values | affects_1.7 bug_report P3 waiting_on_maintainer | ##### Issue Type:
Bug
##### Component Name:
ini_file module
##### Ansible Version:
1.7.2
1.8.2
##### Environment:
Mac OSX 10.10.1 Yosemite
##### Summary:
I need special values in an ini-file to configure my supervisor-daemon. For example a value looks like this:
process_name=%(program_name)s
##### Steps To Reproduce:
To reproduce this issue, run the following playbook twice.
```
---
- hosts: all
tasks:
- ini_file: dest="/tmp/tmp.ini" section="program:update" option="process_name" value="%(program_name)s"
```
##### Expected Results:
After first run everything OK, after second run I get an error and nothing happens.
##### Actual Results:
```
ConfigParser.InterpolationMissingOptionError: Bad value substitution:
section: [program:update]
option : process_name
key : program_name
rawval : %(program_name)s
``` | True | Bad value substitution in ini_file Module with percent placeholder values - ##### Issue Type:
Bug
##### Component Name:
ini_file module
##### Ansible Version:
1.7.2
1.8.2
##### Environment:
Mac OSX 10.10.1 Yosemite
##### Summary:
I need special values in an ini-file to configure my supervisor-daemon. For example a value looks like this:
process_name=%(program_name)s
##### Steps To Reproduce:
To reproduce this issue, run the following playbook twice.
```
---
- hosts: all
tasks:
- ini_file: dest="/tmp/tmp.ini" section="program:update" option="process_name" value="%(program_name)s"
```
##### Expected Results:
After first run everything OK, after second run I get an error and nothing happens.
##### Actual Results:
```
ConfigParser.InterpolationMissingOptionError: Bad value substitution:
section: [program:update]
option : process_name
key : program_name
rawval : %(program_name)s
``` | main | bad value substitution in ini file module with percent placeholder values issue type bug component name ini file module ansible version environment mac osx yosemite summary i need special values in an ini file to configure my supervisor daemon for example a value looks like this process name program name s steps to reproduce to reporduce this issue run following playbook twice hosts all tasks ini file dest tmp tmp ini section program update option process name value program name s expected results after first run everything ok after second run i get an error and nothing happens actual results configparser interpolationmissingoptionerror bad value substitution section option process name key program name rawval program name s | 1 |
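The `InterpolationMissingOptionError` above comes from Python's ConfigParser, which by default treats `%(name)s` in values as an interpolation reference. The sketch below reproduces the failure and the usual workaround of reading with interpolation disabled; it uses Python 3's `configparser` (the Ansible 1.x module ran on Python 2's `ConfigParser`, which behaves the same way here), and whether the module itself can apply this depends on how it invokes the parser:

```python
import configparser

ini_text = "[program:update]\nprocess_name = %(program_name)s\n"

# A default ConfigParser interpolates %(...)s on read, so a supervisor-style
# placeholder with no matching option raises the error from the report.
strict = configparser.ConfigParser()
strict.read_string(ini_text)
try:
    strict.get("program:update", "process_name")
except configparser.InterpolationMissingOptionError as exc:
    print("bad value substitution for option:", exc.option)

# With interpolation disabled (or with get(..., raw=True)) the placeholder
# is returned verbatim, which is what the playbook wants in the file.
raw = configparser.ConfigParser(interpolation=None)
raw.read_string(ini_text)
print(raw.get("program:update", "process_name"))
```

An alternative workaround on the input side is to escape the percent sign in the value (`%%(program_name)s` is read back as `%(program_name)s` by an interpolating parser).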
133,721 | 18,947,048,807 | IssuesEvent | 2021-11-18 11:18:29 | w3c/w3c-website | https://api.github.com/repos/w3c/w3c-website | closed | Upload the compiled assets to our CDN | design system | **Describe the issue**
We need a single, consistent location to load the design system static assets for the w3.org site and others.
@deniak has recommended:
> After giving more thoughts to that topic, the npm package might not be necessary and will add more complexity to the process (version management, etc).
> We also expect most people to simply use the different components without having to deal with SASS or to have some knowledge of npm or composer.
> In the end, it'll just be simpler to upload the compiled assets to our CDN so that people can just link to these and start using the components in their pages.
> We don't need to keep different versions available and the CDN will always be up-to-date with the latest revision of the repository under the following URL:
> https://cdn.w3.org/assets/website-2021/{styles,js,svg,images,fonts}/
>
> We are thinking about building a GitHub action to take care of validating the different assets, upload them to the CDN and invalidate the cache.
**Solution**
1. Work on GitHub action to copy static assets into the CDN once code is merged into main.
2. Update [getting started](https://github.com/w3c/w3c-website-templates-bundle/blob/main/docs/getting-started.md) docs on how users can use the design system assets outside of Symfony.
**Additional context**
See Basecamp thread on this https://3.basecamp.com/3091560/buckets/15695452/messages/4121857311
| 1.0 | Upload the compiled assets to our CDN - **Describe the issue**
We need a single, consistent location to load the design system static assets for the w3.org site and others.
@deniak has recommended:
> After giving more thoughts to that topic, the npm package might not be necessary and will add more complexity to the process (version management, etc).
> We also expect most people to simply use the different components without having to deal with SASS or to have some knowledge of npm or composer.
> In the end, it'll just be simpler to upload the compiled assets to our CDN so that people can just link to these and start using the components in their pages.
> We don't need to keep different versions available and the CDN will always be up-to-date with the latest revision of the repository under the following URL:
> https://cdn.w3.org/assets/website-2021/{styles,js,svg,images,fonts}/
>
> We are thinking about building a GitHub action to take care of validating the different assets, upload them to the CDN and invalidate the cache.
**Solution**
1. Work on GitHub action to copy static assets into the CDN once code is merged into main.
2. Update [getting started](https://github.com/w3c/w3c-website-templates-bundle/blob/main/docs/getting-started.md) docs on how users can use the design system assets outside of Symfony.
**Additional context**
See Basecamp thread on this https://3.basecamp.com/3091560/buckets/15695452/messages/4121857311
| non_main | upload the compiled assets to our cdn describe the issue we need a single consistent location to load the design system static assets for the org site and others deniak has recommended after giving more thoughts to that topic the npm package might not be necessary and will add more complexity to the process version management etc we also expect most people to simply use the different components without having to deal with sass or to have some knowledge of npm or composer in the end it ll just be simpler to upload the compiled assets to our cdn so that people can just link to these and start using the components in their pages we don t need to keep different versions available and the cdn will always be up to date with the latest revision of the repository under the following url we are thinking about building a github action to take care of validating the different assets upload them to the cdn and invalidate the cache solution work on github action to copy static assets into the cdn once code is merged into main update docs on how users can use the design system assets outside of symfony additional context see basecamp thread on this | 0 |
5,355 | 26,965,905,746 | IssuesEvent | 2023-02-08 22:20:39 | aws/aws-sam-cli | https://api.github.com/repos/aws/aws-sam-cli | closed | Proposal for a better sam local support of docker-machine | type/ux type/feature area/docker stage/needs-feedback type/feedback maintainer/need-response | ## Background
I use docker-machine as my docker daemon, so I have one VM per project, which I can destroy when I'm done, keeping my host system pristine and without docker installed.
I've been meaning to use SAM CLI, and so far I couldn't.
So I set out investigating, and here is the explanation of the problem, and a proposal for a solution that, if implemented, will save a lot of support issues and confusion amongs docker-machine users in future.
Maybe docker-machine afficionados know all this already, but it took me enough hours to figure out that I wanted to provide the canonical solution, to leave things in a better state than I found it.
And if you're not going to implement the proposed solution, then maybe you'll at least add it as a known problem with the suggested solution, as part of the documentation.
## Problem
`sam local` expects the filesystem of the sam application source code to be visible to the docker daemon, which **it is not** when using docker-machine. docker-machine runs in a separate (usually VirtualBox) VM.
Taking the example of java8 HelloWorld:
```text
2019-07-06 11:13:13 Mounting /tmp/sam-app/.aws-sam/build/HelloWorldFunction as /var/task:ro,delegated inside runtime container
```
there are three environments at work
- my work machine, which contains the /tmp/sam-app source directory
- the docker-machine VM, which does not contain any /tmp/sam-app directory
- the container which has /var/task mount that points to the non-existent /tmp/sam-app directory in the docker-machine VM. ie /var/task is **empty**
The missing link is between my work machine /tmp/sam-app and the docker-machine VM /tmp/sam-app. Unless the user takes extra steps to share/mount/sync/ the source code directory under **the same path** and **with read permissions to the docker daemon** in the docker-machine VM, the /var/task mount will be empty.
A couple issues I've seen related to this problem are:
- #1126: The problem disappeared when the reporter stopped using docker-toolbox, which is basically docker-machine
- #571: The one that I first encountered, ClassNotFoundException there basically means the sam source directory is nowhere to be found in the docker-machine VM, it is not a permission issue, as I was led to believe.
Since there is no obvious way from the output of sam local to determine whether a user is using docker-machine (hint: there should be), there may be more instances of this problem lurking around in the issue tracker.
## Proposed solution
In the `sam build` step, if:
- `${DOCKER_MACHINE_NAME}` is defined **AND**
- `docker-machine ssh "${DOCKER_MACHINE_NAME}" uname -r` ends with `boot2docker`
Then `sam` should perform the following additional steps (or their equivalent through python)
```bash
docker-machine ssh ${DOCKER_MACHINE_NAME} \
"mkdir -p \"$(pwd)\" && chown docker.docker \"$(pwd)\""
docker-machine scp --recursive --delta $(pwd) ${DOCKER_MACHINE_NAME}:$(pwd)
```
By doing these steps manually, I was finally able to run `sam local invoke` using docker-machine.
```text
/tmp/sam-app$ docker-machine scp --recursive --delta $(pwd) ${DOCKER_MACHINE_NAME}:$(pwd)
sending incremental file list
sam-app/
sam-app/README.md
4,929 100% 0.00kB/s 0:00:00 (xfr#1, to-chk=24/26)
sam-app/empty.json
3 100% 2.93kB/s 0:00:00 (xfr#2, to-chk=23/26)
sam-app/issue.md
1,293 100% 1.23MB/s 0:00:00 (xfr#3, to-chk=22/26)
sam-app/template.yaml
1,837 100% 1.75MB/s 0:00:00 (xfr#4, to-chk=21/26)
sam-app/.aws-sam/
sam-app/.aws-sam/build/
sam-app/.aws-sam/build/template.yaml
1,072 100% 1.02MB/s 0:00:00 (xfr#5, to-chk=17/26)
sam-app/.aws-sam/build/HelloWorldFunction/
sam-app/.aws-sam/build/HelloWorldFunction/helloworld/
sam-app/.aws-sam/build/HelloWorldFunction/helloworld/App.class
2,877 100% 2.74MB/s 0:00:00 (xfr#6, to-chk=13/26)
sam-app/.aws-sam/build/HelloWorldFunction/helloworld/GatewayResponse.class
1,216 100% 1.16MB/s 0:00:00 (xfr#7, to-chk=12/26)
sam-app/.aws-sam/build/HelloWorldFunction/lib/
sam-app/.aws-sam/build/HelloWorldFunction/lib/aws-lambda-java-core-1.2.0.jar
7,410 100% 7.07MB/s 0:00:00 (xfr#8, to-chk=11/26)
sam-app/HelloWorldFunction/
sam-app/HelloWorldFunction/pom.xml
1,509 100% 1.44MB/s 0:00:00 (xfr#9, to-chk=10/26)
sam-app/HelloWorldFunction/src/
sam-app/HelloWorldFunction/src/main/
sam-app/HelloWorldFunction/src/main/java/
sam-app/HelloWorldFunction/src/main/java/helloworld/
sam-app/HelloWorldFunction/src/main/java/helloworld/App.java
1,399 100% 1.33MB/s 0:00:00 (xfr#10, to-chk=4/26)
sam-app/HelloWorldFunction/src/main/java/helloworld/GatewayResponse.java
760 100% 742.19kB/s 0:00:00 (xfr#11, to-chk=3/26)
sam-app/HelloWorldFunction/src/test/
sam-app/HelloWorldFunction/src/test/java/
sam-app/HelloWorldFunction/src/test/java/helloworld/
sam-app/HelloWorldFunction/src/test/java/helloworld/AppTest.java
701 100% 684.57kB/s 0:00:00 (xfr#12, to-chk=0/26)
(sam-java8-helloworld-classnotfound) lestephane@le-studio-xps:/tmp/sam-app$ sam local invoke
2019-07-06 11:13:44 Reading invoke payload from stdin (you can also pass it from file with --event)
{}
2019-07-06 11:13:49 Invoking helloworld.App::handleRequest (java8)
Fetching lambci/lambda:java8 Docker container image......
2019-07-06 11:13:53 Mounting /tmp/sam-app/.aws-sam/build/HelloWorldFunction as /var/task:ro,delegated inside runtime container
START RequestId: 6c4bafc1-2d84-417f-8edc-8f22077d088c Version: $LATEST
END RequestId: 6c4bafc1-2d84-417f-8edc-8f22077d088c
REPORT RequestId: 6c4bafc1-2d84-417f-8edc-8f22077d088c Duration: 1852.32 ms Billed Duration: 1900 ms Memory Size: 128 MB Max Memory Used: 5 MB
{"body":"{ \"message\": \"hello world\", \"location\": \"62.228.125.216\" }","headers":{"X-Custom-Header":"application/json","Content-Type":"application/json"},"statusCode":200}
```
## Caveat: Hot reloads
Hot reloading of code will obviously not work. Basically docker-machine users will have to do `sam build` everytime they change their code. And that's OK I think.
The following statement in the SAM output will therefore not be applicable
```text
You do not need to restart/reload SAM CLI while working on your functions, changes will be reflected instantly/automatically
```
And so reloading is clearly out of scope.
## Why not vboxmanage / virtualbox shared folders?
A workaround that requires the user to have to go modify shared folder settings in VirtualBox, especially when doing a HelloWorld example, adds friction, and still does not ensure that the shared folders behave like real folders, or have the correct permissions. So you end up with more issues from newbies.
In addition, vboxmanage + virtual shared folders are virtualbox specific.
docker-machine can also be used with other machine drivers like Hyper-V and VM-Ware.
## Why not docker-machine mount?
It mounts a filesystem directory of the docker-machine VM as a local source directory, which is rather useless. You want the Docker VM to mount the user's source code directory, not the other way around. And to this day, docker-machine has no mechanism to achieve that.
## Why not NFS, X, Y etc...?
TLDR, as long as a local docker daemon is present, things should work without additional configuration.
A **LOT** of people are using docker-machine, knowingly or not.
The suggested solution shows how the situation can be improved, if not automatically through `sam`, at least in a repeatable manual way.
| True | Proposal for a better sam local support of docker-machine - ## Background
I use docker-machine as my docker daemon, so I have one VM per project, which I can destroy when I'm done, keeping my host system pristine and without docker installed.
I've been meaning to use SAM CLI, and so far I couldn't.
So I set out investigating, and here is the explanation of the problem, and a proposal for a solution that, if implemented, will save a lot of support issues and confusion amongs docker-machine users in future.
Maybe docker-machine afficionados know all this already, but it took me enough hours to figure out that I wanted to provide the canonical solution, to leave things in a better state than I found it.
And if you're not going to implement the proposed solution, then maybe you'll at least add it as a known problem with the suggested solution, as part of the documentation.
## Problem
`sam local` expects the filesystem of the sam application source code to be visible to the docker daemon, which **it is not** when using docker-machine. docker-machine runs in a separate (usually VirtualBox) VM.
Taking the example of java8 HelloWorld:
```text
2019-07-06 11:13:13 Mounting /tmp/sam-app/.aws-sam/build/HelloWorldFunction as /var/task:ro,delegated inside runtime container
```
there are three environments at work
- my work machine, which contains the /tmp/sam-app source directory
- the docker-machine VM, which does not contain any /tmp/sam-app directory
- the container which has /var/task mount that points to the non-existent /tmp/sam-app directory in the docker-machine VM. ie /var/task is **empty**
The missing link is between my work machine /tmp/sam-app and the docker-machine VM /tmp/sam-app. Unless the user takes extra steps to share/mount/sync/ the source code directory under **the same path** and **with read permissions to the docker daemon** in the docker-machine VM, the /var/task mount will be empty.
A couple issues I've seen related to this problem are:
- #1126: The problem disappeared when the reporter stopped using docker-toolbox, which is basically docker-machine
- #571: The one that I first encountered, ClassNotFoundException there basically means the sam source directory is nowhere to be found in the docker-machine VM, it is not a permission issue, as I was led to believe.
Since there is no obvious way from the output of sam local to determine whether a user is using docker-machine (hint: there should be), there may be more instances of this problem lurking around in the issue tracker.
## Proposed solution
In the `sam build` step, if:
- `${DOCKER_MACHINE_NAME}` is defined **AND**
- `docker-machine ssh "${DOCKER_MACHINE_NAME}" uname -r` ends with `boot2docker`
Then `sam` should perform the following additional steps (or their equivalent through python)
```bash
docker-machine ssh ${DOCKER_MACHINE_NAME} \
"mkdir -p \"$(pwd)\" && chown docker.docker \"$(pwd)\""
docker-machine scp --recursive --delta $(pwd) ${DOCKER_MACHINE_NAME}:$(pwd)
```
By doing these steps manually, I was finally able to run `sam local invoke` using docker-machine.
```text
/tmp/sam-app$ docker-machine scp --recursive --delta $(pwd) ${DOCKER_MACHINE_NAME}:$(pwd)
sending incremental file list
sam-app/
sam-app/README.md
4,929 100% 0.00kB/s 0:00:00 (xfr#1, to-chk=24/26)
sam-app/empty.json
3 100% 2.93kB/s 0:00:00 (xfr#2, to-chk=23/26)
sam-app/issue.md
1,293 100% 1.23MB/s 0:00:00 (xfr#3, to-chk=22/26)
sam-app/template.yaml
1,837 100% 1.75MB/s 0:00:00 (xfr#4, to-chk=21/26)
sam-app/.aws-sam/
sam-app/.aws-sam/build/
sam-app/.aws-sam/build/template.yaml
1,072 100% 1.02MB/s 0:00:00 (xfr#5, to-chk=17/26)
sam-app/.aws-sam/build/HelloWorldFunction/
sam-app/.aws-sam/build/HelloWorldFunction/helloworld/
sam-app/.aws-sam/build/HelloWorldFunction/helloworld/App.class
2,877 100% 2.74MB/s 0:00:00 (xfr#6, to-chk=13/26)
sam-app/.aws-sam/build/HelloWorldFunction/helloworld/GatewayResponse.class
1,216 100% 1.16MB/s 0:00:00 (xfr#7, to-chk=12/26)
sam-app/.aws-sam/build/HelloWorldFunction/lib/
sam-app/.aws-sam/build/HelloWorldFunction/lib/aws-lambda-java-core-1.2.0.jar
7,410 100% 7.07MB/s 0:00:00 (xfr#8, to-chk=11/26)
sam-app/HelloWorldFunction/
sam-app/HelloWorldFunction/pom.xml
1,509 100% 1.44MB/s 0:00:00 (xfr#9, to-chk=10/26)
sam-app/HelloWorldFunction/src/
sam-app/HelloWorldFunction/src/main/
sam-app/HelloWorldFunction/src/main/java/
sam-app/HelloWorldFunction/src/main/java/helloworld/
sam-app/HelloWorldFunction/src/main/java/helloworld/App.java
1,399 100% 1.33MB/s 0:00:00 (xfr#10, to-chk=4/26)
sam-app/HelloWorldFunction/src/main/java/helloworld/GatewayResponse.java
760 100% 742.19kB/s 0:00:00 (xfr#11, to-chk=3/26)
sam-app/HelloWorldFunction/src/test/
sam-app/HelloWorldFunction/src/test/java/
sam-app/HelloWorldFunction/src/test/java/helloworld/
sam-app/HelloWorldFunction/src/test/java/helloworld/AppTest.java
701 100% 684.57kB/s 0:00:00 (xfr#12, to-chk=0/26)
(sam-java8-helloworld-classnotfound) lestephane@le-studio-xps:/tmp/sam-app$ sam local invoke
2019-07-06 11:13:44 Reading invoke payload from stdin (you can also pass it from file with --event)
{}
2019-07-06 11:13:49 Invoking helloworld.App::handleRequest (java8)
Fetching lambci/lambda:java8 Docker container image......
2019-07-06 11:13:53 Mounting /tmp/sam-app/.aws-sam/build/HelloWorldFunction as /var/task:ro,delegated inside runtime container
START RequestId: 6c4bafc1-2d84-417f-8edc-8f22077d088c Version: $LATEST
END RequestId: 6c4bafc1-2d84-417f-8edc-8f22077d088c
REPORT RequestId: 6c4bafc1-2d84-417f-8edc-8f22077d088c Duration: 1852.32 ms Billed Duration: 1900 ms Memory Size: 128 MB Max Memory Used: 5 MB
{"body":"{ \"message\": \"hello world\", \"location\": \"62.228.125.216\" }","headers":{"X-Custom-Header":"application/json","Content-Type":"application/json"},"statusCode":200}
```
## Caveat: Hot reloads
Hot reloading of code will obviously not work. Basically docker-machine users will have to do `sam build` everytime they change their code. And that's OK I think.
The following statement in the SAM output will therefore not be applicable
```text
You do not need to restart/reload SAM CLI while working on your functions, changes will be reflected instantly/automatically
```
And so reloading is clearly out of scope.
## Why not vboxmanage / virtualbox shared folders?
A workaround that requires the user to have to go modify shared folder settings in VirtualBox, especially when doing a HelloWorld example, adds friction, and still does not ensure that the shared folders behave like real folders, or have the correct permissions. So you end up with more issues from newbies.
In addition, vboxmanage + virtual shared folders are virtualbox specific.
docker-machine can also be used with other machine drivers like Hyper-V and VM-Ware.
## Why not docker-machine mount?
It mounts a filesystem directory of the docker-machine VM as a local source directory, which is rather useless. You want the Docker VM to mount the user's source code directory, not the other way around. And to this day, docker-machine has no mechanism to achieve that.
## Why not NFS, X, Y etc...?
TLDR, as long as a local docker daemon is present, things should work without additional configuration.
A **LOT** of people are using docker-machine, knowingly or not.
The suggested solution shows how the situation can be improved, if not automatically through `sam`, at least in a repeatable manual way.
| main | proposal for a better sam local support of docker machine background i use docker machine as my docker daemon so i have one vm per project which i can destroy when i m done keeping my host system pristine and without docker installed i ve been meaning to use sam cli and so far i couldn t so i set out investigating and here is the explanation of the problem and a proposal for a solution that if implemented will save a lot of support issues and confusion amongs docker machine users in future maybe docker machine afficionados know all this already but it took me enough hours to figure out that i wanted to provide the canonical solution to leave things in a better state than i found it and if you re not going to implement the proposed solution then maybe you ll at least add it as a known problem with the suggested solution as part of the documentation problem sam local expects the filesystem of the sam application source code to be visible to the docker daemon which it is not when using docker machine docker machine runs in a separate usually virtualbox vm taking the example of helloworld text mounting tmp sam app aws sam build helloworldfunction as var task ro delegated inside runtime container there are three environments at work my work machine which contains the tmp sam app source directory the docker machine vm which does not contain any tmp sam app directory the container which has var task mount that points to the non existent tmp sam app directory in the docker machine vm ie var task is empty the missing link is between my work machine tmp sam app and the docker machine vm tmp sam app unless the user takes extra steps to share mount sync the source code directory under the same path and with read permissions to the docker daemon in the docker machine vm the var task mount will be empty a couple issues i ve seen related to this problem are the problem disappeared when the reporter stopped using docker toolbox which is basically docker machine the one 
that i first encountered classnotfoundexception there basically means the sam source directory is nowhere to be found in the docker machine vm it is not a permission issue as i was led to believe since there is no obvious way from the output of sam local to determine whether a user is using docker machine hint there should be there may be more instances of this problem lurking around in the issue tracker proposed solution in the sam build step if docker machine name is defined and docker machine ssh docker machine name uname r ends with then sam should perform the following additional steps or their equivalent through python bash docker machine ssh docker machine name mkdir p pwd chown docker docker pwd docker machine scp recursive delta pwd docker machine name pwd by doing these steps manually i was finally able to run sam local invoke using docker machine text tmp sam app docker machine scp recursive delta pwd docker machine name pwd sending incremental file list sam app sam app readme md s xfr to chk sam app empty json s xfr to chk sam app issue md s xfr to chk sam app template yaml s xfr to chk sam app aws sam sam app aws sam build sam app aws sam build template yaml s xfr to chk sam app aws sam build helloworldfunction sam app aws sam build helloworldfunction helloworld sam app aws sam build helloworldfunction helloworld app class s xfr to chk sam app aws sam build helloworldfunction helloworld gatewayresponse class s xfr to chk sam app aws sam build helloworldfunction lib sam app aws sam build helloworldfunction lib aws lambda java core jar s xfr to chk sam app helloworldfunction sam app helloworldfunction pom xml s xfr to chk sam app helloworldfunction src sam app helloworldfunction src main sam app helloworldfunction src main java sam app helloworldfunction src main java helloworld sam app helloworldfunction src main java helloworld app java s xfr to chk sam app helloworldfunction src main java helloworld gatewayresponse java s xfr to chk sam app 
helloworldfunction src test sam app helloworldfunction src test java sam app helloworldfunction src test java helloworld sam app helloworldfunction src test java helloworld apptest java s xfr to chk sam helloworld classnotfound lestephane le studio xps tmp sam app sam local invoke reading invoke payload from stdin you can also pass it from file with event invoking helloworld app handlerequest fetching lambci lambda docker container image mounting tmp sam app aws sam build helloworldfunction as var task ro delegated inside runtime container start requestid version latest end requestid report requestid duration ms billed duration ms memory size mb max memory used mb body message hello world location headers x custom header application json content type application json statuscode caveat hot reloads hot reloading of code will obviously not work basically docker machine users will have to do sam build everytime they change their code and that s ok i think the following statement in the sam output will therefore not be applicable text you do not need to restart reload sam cli while working on your functions changes will be reflected instantly automatically and so reloading is clearly out of scope why not vboxmanage virtualbox shared folders a workaround that requires the user to have to go modify shared folder settings in virtualbox especially when doing a helloworld example adds friction and still does not ensure that the shared folders behave like real folders or have the correct permissions so you end up with more issues from newbies in addition vboxmanage virtual shared folders are virtualbox specific docker machine can also be used with other machine drivers like hyper v and vm ware why not docker machine mount it mounts a filesystem directory of the docker machine vm as a local source directory which is rather useless you want the docker vm to mount the user s source code directory not the other way around and to this day docker machine has no mechanism to achieve 
that why not nfs x y etc tldr as long as a local docker daemon is present things should work without additional configuration a lot of people are using docker machine knowingly or not the suggested solution shows how the situation can be improved if not automatically through sam at least in a repeatable manual way | 1 |
322,720 | 23,920,321,933 | IssuesEvent | 2022-09-09 16:12:34 | Brain-Bones/skeleton | https://api.github.com/repos/Brain-Bones/skeleton | closed | AppBar navigation links are in incorrect order | bug documentation | 
Should be:
`Guides | Docs | Components | Utilities`
| 1.0 | AppBar navigation links are in incorrect order - 
Should be:
`Guides | Docs | Components | Utilities`
| non_main | appbar navigation links are in incorrect order should be guides docs components utilities | 0 |
1,390 | 6,018,902,415 | IssuesEvent | 2017-06-07 13:26:22 | caskroom/homebrew-cask | https://api.github.com/repos/caskroom/homebrew-cask | closed | Feature request: improve installer message | awaiting maintainer feedback | ### Description of feature/enhancement
i would like to see the installer message should include the command line for the command
### Justification
my tries to run the command ....
```
/usr/local/Caskroom/little-snitch/3.7.4/Little Snitch Installer.app
Black Screen: command "/usr/local/Caskroom/little-snitch/3.7.4/Little" not found.
'/usr/local/Caskroom/little-snitch/3.7.4/Little Snitch Installer.app'
bash: /usr/local/Caskroom/little-snitch/3.7.4/Little Snitch Installer.app: is a directory
open /usr/local/Caskroom/little-snitch/3.7.4/Little Snitch Installer.app
The files /usr/local/Caskroom/little-snitch/3.7.4/Little, /Users/muescha/Work/github/WWDC/Snitch, and /Users/muescha/Work/github/WWDC/Installer.app do not exist.
open '/usr/local/Caskroom/little-snitch/3.7.4/Little Snitch Installer.app'
```
### Example use case
```
To complete the installation of Cask little-snitch, you must also
run the installer at
'/usr/local/Caskroom/little-snitch/3.7.4/Little Snitch Installer.app'
🍺 little-snitch was successfully installed!
```
to
```
To complete the installation of Cask little-snitch, you must also
run the installer at
'/usr/local/Caskroom/little-snitch/3.7.4/Little Snitch Installer.app'
with command:
open '/usr/local/Caskroom/little-snitch/3.7.4/Little Snitch Installer.app'
🍺 little-snitch was successfully installed!
```
| True | Feature request: improve installer message - ### Description of feature/enhancement
i would like to see the installer message should include the command line for the command
### Justification
my tries to run the command ....
```
/usr/local/Caskroom/little-snitch/3.7.4/Little Snitch Installer.app
Black Screen: command "/usr/local/Caskroom/little-snitch/3.7.4/Little" not found.
'/usr/local/Caskroom/little-snitch/3.7.4/Little Snitch Installer.app'
bash: /usr/local/Caskroom/little-snitch/3.7.4/Little Snitch Installer.app: is a directory
open /usr/local/Caskroom/little-snitch/3.7.4/Little Snitch Installer.app
The files /usr/local/Caskroom/little-snitch/3.7.4/Little, /Users/muescha/Work/github/WWDC/Snitch, and /Users/muescha/Work/github/WWDC/Installer.app do not exist.
open '/usr/local/Caskroom/little-snitch/3.7.4/Little Snitch Installer.app'
```
### Example use case
```
To complete the installation of Cask little-snitch, you must also
run the installer at
'/usr/local/Caskroom/little-snitch/3.7.4/Little Snitch Installer.app'
🍺 little-snitch was successfully installed!
```
to
```
To complete the installation of Cask little-snitch, you must also
run the installer at
'/usr/local/Caskroom/little-snitch/3.7.4/Little Snitch Installer.app'
with command:
open '/usr/local/Caskroom/little-snitch/3.7.4/Little Snitch Installer.app'
🍺 little-snitch was successfully installed!
```
| main | feature request improve installer message description of feature enhancement i would like to see the installer message should include the command line for the command justification my tries to run the command usr local caskroom little snitch little snitch installer app black screen command usr local caskroom little snitch little not found usr local caskroom little snitch little snitch installer app bash usr local caskroom little snitch little snitch installer app is a directory open usr local caskroom little snitch little snitch installer app the files usr local caskroom little snitch little users muescha work github wwdc snitch and users muescha work github wwdc installer app do not exist open usr local caskroom little snitch little snitch installer app example use case to complete the installation of cask little snitch you must also run the installer at usr local caskroom little snitch little snitch installer app 🍺 little snitch was successfully installed to to complete the installation of cask little snitch you must also run the installer at usr local caskroom little snitch little snitch installer app with command open usr local caskroom little snitch little snitch installer app 🍺 little snitch was successfully installed | 1 |
5,727 | 30,279,676,196 | IssuesEvent | 2023-07-08 00:43:44 | deislabs/spiderlightning | https://api.github.com/repos/deislabs/spiderlightning | opened | The install.ps1 fails, 'Update-SessionEnvironment' is not recognized with/without Chocolately, able to install, not able to get it to output | 🐛 bug 🚧 maintainer issue | **Description of the bug**
When I first try an install of the CLI slight, it seams to 'Install' but fails to actually run?
>PS C:\Users\zer0> slight
PS C:\Users\zer0> slight -h
PS C:\Users\zer0> slight --help
the error message it provides is as follows:
> PowerShell 7.3.5
PS C:\Users\zer0> iex ((New-ObjectSystem.Net.WebClient).DownloadString('https://raw.githubusercontent.com/deislabs/spiderlightning/main/install.ps1
>>LATEST RELEASE: v0.5.1...
BINARY WILL BE STORED AT C:\slight.
DONLOADING FROM: https://github.com/deislabs/spiderlightning/releases/download/v0.5.1/slight-windows-x86_64.tar.gz...
EXTRACTED BINARY TAR.
Update-SessionEnvironment : The term 'Update-SessionEnvironment' is not recognized as the name of a cmdlet, function,
script file, or operable program. Check the spelling of the name, or if a path was included, verify that the path is
correct and try again.
At line:21 char:1
+ Update-SessionEnvironment
+ CategoryInfo : ObjectNotFound: (Update-SessionEnvironment:String) [], CommandNotFoundException
+ FullyQualifiedErrorId : CommandNotFoundException
>>INSTALLED BINARY.
>>CLEANED UP.
>PS C:\Users\zer0> ls -l /slight
>> Directory: C:\slight
Mode LastWriteTime Length Name
-a--- 6/2/2023 2:34 PM 38387712 slight.exe
>PS C:\Users\zer0> slight
PS C:\Users\zer0>
### No output here, why?? so I checked my current Environment variables to see if install.ps1 set them
### My Used System Environment:

> PS C:\Users\zer0> [Environment]::GetEnvironmentVariable('path', 'User')
>>C:\ProgramFiles\PowerShell\7;C:\Windows\system32;C:\Windows;C:\Windows\System32\Wbem;C:\Windows\System32\WindowsPowerShell\v1.0\;C:\Windows\System32\OpenSSH\;C:\Program Files (x86)\NVIDIA Corporation\PhysX\Common;C:\Program Files\dotnet\;C:\Program Files\Docker\Docker\resources\bin;C:\Program Files\Git\cmd;C:\Program Files (x86)\Gpg4win\..\GnuPG\bin;C:\Program Files\PuTTY\;C:\JupyterLab;C:\Program Files\NVIDIA Corporation\NVIDIA NvDLISR;C:\msys64\mingw64\bin;C:\slight
> PS C:\Users\zer0> [Environment]::GetEnvironmentVariable('path', 'process')
>>C:\Program Files\PowerShell\7;C:\Windows\system32;C:\Windows;C:\Windows\System32\Wbem;C:\Windows\System32\WindowsPowerShell\v1.0\;C:\Windows\System32\OpenSSH\;C:\Program Files (x86)\NVIDIA Corporation\PhysX\Common;C:\Program Files\dotnet\;C:\Program Files\Docker\Docker\resources\bin;C:\Program Files\Git\cmd;C:\Program Files (x86)\Gpg4win\..\GnuPG\bin;C:\Program Files\PuTTY\;C:\JupyterLab;C:\Program Files\NVIDIA Corporation\NVIDIA NvDLISR;C:\msys64\mingw64\bin;C:\Program Files\PowerShell\7\;C:\Users\Administrator\AppData\Local\Programs\Python\Python311\Scripts\;C:\Users\Administrator\AppData\Local\Programs\Python\Python311\;C:\Users\Administrator\AppData\Local\Microsoft\WindowsApps;C:\Users\Administrator\.dotnet\tools;C:\Program Files\Git\bin\;C:\Program Files\Git\cmd\;C:\Program Files (x86)\Nmap;C:\Users\Administrator\AppData\Local\Programs\Microsoft VS Code\bin;C:\slight
### YES!, we see here at the last line that : C:\slight is being added to path, and proof its IN the ACTUAL path:

but wait, why is it still not generating anything ??
>PS C:\Users\zer0> slight new -n spidey@v0.5.1 rust && cd spidey
PS C:\Users\zer0> ls
>>Directory: C:\Users\zer0
>>>Mode LastWriteTime Length Name
d---- 6/8/2023 7:53 PM .BigNox
d---- 7/7/2023 12:49 PM .docker
d---- 5/31/2023 7:39 AM .dotnet
d---- 6/1/2023 1:29 AM .ms-ad
d---- 6/17/2023 12:58 PM .VirtualBox
d---- 6/14/2023 7:03 PM .vscode
d-r-- 5/31/2023 10:51 PM Contacts
d-r-- 7/7/2023 12:54 PM Desktop
d-r-- 7/7/2023 3:33 PM Documents
d-r-- 7/6/2023 9:04 PM Downloads
d---- 7/6/2023 9:25 PM eclipse-workspace
d-r-- 5/31/2023 10:51 PM Favorites
d---- 6/19/2023 7:32 PM Fido
d-r-- 5/31/2023 10:51 PM Links
d-r-- 5/31/2023 10:51 PM Music
d-r-- 5/31/2023 10:51 PM Pictures
d---- 7/7/2023 12:56 PM rust-projects
d-r-- 5/31/2023 10:51 PM Saved Games
d-r-- 6/1/2023 4:34 PM Searches
d---- 7/7/2023 4:44 PM spiderlightning
d---- 6/14/2023 7:00 PM sunRay
d-r-- 5/31/2023 10:51 PM Videos
d---- 6/14/2023 7:57 PM VirtualBox VMs
d---- 6/5/2023 4:21 PM wsl2-pico
-a--- 6/8/2023 7:35 PM 45 nuuid.ini
-a--- 6/8/2023 7:35 PM 53 useruid.ini
## I found out that the cmdlt 'Update-SessionEnvironment' was just not available to call. looks to be a Chocolatey CLI based Windows Package manger module [here](https://github.com/chocolatey/choco/blob/d836138142e6b1e7a9b2e931faeb0c4d4fb4b6c2/src/chocolatey.resources/helpers/functions/Update-SessionEnvironment.ps1#L17), (do not confuse with the outdated depreciated version that most may have if using older OS versions, [here](https://github.com/chocolatey-archive/chocolatey/blob/master/src/helpers/functions/Update-SessionEnvironment.ps1)
however I seamed to not have any bin or initialization of it:
>PS C:\Users\zer0> choco
>>choco: The term 'choco' is not recognized as a name of a cmdlet, function, script file, or executable program.
Check the spelling of the name, or if a path was included, verify that the path is correct and try again.
SO i installed Choco from [here:](https://github.com/chocolatey/choco)
>PS C:\Users\zer0> choco
>>Chocolatey v2.1.0
Please run 'choco -?' or 'choco <command> -?' for help menu.
### Installed choco and and after re-starting script. no change,
>PS C:\Users\zer0> iex ((New-Object System.Net.WebClient).DownloadString('https://raw.githubusercontent.com/deislabs/spiderlightning/main/install.ps1'))
>> LATEST RELEASE: v0.5.1...
>> BINARY WILL BE STORED AT C:\slight.
>> DONLOADING FROM: https://github.com/deislabs/spiderlightning/releases/download/v0.5.1/slight-windows-x86_64.tar.gz...
>> EXTRACTED BINARY TAR.
+ Update-SessionEnvironment:
+ Line | 21 | Update-SessionEnvironment
+ | The term 'Update-SessionEnvironment' is not recognized as a name of a cmdlet, function, script file, or executable program. Check the spelling of the name, or if a path was included, verify that the path is correct and try again.
>>INSTALLED BINARY.
>> CLEANED UP.
**To Reproduce**
### I personally do not know (yet) why this is a Bug or why it is for me at least breaking, I can only explain its behavior in my environment stated above.
**Additional context**
### I tested in different environments to try to make it work, such as VS Code and alternate PowerShell versions, all except for Linux or UNIX, which I do know very well but already have that env in WSL reserved for something else, even though I very much know it would just work under that. Yet Windows seems to always give us some new worries...
Tested this inside VS (Visual Studio) Code's environment PS shell, and it returns the exact same behavior;
testing with the PS OG shell V1 (the original 'default' version packaged with any Windows install before upgrading) returns the same behavior.

## Testing with Admin rights yields same behavior:

> PS C:\Users\zer0> iex ((New-Object System.Net.WebClient).DownloadString('https://raw.githubusercontent.com/deislabs/spiderlightning/main/install.ps1'))
>>> LATEST RELEASE: v0.5.1...
>>> BINARY WILL BE STORED AT C:\slight.
>>> DONLOADING FROM: https://github.com/deislabs/spiderlightning/releases/download/v0.5.1/slight-windows-x86_64.tar.gz...
>>> EXTRACTED BINARY TAR.
Update-SessionEnvironment : The term 'Update-SessionEnvironment' is not recognized as the name of a cmdlet, function, script file, or operable program. Check the spelling of the name, or if a path was
included, verify that the path is correct and try again.
At line:21 char:1
+ Update-SessionEnvironment
+ CategoryInfo : ObjectNotFound: (Update-SessionEnvironment:String) [], CommandNotFoundException
+ FullyQualifiedErrorId : CommandNotFoundException
>>> INSTALLED BINARY.
>>> CLEANED UP.
>PS C:\Users\zer0> slight new -n spidey@v0.5.1 rust && cd spidey
PS C:\Users\zer0>
Am I still missing something? Do I have to explicitly add Update-SessionEnvironment to my environment path, or run them side by side?
Either way, it should not be scoped like so. | True | The install.ps1 fails, 'Update-SessionEnvironment' is not recognized with/without Chocolatey, able to install, not able to get it to output - **Description of the bug**
When I first try an install of the CLI slight, it seems to 'install' but fails to actually run?
>PS C:\Users\zer0> slight
PS C:\Users\zer0> slight -h
PS C:\Users\zer0> slight --help
the error message it provides is as follows:
> PowerShell 7.3.5
PS C:\Users\zer0> iex ((New-Object System.Net.WebClient).DownloadString('https://raw.githubusercontent.com/deislabs/spiderlightning/main/install.ps1'))
>>LATEST RELEASE: v0.5.1...
BINARY WILL BE STORED AT C:\slight.
DONLOADING FROM: https://github.com/deislabs/spiderlightning/releases/download/v0.5.1/slight-windows-x86_64.tar.gz...
EXTRACTED BINARY TAR.
Update-SessionEnvironment : The term 'Update-SessionEnvironment' is not recognized as the name of a cmdlet, function,
script file, or operable program. Check the spelling of the name, or if a path was included, verify that the path is
correct and try again.
At line:21 char:1
+ Update-SessionEnvironment
+ CategoryInfo : ObjectNotFound: (Update-SessionEnvironment:String) [], CommandNotFoundException
+ FullyQualifiedErrorId : CommandNotFoundException
>>INSTALLED BINARY.
>>CLEANED UP.
>PS C:\Users\zer0> ls -l /slight
>> Directory: C:\slight
Mode LastWriteTime Length Name
-a--- 6/2/2023 2:34 PM 38387712 slight.exe
>PS C:\Users\zer0> slight
PS C:\Users\zer0>
### No output here, why? So I checked my current environment variables to see if install.ps1 set them
### My Used System Environment:

> PS C:\Users\zer0> [Environment]::GetEnvironmentVariable('path', 'User')
>>C:\Program Files\PowerShell\7;C:\Windows\system32;C:\Windows;C:\Windows\System32\Wbem;C:\Windows\System32\WindowsPowerShell\v1.0\;C:\Windows\System32\OpenSSH\;C:\Program Files (x86)\NVIDIA Corporation\PhysX\Common;C:\Program Files\dotnet\;C:\Program Files\Docker\Docker\resources\bin;C:\Program Files\Git\cmd;C:\Program Files (x86)\Gpg4win\..\GnuPG\bin;C:\Program Files\PuTTY\;C:\JupyterLab;C:\Program Files\NVIDIA Corporation\NVIDIA NvDLISR;C:\msys64\mingw64\bin;C:\slight
> PS C:\Users\zer0> [Environment]::GetEnvironmentVariable('path', 'process')
>>C:\Program Files\PowerShell\7;C:\Windows\system32;C:\Windows;C:\Windows\System32\Wbem;C:\Windows\System32\WindowsPowerShell\v1.0\;C:\Windows\System32\OpenSSH\;C:\Program Files (x86)\NVIDIA Corporation\PhysX\Common;C:\Program Files\dotnet\;C:\Program Files\Docker\Docker\resources\bin;C:\Program Files\Git\cmd;C:\Program Files (x86)\Gpg4win\..\GnuPG\bin;C:\Program Files\PuTTY\;C:\JupyterLab;C:\Program Files\NVIDIA Corporation\NVIDIA NvDLISR;C:\msys64\mingw64\bin;C:\Program Files\PowerShell\7\;C:\Users\Administrator\AppData\Local\Programs\Python\Python311\Scripts\;C:\Users\Administrator\AppData\Local\Programs\Python\Python311\;C:\Users\Administrator\AppData\Local\Microsoft\WindowsApps;C:\Users\Administrator\.dotnet\tools;C:\Program Files\Git\bin\;C:\Program Files\Git\cmd\;C:\Program Files (x86)\Nmap;C:\Users\Administrator\AppData\Local\Programs\Microsoft VS Code\bin;C:\slight
### YES! We see here at the last line that C:\slight is being added to PATH, and proof it's IN the ACTUAL path:

But wait, why is it still not generating anything?
>PS C:\Users\zer0> slight new -n spidey@v0.5.1 rust && cd spidey
PS C:\Users\zer0> ls
>>Directory: C:\Users\zer0
>>>Mode LastWriteTime Length Name
d---- 6/8/2023 7:53 PM .BigNox
d---- 7/7/2023 12:49 PM .docker
d---- 5/31/2023 7:39 AM .dotnet
d---- 6/1/2023 1:29 AM .ms-ad
d---- 6/17/2023 12:58 PM .VirtualBox
d---- 6/14/2023 7:03 PM .vscode
d-r-- 5/31/2023 10:51 PM Contacts
d-r-- 7/7/2023 12:54 PM Desktop
d-r-- 7/7/2023 3:33 PM Documents
d-r-- 7/6/2023 9:04 PM Downloads
d---- 7/6/2023 9:25 PM eclipse-workspace
d-r-- 5/31/2023 10:51 PM Favorites
d---- 6/19/2023 7:32 PM Fido
d-r-- 5/31/2023 10:51 PM Links
d-r-- 5/31/2023 10:51 PM Music
d-r-- 5/31/2023 10:51 PM Pictures
d---- 7/7/2023 12:56 PM rust-projects
d-r-- 5/31/2023 10:51 PM Saved Games
d-r-- 6/1/2023 4:34 PM Searches
d---- 7/7/2023 4:44 PM spiderlightning
d---- 6/14/2023 7:00 PM sunRay
d-r-- 5/31/2023 10:51 PM Videos
d---- 6/14/2023 7:57 PM VirtualBox VMs
d---- 6/5/2023 4:21 PM wsl2-pico
-a--- 6/8/2023 7:35 PM 45 nuuid.ini
-a--- 6/8/2023 7:35 PM 53 useruid.ini
## I found out that the cmdlet 'Update-SessionEnvironment' was just not available to call. It looks to be a Chocolatey-CLI-based Windows package manager module [here](https://github.com/chocolatey/choco/blob/d836138142e6b1e7a9b2e931faeb0c4d4fb4b6c2/src/chocolatey.resources/helpers/functions/Update-SessionEnvironment.ps1#L17) (do not confuse it with the outdated, deprecated version that most may have if using older OS versions, [here](https://github.com/chocolatey-archive/chocolatey/blob/master/src/helpers/functions/Update-SessionEnvironment.ps1))
however I seemed not to have any binary or initialization of it:
>PS C:\Users\zer0> choco
>>choco: The term 'choco' is not recognized as a name of a cmdlet, function, script file, or executable program.
Check the spelling of the name, or if a path was included, verify that the path is correct and try again.
So I installed Choco from [here](https://github.com/chocolatey/choco):
>PS C:\Users\zer0> choco
>>Chocolatey v2.1.0
Please run 'choco -?' or 'choco <command> -?' for help menu.
### Installed choco, and after re-starting the script, no change:
>PS C:\Users\zer0> iex ((New-Object System.Net.WebClient).DownloadString('https://raw.githubusercontent.com/deislabs/spiderlightning/main/install.ps1'))
>> LATEST RELEASE: v0.5.1...
>> BINARY WILL BE STORED AT C:\slight.
>> DONLOADING FROM: https://github.com/deislabs/spiderlightning/releases/download/v0.5.1/slight-windows-x86_64.tar.gz...
>> EXTRACTED BINARY TAR.
+ Update-SessionEnvironment:
+ Line | 21 | Update-SessionEnvironment
+ | The term 'Update-SessionEnvironment' is not recognized as a name of a cmdlet, function, script file, or executable program. Check the spelling of the name, or if a path was included, verify that the path is correct and try again.
>>INSTALLED BINARY.
>> CLEANED UP.
**To Reproduce**
### I personally do not know (yet) why this is a bug, or why it is breaking for me at least; I can only explain its behavior in my environment as stated above.
**Additional context**
### I tested in different environments to try to make it work, such as VS Code and alternate PowerShell versions, all except for Linux or UNIX, which I do know very well but already have that env in WSL reserved for something else, even though I very much know it would just work under that. Yet Windows seems to always give us some new worries...
Tested this inside VS (Visual Studio) Code's environment PS shell, and it returns the exact same behavior;
testing with the PS OG shell V1 (the original 'default' version packaged with any Windows install before upgrading) returns the same behavior.

## Testing with Admin rights yields same behavior:

> PS C:\Users\zer0> iex ((New-Object System.Net.WebClient).DownloadString('https://raw.githubusercontent.com/deislabs/spiderlightning/main/install.ps1'))
>>> LATEST RELEASE: v0.5.1...
>>> BINARY WILL BE STORED AT C:\slight.
>>> DONLOADING FROM: https://github.com/deislabs/spiderlightning/releases/download/v0.5.1/slight-windows-x86_64.tar.gz...
>>> EXTRACTED BINARY TAR.
Update-SessionEnvironment : The term 'Update-SessionEnvironment' is not recognized as the name of a cmdlet, function, script file, or operable program. Check the spelling of the name, or if a path was
included, verify that the path is correct and try again.
At line:21 char:1
+ Update-SessionEnvironment
+ CategoryInfo : ObjectNotFound: (Update-SessionEnvironment:String) [], CommandNotFoundException
+ FullyQualifiedErrorId : CommandNotFoundException
>>> INSTALLED BINARY.
>>> CLEANED UP.
>PS C:\Users\zer0> slight new -n spidey@v0.5.1 rust && cd spidey
PS C:\Users\zer0>
Am I still missing something? Do I have to explicitly add Update-SessionEnvironment to my environment path, or run them side by side?
Either way, it should not be scoped like so. | main | the install fails update sessionenvironment is not recognized with without chocolately able to install not able to get it to output description of the bug when i first try an install of the cli slight it seams to install but fails to actually run ps c users slight ps c users slight h ps c users slight help the error message it provides is as follows powershell ps c users iex new objectsystem net webclient downloadstring latest release binary will be stored at c slight donloading from extracted binary tar update sessionenvironment the term update sessionenvironment is not recognized as the name of a cmdlet function script file or operable program check the spelling of the name or if a path was included verify that the path is correct and try again at line char update sessionenvironment categoryinfo objectnotfound update sessionenvironment string commandnotfoundexception fullyqualifiederrorid commandnotfoundexception installed binary cleaned up ps c users ls l slight directory c slight mode lastwritetime length name a pm slight exe ps c users slight ps c users no output here why so i checked my current environment variables to see if install set them my used system environment ps c users getenvironmentvariable path user c programfiles powershell c windows c windows c windows wbem c windows windowspowershell c windows openssh c program files nvidia corporation physx common c program files dotnet c program files docker docker resources bin c program files git cmd c program files gnupg bin c program files putty c jupyterlab c program files nvidia corporation nvidia nvdlisr c bin c slight ps c users getenvironmentvariable path process c program files powershell c windows c windows c windows wbem c windows windowspowershell c windows openssh c program files nvidia corporation physx common c program files dotnet c program files docker docker resources bin c program files git cmd c program files gnupg bin c program files 
putty c jupyterlab c program files nvidia corporation nvidia nvdlisr c bin c program files powershell c users administrator appdata local programs python scripts c users administrator appdata local programs python c users administrator appdata local microsoft windowsapps c users administrator dotnet tools c program files git bin c program files git cmd c program files nmap c users administrator appdata local programs microsoft vs code bin c slight yes we see here at the last line that c slight is being added to path and proof its in the actual path but wait why is it still not generating anything ps c users slight new n spidey rust cd spidey ps c users ls directory c users mode lastwritetime length name d pm bignox d pm docker d am dotnet d am ms ad d pm virtualbox d pm vscode d r pm contacts d r pm desktop d r pm documents d r pm downloads d pm eclipse workspace d r pm favorites d pm fido d r pm links d r pm music d r pm pictures d pm rust projects d r pm saved games d r pm searches d pm spiderlightning d pm sunray d r pm videos d pm virtualbox vms d pm pico a pm nuuid ini a pm useruid ini i found out that the cmdlt update sessionenvironment was just not available to call looks to be a chocolatey cli based windows package manger module do not confuse with the outdated depreciated version that most may have if using older os versions however i seamed to not have any bin or initialization of it ps c users choco choco the term choco is not recognized as a name of a cmdlet function script file or executable program check the spelling of the name or if a path was included verify that the path is correct and try again so i installed choco from ps c users choco chocolatey please run choco or choco for help menu installed choco and and after re starting script no change ps c users iex new object system net webclient downloadstring latest release binary will be stored at c slight donloading from extracted binary tar update sessionenvironment line update sessionenvironment 
the term update sessionenvironment is not recognized as a name of a cmdlet function script file or executable program check the spelling of the name or if a path was included verify that the path is correct and try again installed binary cleaned up to reproduce i personally do not know yet why this is a bug or why it is for me at least breaking i can only explain its behavior in my environment stated above additional context i tested in different environments to try and make it work such as vs code and alternate powershell versions all except for linux or unix which i do know very well but already have that env in wsl reserved for something else even though i very much know it would just work under that yet windows seams to always give us some new worries tested this inside the vs visual studio code s environment ps shell and it returns the exact same behavior testing with ps og shell the original default version packaged with any win install before upgrading returns same behavior testing with admin rights yields same behavior ps c users iex new object system net webclient downloadstring latest release binary will be stored at c slight donloading from extracted binary tar update sessionenvironment the term update sessionenvironment is not recognized as the name of a cmdlet function script file or operable program check the spelling of the name or if a path was included verify that the path is correct and try again at line char update sessionenvironment categoryinfo objectnotfound update sessionenvironment string commandnotfoundexception fullyqualifiederrorid commandnotfoundexception installed binary cleaned up ps c users slight new n spidey rust cd spidey ps c users am i still missing something do i have to explicitly add update sessionenvironment to my environment path or run them sideby side either way it should not be scoped like so | 1 |
526,997 | 15,306,394,843 | IssuesEvent | 2021-02-24 19:25:10 | GoogleCloudPlatform/python-docs-samples | https://api.github.com/repos/GoogleCloudPlatform/python-docs-samples | closed | bq://product-stockout.product_stockout.stockout project not found error | :rotating_light: api: automl priority: p2 samples type: bug | ## In which file did you encounter the issue?
https://github.com/GoogleCloudPlatform/python-docs-samples/blob/master/tables/automl/notebooks/retail_product_stockout_prediction/retail_product_stockout_prediction.ipynb
### Did you change the file? If so, how?
No changes
## Describe the issue
It appears that the project for the BQ dataset is missing - bq://product-stockout.product_stockout.stockout
The following error came up:
```
_InactiveRpcError: <_InactiveRpcError of RPC that terminated with:
status = StatusCode.NOT_FOUND
details = "The BigQuery dataset bq://product-stockout.product_stockout doesn't exist."
```
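As a hedged aside (not part of the original report): the `bq://` URI in the error encodes project, dataset, and table, and a tiny parser makes the missing piece explicit. `parse_bq_uri` is an illustrative helper, not a Google library function:

```python
def parse_bq_uri(uri: str):
    """Split a bq://project.dataset[.table] URI into its parts (illustrative helper)."""
    if not uri.startswith("bq://"):
        raise ValueError("expected a bq:// URI")
    parts = uri[len("bq://"):].split(".")
    project, dataset = parts[0], parts[1]
    table = parts[2] if len(parts) > 2 else None
    return project, dataset, table

print(parse_bq_uri("bq://product-stockout.product_stockout.stockout"))
# → ('product-stockout', 'product_stockout', 'stockout')
```

So the NOT_FOUND above means the dataset `product_stockout` is absent from the project `product-stockout`, which the notebook reader has no access to create.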
| 1.0 | bq://product-stockout.product_stockout.stockout project not found error - ## In which file did you encounter the issue?
https://github.com/GoogleCloudPlatform/python-docs-samples/blob/master/tables/automl/notebooks/retail_product_stockout_prediction/retail_product_stockout_prediction.ipynb
### Did you change the file? If so, how?
No changes
## Describe the issue
It appears that the project for the BQ dataset is missing - bq://product-stockout.product_stockout.stockout
The following error came up:
```
_InactiveRpcError: <_InactiveRpcError of RPC that terminated with:
status = StatusCode.NOT_FOUND
details = "The BigQuery dataset bq://product-stockout.product_stockout doesn't exist."
```
| non_main | bq product stockout product stockout stockout project not found error in which file did you encounter the issue did you change the file if so how no changes describe the issue it appears that the project for the bq dataset is missing bq product stockout product stockout stockout the following error came up inactiverpcerror inactiverpcerror of rpc that terminated with status statuscode not found details the bigquery dataset bq product stockout product stockout doesn t exist | 0 |
5,076 | 25,968,154,677 | IssuesEvent | 2022-12-19 09:01:56 | centerofci/mathesar | https://api.github.com/repos/centerofci/mathesar | closed | Fix regression - Allow setting default value | type: bug work: frontend status: ready restricted: maintainers | https://github.com/centerofci/mathesar/pull/1628 removes the Default value option out of the type selector.
We need to add a separate section in the Table Inspector columns tab to support editing the default value, as per the [spec](https://github.com/centerofci/mathesar-wiki/blob/master/design/specs/data-type-options.md).
We need to add a separate section in the Table Inspector columns tab, to support editing default value as per the [spec](https://github.com/centerofci/mathesar-wiki/blob/master/design/specs/data-type-options.md). | main | fix regression allow setting default value removes the default value option out of the type selector we need to add a separate section in the table inspector columns tab to support editing default value as per the | 1 |
27,462 | 7,966,816,196 | IssuesEvent | 2018-07-15 05:03:49 | angular/angular-cli | https://api.github.com/repos/angular/angular-cli | closed | Halting Problem at 95%. Shows emitting as the action doing. | comp: devkit/build-angular | I was hit by a problem like the halting problem of a Turing machine while using the Angular CLI, as follows.
My script : node --max_old_space_size=4096 ./node_modules/@angular/cli/bin/ng build -e=prod --prod --sourcemap --vendor-chunk=true --build-optimizer
### Versions
```
Angular CLI: 1.7.3
Node: 9.8.0
OS: win32 x64
Angular: 5.2.8
... animations, common, compiler, compiler-cli, core, forms
... http, language-service, platform-browser
... platform-browser-dynamic, platform-server, router
@angular/cli: 1.7.3
@angular-devkit/build-optimizer: 0.3.2
@angular-devkit/core: 0.3.2
@angular-devkit/schematics: 0.3.2
@ngtools/json-schema: 1.2.0
@ngtools/webpack: 1.10.2
@schematics/angular: 0.3.2
@schematics/package-update: 0.3.2
typescript: 2.4.0
webpack-bundle-analyzer: 2.11.1
```
### Repro steps
Use this script : node --max_old_space_size=4096 ./node_modules/@angular/cli/bin/ng build -e=prod --prod --sourcemap --vendor-chunk=true --build-optimizer --stats-json
* Apply the above script to any angular project.
* If we replace --sourcemap with --no-sourcemap, the problem does not occur.
### Observed behavior
```
The program is running indefinitely.
```
### Desired behavior
The program should halt and report if there is some error in the script.
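One hedged way to get the desired halting behavior (my sketch, not from the thread): run the build under a watchdog that kills it after a deadline instead of letting it spin forever. The command below is a stand-in; substitute the actual ng build invocation:

```python
import subprocess
import sys

def run_with_timeout(cmd, seconds):
    """Run cmd (a list of args), killing it after `seconds`; returns (finished, stdout)."""
    try:
        result = subprocess.run(cmd, capture_output=True, text=True, timeout=seconds)
        return True, result.stdout
    except subprocess.TimeoutExpired:
        # The process was killed; the build hung rather than erroring out.
        return False, "build did not finish before the deadline"

finished, out = run_with_timeout([sys.executable, "-c", "print('building')"], 30)
print(finished)  # → True
```

This does not fix the underlying hang, but it turns an indefinite run into a detectable failure.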
| 1.0 | Halting Problem at 95%. Shows emitting as the action doing. - I was hit by a problem like the halting problem of a Turing machine while using the Angular CLI, as follows.
My script : node --max_old_space_size=4096 ./node_modules/@angular/cli/bin/ng build -e=prod --prod --sourcemap --vendor-chunk=true --build-optimizer
### Versions
```
Angular CLI: 1.7.3
Node: 9.8.0
OS: win32 x64
Angular: 5.2.8
... animations, common, compiler, compiler-cli, core, forms
... http, language-service, platform-browser
... platform-browser-dynamic, platform-server, router
@angular/cli: 1.7.3
@angular-devkit/build-optimizer: 0.3.2
@angular-devkit/core: 0.3.2
@angular-devkit/schematics: 0.3.2
@ngtools/json-schema: 1.2.0
@ngtools/webpack: 1.10.2
@schematics/angular: 0.3.2
@schematics/package-update: 0.3.2
typescript: 2.4.0
webpack-bundle-analyzer: 2.11.1
```
### Repro steps
Use this script : node --max_old_space_size=4096 ./node_modules/@angular/cli/bin/ng build -e=prod --prod --sourcemap --vendor-chunk=true --build-optimizer --stats-json
* Apply the above script to any angular project.
* If we replace --sourcemap with --no-sourcemap, the problem does not occur.
### Observed behavior
```
The program is running indefinitely.
```
### Desired behavior
The program should halt and report if there is some error in the script.
| non_main | halting problem at shows emitting as the action doing i was hit by a problem like halting problem of truing machine while using the angular cli as follows my script node max old space size node modules angular cli bin ng build e prod prod sourcemap vendor chunk true build optimizer versions angular cli node os angular animations common compiler compiler cli core forms http language service platform browser platform browser dynamic platform server router angular cli angular devkit build optimizer angular devkit core angular devkit schematics ngtools json schema ngtools webpack schematics angular schematics package update typescript webpack bundle analyzer repro steps use this script node max old space size node modules angular cli bin ng build e prod prod sourcemap vendor chunk true build optimizer stats json apply the above script to any angular project if we replace scourcemap with no souecemap the problem will not comes observed behavior the program is running indefinitely desired behavior the program should halt to say if there is some error in the script | 0 |
75,627 | 25,958,546,554 | IssuesEvent | 2022-12-18 15:11:32 | DependencyTrack/dependency-track | https://api.github.com/repos/DependencyTrack/dependency-track | opened | Component count not reflecting correctly | defect in triage | ### Current Behavior
Component count changes upon subsequent analyses even though the generated BOM has exactly the same number of components.
**Screenshot of first analysis.** 1415 components & 5 critical findings.

**Screenshot of 2nd analysis.** Component count is now reduced to 1202 & only 3 critical findings are reported.

**Logs of first analysis**
2022-12-18 14:41:28,792 INFO [ProjectResource] Project VacationDummyTest created by Rahul.Mahulkar@tietoevry.com
2022-12-18 14:44:10,173 INFO [BomUploadProcessingTask] Processing CycloneDX BOM uploaded to project: d2df083b-e46e-496a-b29c-0962e113dc67
2022-12-18 14:44:13,051 WARN [NugetMetaAnalyzer] An error occurred while parsing upload time for a NuGet component - Repo returned: 1900-01-01T00:00:00+00:00
2022-12-18 14:44:13,051 WARN [NugetMetaAnalyzer] An error occurred while parsing upload time for a NuGet component - Repo returned: 1900-01-01T00:00:00+00:00
2022-12-18 14:44:13,087 WARN [NugetMetaAnalyzer] An error occurred while parsing upload time for a NuGet component - Repo returned: 1900-01-01T00:00:00+00:00
2022-12-18 14:44:13,087 WARN [NugetMetaAnalyzer] An error occurred while parsing upload time for a NuGet component - Repo returned: 1900-01-01T00:00:00+00:00
2022-12-18 14:44:13,417 WARN [NugetMetaAnalyzer] An error occurred while parsing upload time for a NuGet component - Repo returned: 1900-01-01T00:00:00+00:00
2022-12-18 14:44:13,417 WARN [NugetMetaAnalyzer] An error occurred while parsing upload time for a NuGet component - Repo returned: 1900-01-01T00:00:00+00:00
2022-12-18 14:44:14,633 ERROR [NpmMetaAnalyzer] Request failure
kong.unirest.UnirestException: java.net.SocketException: Connection reset
at kong.unirest.DefaultInterceptor.onFail(DefaultInterceptor.java:43)
at kong.unirest.CompoundInterceptor.lambda$onFail$2(CompoundInterceptor.java:54)
at java.base/java.util.stream.ReferencePipeline$3$1.accept(Unknown Source)
at java.base/java.util.Collections$2.tryAdvance(Unknown Source)
at java.base/java.util.stream.ReferencePipeline.forEachWithCancel(Unknown Source)
at java.base/java.util.stream.AbstractPipeline.copyIntoWithCancel(Unknown Source)
at java.base/java.util.stream.AbstractPipeline.copyInto(Unknown Source)
at java.base/java.util.stream.AbstractPipeline.wrapAndCopyInto(Unknown Source)
at java.base/java.util.stream.FindOps$FindOp.evaluateSequential(Unknown Source)
at java.base/java.util.stream.AbstractPipeline.evaluate(Unknown Source)
at java.base/java.util.stream.ReferencePipeline.findFirst(Unknown Source)
at kong.unirest.CompoundInterceptor.onFail(CompoundInterceptor.java:56)
at kong.unirest.apache.ApacheClient.request(ApacheClient.java:138)
at kong.unirest.Client.request(Client.java:57)
at kong.unirest.BaseRequest.request(BaseRequest.java:359)
at kong.unirest.BaseRequest.asJson(BaseRequest.java:244)
at org.dependencytrack.tasks.repositories.NpmMetaAnalyzer.analyze(NpmMetaAnalyzer.java:85)
at org.dependencytrack.tasks.repositories.RepositoryMetaAnalyzerTask.analyze(RepositoryMetaAnalyzerTask.java:95)
at org.dependencytrack.tasks.repositories.RepositoryMetaAnalyzerTask.inform(RepositoryMetaAnalyzerTask.java:51)
at alpine.event.framework.BaseEventService.lambda$publish$0(BaseEventService.java:101)
at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(Unknown Source)
at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(Unknown Source)
at java.base/java.lang.Thread.run(Unknown Source)
Caused by: java.net.SocketException: Connection reset
at java.base/sun.nio.ch.NioSocketImpl.implRead(Unknown Source)
at java.base/sun.nio.ch.NioSocketImpl.read(Unknown Source)
at java.base/sun.nio.ch.NioSocketImpl$1.read(Unknown Source)
at java.base/java.net.Socket$SocketInputStream.read(Unknown Source)
at java.base/sun.security.ssl.SSLSocketInputRecord.read(Unknown Source)
at java.base/sun.security.ssl.SSLSocketInputRecord.readHeader(Unknown Source)
at java.base/sun.security.ssl.SSLSocketInputRecord.bytesInCompletePacket(Unknown Source)
at java.base/sun.security.ssl.SSLSocketImpl.readApplicationRecord(Unknown Source)
at java.base/sun.security.ssl.SSLSocketImpl$AppInputStream.read(Unknown Source)
at org.apache.http.impl.io.SessionInputBufferImpl.streamRead(SessionInputBufferImpl.java:137)
at org.apache.http.impl.io.SessionInputBufferImpl.fillBuffer(SessionInputBufferImpl.java:153)
at org.apache.http.impl.io.SessionInputBufferImpl.readLine(SessionInputBufferImpl.java:280)
at org.apache.http.impl.conn.DefaultHttpResponseParser.parseHead(DefaultHttpResponseParser.java:138)
at org.apache.http.impl.conn.DefaultHttpResponseParser.parseHead(DefaultHttpResponseParser.java:56)
at org.apache.http.impl.io.AbstractMessageParser.parse(AbstractMessageParser.java:259)
at org.apache.http.impl.DefaultBHttpClientConnection.receiveResponseHeader(DefaultBHttpClientConnection.java:163)
at org.apache.http.impl.conn.CPoolProxy.receiveResponseHeader(CPoolProxy.java:157)
at org.apache.http.protocol.HttpRequestExecutor.doReceiveResponse(HttpRequestExecutor.java:273)
at org.apache.http.protocol.HttpRequestExecutor.execute(HttpRequestExecutor.java:125)
at org.apache.http.impl.execchain.MainClientExec.execute(MainClientExec.java:272)
at org.apache.http.impl.execchain.ProtocolExec.execute(ProtocolExec.java:186)
at org.apache.http.impl.execchain.RetryExec.execute(RetryExec.java:89)
at org.apache.http.impl.execchain.RedirectExec.execute(RedirectExec.java:110)
at org.apache.http.impl.client.InternalHttpClient.doExecute(InternalHttpClient.java:185)
at org.apache.http.impl.client.CloseableHttpClient.execute(CloseableHttpClient.java:118)
at org.apache.http.impl.client.CloseableHttpClient.execute(CloseableHttpClient.java:56)
at kong.unirest.apache.ApacheClient.request(ApacheClient.java:129)
... 10 common frames omitted
2022-12-18 14:44:14,640 ERROR [NpmMetaAnalyzer] Request failure
kong.unirest.UnirestException: java.net.SocketException: Connection reset
at kong.unirest.DefaultInterceptor.onFail(DefaultInterceptor.java:43)
at kong.unirest.CompoundInterceptor.lambda$onFail$2(CompoundInterceptor.java:54)
at java.base/java.util.stream.ReferencePipeline$3$1.accept(Unknown Source)
at java.base/java.util.Collections$2.tryAdvance(Unknown Source)
at java.base/java.util.stream.ReferencePipeline.forEachWithCancel(Unknown Source)
at java.base/java.util.stream.AbstractPipeline.copyIntoWithCancel(Unknown Source)
at java.base/java.util.stream.AbstractPipeline.copyInto(Unknown Source)
at java.base/java.util.stream.AbstractPipeline.wrapAndCopyInto(Unknown Source)
at java.base/java.util.stream.FindOps$FindOp.evaluateSequential(Unknown Source)
at java.base/java.util.stream.AbstractPipeline.evaluate(Unknown Source)
at java.base/java.util.stream.ReferencePipeline.findFirst(Unknown Source)
at kong.unirest.CompoundInterceptor.onFail(CompoundInterceptor.java:56)
at kong.unirest.apache.ApacheClient.request(ApacheClient.java:138)
at kong.unirest.Client.request(Client.java:57)
at kong.unirest.BaseRequest.request(BaseRequest.java:359)
at kong.unirest.BaseRequest.asJson(BaseRequest.java:244)
at org.dependencytrack.tasks.repositories.NpmMetaAnalyzer.analyze(NpmMetaAnalyzer.java:85)
at org.dependencytrack.tasks.repositories.RepositoryMetaAnalyzerTask.analyze(RepositoryMetaAnalyzerTask.java:95)
at org.dependencytrack.tasks.repositories.RepositoryMetaAnalyzerTask.inform(RepositoryMetaAnalyzerTask.java:51)
at alpine.event.framework.BaseEventService.lambda$publish$0(BaseEventService.java:101)
at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(Unknown Source)
at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(Unknown Source)
at java.base/java.lang.Thread.run(Unknown Source)
Caused by: java.net.SocketException: Connection reset
at java.base/sun.nio.ch.NioSocketImpl.implRead(Unknown Source)
at java.base/sun.nio.ch.NioSocketImpl.read(Unknown Source)
at java.base/sun.nio.ch.NioSocketImpl$1.read(Unknown Source)
at java.base/java.net.Socket$SocketInputStream.read(Unknown Source)
at java.base/sun.security.ssl.SSLSocketInputRecord.read(Unknown Source)
at java.base/sun.security.ssl.SSLSocketInputRecord.readHeader(Unknown Source)
at java.base/sun.security.ssl.SSLSocketInputRecord.bytesInCompletePacket(Unknown Source)
at java.base/sun.security.ssl.SSLSocketImpl.readApplicationRecord(Unknown Source)
at java.base/sun.security.ssl.SSLSocketImpl$AppInputStream.read(Unknown Source)
at org.apache.http.impl.io.SessionInputBufferImpl.streamRead(SessionInputBufferImpl.java:137)
at org.apache.http.impl.io.SessionInputBufferImpl.fillBuffer(SessionInputBufferImpl.java:153)
at org.apache.http.impl.io.SessionInputBufferImpl.readLine(SessionInputBufferImpl.java:280)
at org.apache.http.impl.conn.DefaultHttpResponseParser.parseHead(DefaultHttpResponseParser.java:138)
at org.apache.http.impl.conn.DefaultHttpResponseParser.parseHead(DefaultHttpResponseParser.java:56)
at org.apache.http.impl.io.AbstractMessageParser.parse(AbstractMessageParser.java:259)
at org.apache.http.impl.DefaultBHttpClientConnection.receiveResponseHeader(DefaultBHttpClientConnection.java:163)
at org.apache.http.impl.conn.CPoolProxy.receiveResponseHeader(CPoolProxy.java:157)
at org.apache.http.protocol.HttpRequestExecutor.doReceiveResponse(HttpRequestExecutor.java:273)
at org.apache.http.protocol.HttpRequestExecutor.execute(HttpRequestExecutor.java:125)
at org.apache.http.impl.execchain.MainClientExec.execute(MainClientExec.java:272)
at org.apache.http.impl.execchain.ProtocolExec.execute(ProtocolExec.java:186)
at org.apache.http.impl.execchain.RetryExec.execute(RetryExec.java:89)
at org.apache.http.impl.execchain.RedirectExec.execute(RedirectExec.java:110)
at org.apache.http.impl.client.InternalHttpClient.doExecute(InternalHttpClient.java:185)
at org.apache.http.impl.client.CloseableHttpClient.execute(CloseableHttpClient.java:118)
at org.apache.http.impl.client.CloseableHttpClient.execute(CloseableHttpClient.java:56)
at kong.unirest.apache.ApacheClient.request(ApacheClient.java:129)
... 10 common frames omitted
2022-12-18 14:44:14,641 ERROR [NpmMetaAnalyzer] Request failure
kong.unirest.UnirestException: java.net.SocketException: Connection reset
at kong.unirest.DefaultInterceptor.onFail(DefaultInterceptor.java:43)
at kong.unirest.CompoundInterceptor.lambda$onFail$2(CompoundInterceptor.java:54)
at java.base/java.util.stream.ReferencePipeline$3$1.accept(Unknown Source)
at java.base/java.util.Collections$2.tryAdvance(Unknown Source)
at java.base/java.util.stream.ReferencePipeline.forEachWithCancel(Unknown Source)
at java.base/java.util.stream.AbstractPipeline.copyIntoWithCancel(Unknown Source)
at java.base/java.util.stream.AbstractPipeline.copyInto(Unknown Source)
at java.base/java.util.stream.AbstractPipeline.wrapAndCopyInto(Unknown Source)
at java.base/java.util.stream.FindOps$FindOp.evaluateSequential(Unknown Source)
at java.base/java.util.stream.AbstractPipeline.evaluate(Unknown Source)
at java.base/java.util.stream.ReferencePipeline.findFirst(Unknown Source)
at kong.unirest.CompoundInterceptor.onFail(CompoundInterceptor.java:56)
at kong.unirest.apache.ApacheClient.request(ApacheClient.java:138)
at kong.unirest.Client.request(Client.java:57)
at kong.unirest.BaseRequest.request(BaseRequest.java:359)
at kong.unirest.BaseRequest.asJson(BaseRequest.java:244)
at org.dependencytrack.tasks.repositories.NpmMetaAnalyzer.analyze(NpmMetaAnalyzer.java:85)
at org.dependencytrack.tasks.repositories.RepositoryMetaAnalyzerTask.analyze(RepositoryMetaAnalyzerTask.java:95)
at org.dependencytrack.tasks.repositories.RepositoryMetaAnalyzerTask.inform(RepositoryMetaAnalyzerTask.java:51)
at alpine.event.framework.BaseEventService.lambda$publish$0(BaseEventService.java:101)
at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(Unknown Source)
at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(Unknown Source)
at java.base/java.lang.Thread.run(Unknown Source)
Caused by: java.net.SocketException: Connection reset
at java.base/sun.nio.ch.NioSocketImpl.implRead(Unknown Source)
at java.base/sun.nio.ch.NioSocketImpl.read(Unknown Source)
at java.base/sun.nio.ch.NioSocketImpl$1.read(Unknown Source)
at java.base/java.net.Socket$SocketInputStream.read(Unknown Source)
at java.base/sun.security.ssl.SSLSocketInputRecord.read(Unknown Source)
at java.base/sun.security.ssl.SSLSocketInputRecord.readHeader(Unknown Source)
at java.base/sun.security.ssl.SSLSocketInputRecord.bytesInCompletePacket(Unknown Source)
at java.base/sun.security.ssl.SSLSocketImpl.readApplicationRecord(Unknown Source)
at java.base/sun.security.ssl.SSLSocketImpl$AppInputStream.read(Unknown Source)
at org.apache.http.impl.io.SessionInputBufferImpl.streamRead(SessionInputBufferImpl.java:137)
at org.apache.http.impl.io.SessionInputBufferImpl.fillBuffer(SessionInputBufferImpl.java:153)
at org.apache.http.impl.io.SessionInputBufferImpl.readLine(SessionInputBufferImpl.java:280)
at org.apache.http.impl.conn.DefaultHttpResponseParser.parseHead(DefaultHttpResponseParser.java:138)
at org.apache.http.impl.conn.DefaultHttpResponseParser.parseHead(DefaultHttpResponseParser.java:56)
at org.apache.http.impl.io.AbstractMessageParser.parse(AbstractMessageParser.java:259)
at org.apache.http.impl.DefaultBHttpClientConnection.receiveResponseHeader(DefaultBHttpClientConnection.java:163)
at org.apache.http.impl.conn.CPoolProxy.receiveResponseHeader(CPoolProxy.java:157)
at org.apache.http.protocol.HttpRequestExecutor.doReceiveResponse(HttpRequestExecutor.java:273)
at org.apache.http.protocol.HttpRequestExecutor.execute(HttpRequestExecutor.java:125)
at org.apache.http.impl.execchain.MainClientExec.execute(MainClientExec.java:272)
at org.apache.http.impl.execchain.ProtocolExec.execute(ProtocolExec.java:186)
at org.apache.http.impl.execchain.RetryExec.execute(RetryExec.java:89)
at org.apache.http.impl.execchain.RedirectExec.execute(RedirectExec.java:110)
at org.apache.http.impl.client.InternalHttpClient.doExecute(InternalHttpClient.java:185)
at org.apache.http.impl.client.CloseableHttpClient.execute(CloseableHttpClient.java:118)
at org.apache.http.impl.client.CloseableHttpClient.execute(CloseableHttpClient.java:56)
at kong.unirest.apache.ApacheClient.request(ApacheClient.java:129)
... 10 common frames omitted
2022-12-18 14:44:16,351 WARN [NugetMetaAnalyzer] An error occurred while parsing upload time for a NuGet component - Repo returned: 1900-01-01T00:00:00+00:00
2022-12-18 14:44:16,351 WARN [NugetMetaAnalyzer] An error occurred while parsing upload time for a NuGet component - Repo returned: 1900-01-01T00:00:00+00:00
2022-12-18 14:44:19,233 INFO [BomUploadProcessingTask] Identified 1415 new components
2022-12-18 14:44:19,233 INFO [BomUploadProcessingTask] Processing CycloneDX dependency graph for project: d2df083b-e46e-496a-b29c-0962e113dc67
2022-12-18 14:44:22,684 INFO [BomUploadProcessingTask] Processed 1415 components and 0 services uploaded to project d2df083b-e46e-496a-b29c-0962e113dc67
2022-12-18 14:44:34,881 INFO [InternalAnalysisTask] Starting internal analysis task
2022-12-18 14:44:36,604 INFO [InternalAnalysisTask] Internal analysis complete
2022-12-18 14:44:36,606 INFO [OssIndexAnalysisTask] Starting Sonatype OSS Index analysis task
2022-12-18 14:44:37,812 INFO [OssIndexAnalysisTask] Sonatype OSS Index analysis complete
2022-12-18 14:44:37,813 INFO [PolicyEngine] Evaluating **1415** component(s) against applicable policies
2022-12-18 14:44:47,380 INFO [PolicyEngine] Policy analysis complete
2022-12-18 14:44:47,380 INFO [ProjectMetricsUpdateTask] Executing metrics update for project d2df083b-e46e-496a-b29c-0962e113dc67
2022-12-18 14:44:56,673 INFO [ProjectMetricsUpdateTask] Completed metrics update for project d2df083b-e46e-496a-b29c-0962e113dc67 in 00:09:293
**Logs of 2nd analysis**
2022-12-18 14:52:22,951 INFO [BomUploadProcessingTask] Processing CycloneDX BOM uploaded to project: d2df083b-e46e-496a-b29c-0962e113dc67
2022-12-18 14:52:28,998 WARN [NugetMetaAnalyzer] An error occurred while parsing upload time for a NuGet component - Repo returned: 1900-01-01T00:00:00+00:00
2022-12-18 14:52:28,998 WARN [NugetMetaAnalyzer] An error occurred while parsing upload time for a NuGet component - Repo returned: 1900-01-01T00:00:00+00:00
2022-12-18 14:52:29,006 WARN [NugetMetaAnalyzer] An error occurred while parsing upload time for a NuGet component - Repo returned: 1900-01-01T00:00:00+00:00
2022-12-18 14:52:29,006 WARN [NugetMetaAnalyzer] An error occurred while parsing upload time for a NuGet component - Repo returned: 1900-01-01T00:00:00+00:00
2022-12-18 14:52:29,313 WARN [NugetMetaAnalyzer] An error occurred while parsing upload time for a NuGet component - Repo returned: 1900-01-01T00:00:00+00:00
2022-12-18 14:52:29,313 WARN [NugetMetaAnalyzer] An error occurred while parsing upload time for a NuGet component - Repo returned: 1900-01-01T00:00:00+00:00
2022-12-18 14:52:30,386 ERROR [NpmMetaAnalyzer] Request failure
kong.unirest.UnirestException: java.net.SocketException: Connection reset
at kong.unirest.DefaultInterceptor.onFail(DefaultInterceptor.java:43)
at kong.unirest.CompoundInterceptor.lambda$onFail$2(CompoundInterceptor.java:54)
at java.base/java.util.stream.ReferencePipeline$3$1.accept(Unknown Source)
at java.base/java.util.Collections$2.tryAdvance(Unknown Source)
at java.base/java.util.stream.ReferencePipeline.forEachWithCancel(Unknown Source)
at java.base/java.util.stream.AbstractPipeline.copyIntoWithCancel(Unknown Source)
at java.base/java.util.stream.AbstractPipeline.copyInto(Unknown Source)
at java.base/java.util.stream.AbstractPipeline.wrapAndCopyInto(Unknown Source)
at java.base/java.util.stream.FindOps$FindOp.evaluateSequential(Unknown Source)
at java.base/java.util.stream.AbstractPipeline.evaluate(Unknown Source)
at java.base/java.util.stream.ReferencePipeline.findFirst(Unknown Source)
at kong.unirest.CompoundInterceptor.onFail(CompoundInterceptor.java:56)
at kong.unirest.apache.ApacheClient.request(ApacheClient.java:138)
at kong.unirest.Client.request(Client.java:57)
at kong.unirest.BaseRequest.request(BaseRequest.java:359)
at kong.unirest.BaseRequest.asJson(BaseRequest.java:244)
at org.dependencytrack.tasks.repositories.NpmMetaAnalyzer.analyze(NpmMetaAnalyzer.java:85)
at org.dependencytrack.tasks.repositories.RepositoryMetaAnalyzerTask.analyze(RepositoryMetaAnalyzerTask.java:95)
at org.dependencytrack.tasks.repositories.RepositoryMetaAnalyzerTask.inform(RepositoryMetaAnalyzerTask.java:51)
at alpine.event.framework.BaseEventService.lambda$publish$0(BaseEventService.java:101)
at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(Unknown Source)
at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(Unknown Source)
at java.base/java.lang.Thread.run(Unknown Source)
Caused by: java.net.SocketException: Connection reset
at java.base/sun.nio.ch.NioSocketImpl.implRead(Unknown Source)
at java.base/sun.nio.ch.NioSocketImpl.read(Unknown Source)
at java.base/sun.nio.ch.NioSocketImpl$1.read(Unknown Source)
at java.base/java.net.Socket$SocketInputStream.read(Unknown Source)
at java.base/sun.security.ssl.SSLSocketInputRecord.read(Unknown Source)
at java.base/sun.security.ssl.SSLSocketInputRecord.readHeader(Unknown Source)
at java.base/sun.security.ssl.SSLSocketInputRecord.bytesInCompletePacket(Unknown Source)
at java.base/sun.security.ssl.SSLSocketImpl.readApplicationRecord(Unknown Source)
at java.base/sun.security.ssl.SSLSocketImpl$AppInputStream.read(Unknown Source)
at org.apache.http.impl.io.SessionInputBufferImpl.streamRead(SessionInputBufferImpl.java:137)
at org.apache.http.impl.io.SessionInputBufferImpl.fillBuffer(SessionInputBufferImpl.java:153)
at org.apache.http.impl.io.SessionInputBufferImpl.readLine(SessionInputBufferImpl.java:280)
at org.apache.http.impl.conn.DefaultHttpResponseParser.parseHead(DefaultHttpResponseParser.java:138)
at org.apache.http.impl.conn.DefaultHttpResponseParser.parseHead(DefaultHttpResponseParser.java:56)
at org.apache.http.impl.io.AbstractMessageParser.parse(AbstractMessageParser.java:259)
at org.apache.http.impl.DefaultBHttpClientConnection.receiveResponseHeader(DefaultBHttpClientConnection.java:163)
at org.apache.http.impl.conn.CPoolProxy.receiveResponseHeader(CPoolProxy.java:157)
at org.apache.http.protocol.HttpRequestExecutor.doReceiveResponse(HttpRequestExecutor.java:273)
at org.apache.http.protocol.HttpRequestExecutor.execute(HttpRequestExecutor.java:125)
at org.apache.http.impl.execchain.MainClientExec.execute(MainClientExec.java:272)
at org.apache.http.impl.execchain.ProtocolExec.execute(ProtocolExec.java:186)
at org.apache.http.impl.execchain.RetryExec.execute(RetryExec.java:89)
at org.apache.http.impl.execchain.RedirectExec.execute(RedirectExec.java:110)
at org.apache.http.impl.client.InternalHttpClient.doExecute(InternalHttpClient.java:185)
at org.apache.http.impl.client.CloseableHttpClient.execute(CloseableHttpClient.java:118)
at org.apache.http.impl.client.CloseableHttpClient.execute(CloseableHttpClient.java:56)
at kong.unirest.apache.ApacheClient.request(ApacheClient.java:129)
... 10 common frames omitted
2022-12-18 14:52:30,387 ERROR [NpmMetaAnalyzer] Request failure
kong.unirest.UnirestException: java.net.SocketException: Connection reset
at kong.unirest.DefaultInterceptor.onFail(DefaultInterceptor.java:43)
at kong.unirest.CompoundInterceptor.lambda$onFail$2(CompoundInterceptor.java:54)
at java.base/java.util.stream.ReferencePipeline$3$1.accept(Unknown Source)
at java.base/java.util.Collections$2.tryAdvance(Unknown Source)
at java.base/java.util.stream.ReferencePipeline.forEachWithCancel(Unknown Source)
at java.base/java.util.stream.AbstractPipeline.copyIntoWithCancel(Unknown Source)
at java.base/java.util.stream.AbstractPipeline.copyInto(Unknown Source)
at java.base/java.util.stream.AbstractPipeline.wrapAndCopyInto(Unknown Source)
at java.base/java.util.stream.FindOps$FindOp.evaluateSequential(Unknown Source)
at java.base/java.util.stream.AbstractPipeline.evaluate(Unknown Source)
at java.base/java.util.stream.ReferencePipeline.findFirst(Unknown Source)
at kong.unirest.CompoundInterceptor.onFail(CompoundInterceptor.java:56)
at kong.unirest.apache.ApacheClient.request(ApacheClient.java:138)
at kong.unirest.Client.request(Client.java:57)
at kong.unirest.BaseRequest.request(BaseRequest.java:359)
at kong.unirest.BaseRequest.asJson(BaseRequest.java:244)
at org.dependencytrack.tasks.repositories.NpmMetaAnalyzer.analyze(NpmMetaAnalyzer.java:85)
at org.dependencytrack.tasks.repositories.RepositoryMetaAnalyzerTask.analyze(RepositoryMetaAnalyzerTask.java:95)
at org.dependencytrack.tasks.repositories.RepositoryMetaAnalyzerTask.inform(RepositoryMetaAnalyzerTask.java:51)
at alpine.event.framework.BaseEventService.lambda$publish$0(BaseEventService.java:101)
at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(Unknown Source)
at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(Unknown Source)
at java.base/java.lang.Thread.run(Unknown Source)
Caused by: java.net.SocketException: Connection reset
at java.base/sun.nio.ch.NioSocketImpl.implRead(Unknown Source)
at java.base/sun.nio.ch.NioSocketImpl.read(Unknown Source)
at java.base/sun.nio.ch.NioSocketImpl$1.read(Unknown Source)
at java.base/java.net.Socket$SocketInputStream.read(Unknown Source)
at java.base/sun.security.ssl.SSLSocketInputRecord.read(Unknown Source)
at java.base/sun.security.ssl.SSLSocketInputRecord.readHeader(Unknown Source)
at java.base/sun.security.ssl.SSLSocketInputRecord.bytesInCompletePacket(Unknown Source)
at java.base/sun.security.ssl.SSLSocketImpl.readApplicationRecord(Unknown Source)
at java.base/sun.security.ssl.SSLSocketImpl$AppInputStream.read(Unknown Source)
at org.apache.http.impl.io.SessionInputBufferImpl.streamRead(SessionInputBufferImpl.java:137)
at org.apache.http.impl.io.SessionInputBufferImpl.fillBuffer(SessionInputBufferImpl.java:153)
at org.apache.http.impl.io.SessionInputBufferImpl.readLine(SessionInputBufferImpl.java:280)
at org.apache.http.impl.conn.DefaultHttpResponseParser.parseHead(DefaultHttpResponseParser.java:138)
at org.apache.http.impl.conn.DefaultHttpResponseParser.parseHead(DefaultHttpResponseParser.java:56)
at org.apache.http.impl.io.AbstractMessageParser.parse(AbstractMessageParser.java:259)
at org.apache.http.impl.DefaultBHttpClientConnection.receiveResponseHeader(DefaultBHttpClientConnection.java:163)
at org.apache.http.impl.conn.CPoolProxy.receiveResponseHeader(CPoolProxy.java:157)
at org.apache.http.protocol.HttpRequestExecutor.doReceiveResponse(HttpRequestExecutor.java:273)
at org.apache.http.protocol.HttpRequestExecutor.execute(HttpRequestExecutor.java:125)
at org.apache.http.impl.execchain.MainClientExec.execute(MainClientExec.java:272)
at org.apache.http.impl.execchain.ProtocolExec.execute(ProtocolExec.java:186)
at org.apache.http.impl.execchain.RetryExec.execute(RetryExec.java:89)
at org.apache.http.impl.execchain.RedirectExec.execute(RedirectExec.java:110)
at org.apache.http.impl.client.InternalHttpClient.doExecute(InternalHttpClient.java:185)
at org.apache.http.impl.client.CloseableHttpClient.execute(CloseableHttpClient.java:118)
at org.apache.http.impl.client.CloseableHttpClient.execute(CloseableHttpClient.java:56)
at kong.unirest.apache.ApacheClient.request(ApacheClient.java:129)
... 10 common frames omitted
2022-12-18 14:52:30,390 ERROR [NpmMetaAnalyzer] Request failure
kong.unirest.UnirestException: java.net.SocketException: Connection reset
at kong.unirest.DefaultInterceptor.onFail(DefaultInterceptor.java:43)
at kong.unirest.CompoundInterceptor.lambda$onFail$2(CompoundInterceptor.java:54)
at java.base/java.util.stream.ReferencePipeline$3$1.accept(Unknown Source)
at java.base/java.util.Collections$2.tryAdvance(Unknown Source)
at java.base/java.util.stream.ReferencePipeline.forEachWithCancel(Unknown Source)
at java.base/java.util.stream.AbstractPipeline.copyIntoWithCancel(Unknown Source)
at java.base/java.util.stream.AbstractPipeline.copyInto(Unknown Source)
at java.base/java.util.stream.AbstractPipeline.wrapAndCopyInto(Unknown Source)
at java.base/java.util.stream.FindOps$FindOp.evaluateSequential(Unknown Source)
at java.base/java.util.stream.AbstractPipeline.evaluate(Unknown Source)
at java.base/java.util.stream.ReferencePipeline.findFirst(Unknown Source)
at kong.unirest.CompoundInterceptor.onFail(CompoundInterceptor.java:56)
at kong.unirest.apache.ApacheClient.request(ApacheClient.java:138)
at kong.unirest.Client.request(Client.java:57)
at kong.unirest.BaseRequest.request(BaseRequest.java:359)
at kong.unirest.BaseRequest.asJson(BaseRequest.java:244)
at org.dependencytrack.tasks.repositories.NpmMetaAnalyzer.analyze(NpmMetaAnalyzer.java:85)
at org.dependencytrack.tasks.repositories.RepositoryMetaAnalyzerTask.analyze(RepositoryMetaAnalyzerTask.java:95)
at org.dependencytrack.tasks.repositories.RepositoryMetaAnalyzerTask.inform(RepositoryMetaAnalyzerTask.java:51)
at alpine.event.framework.BaseEventService.lambda$publish$0(BaseEventService.java:101)
at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(Unknown Source)
at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(Unknown Source)
at java.base/java.lang.Thread.run(Unknown Source)
Caused by: java.net.SocketException: Connection reset
at java.base/sun.nio.ch.NioSocketImpl.implRead(Unknown Source)
at java.base/sun.nio.ch.NioSocketImpl.read(Unknown Source)
at java.base/sun.nio.ch.NioSocketImpl$1.read(Unknown Source)
at java.base/java.net.Socket$SocketInputStream.read(Unknown Source)
at java.base/sun.security.ssl.SSLSocketInputRecord.read(Unknown Source)
at java.base/sun.security.ssl.SSLSocketInputRecord.readHeader(Unknown Source)
at java.base/sun.security.ssl.SSLSocketInputRecord.bytesInCompletePacket(Unknown Source)
at java.base/sun.security.ssl.SSLSocketImpl.readApplicationRecord(Unknown Source)
at java.base/sun.security.ssl.SSLSocketImpl$AppInputStream.read(Unknown Source)
at org.apache.http.impl.io.SessionInputBufferImpl.streamRead(SessionInputBufferImpl.java:137)
at org.apache.http.impl.io.SessionInputBufferImpl.fillBuffer(SessionInputBufferImpl.java:153)
at org.apache.http.impl.io.SessionInputBufferImpl.readLine(SessionInputBufferImpl.java:280)
at org.apache.http.impl.conn.DefaultHttpResponseParser.parseHead(DefaultHttpResponseParser.java:138)
at org.apache.http.impl.conn.DefaultHttpResponseParser.parseHead(DefaultHttpResponseParser.java:56)
at org.apache.http.impl.io.AbstractMessageParser.parse(AbstractMessageParser.java:259)
at org.apache.http.impl.DefaultBHttpClientConnection.receiveResponseHeader(DefaultBHttpClientConnection.java:163)
at org.apache.http.impl.conn.CPoolProxy.receiveResponseHeader(CPoolProxy.java:157)
at org.apache.http.protocol.HttpRequestExecutor.doReceiveResponse(HttpRequestExecutor.java:273)
at org.apache.http.protocol.HttpRequestExecutor.execute(HttpRequestExecutor.java:125)
at org.apache.http.impl.execchain.MainClientExec.execute(MainClientExec.java:272)
at org.apache.http.impl.execchain.ProtocolExec.execute(ProtocolExec.java:186)
at org.apache.http.impl.execchain.RetryExec.execute(RetryExec.java:89)
at org.apache.http.impl.execchain.RedirectExec.execute(RedirectExec.java:110)
at org.apache.http.impl.client.InternalHttpClient.doExecute(InternalHttpClient.java:185)
at org.apache.http.impl.client.CloseableHttpClient.execute(CloseableHttpClient.java:118)
at org.apache.http.impl.client.CloseableHttpClient.execute(CloseableHttpClient.java:56)
at kong.unirest.apache.ApacheClient.request(ApacheClient.java:129)
... 10 common frames omitted
2022-12-18 14:52:30,561 WARN [NugetMetaAnalyzer] An error occurred while parsing upload time for a NuGet component - Repo returned: 1900-01-01T00:00:00+00:00
2022-12-18 14:52:30,561 WARN [NugetMetaAnalyzer] An error occurred while parsing upload time for a NuGet component - Repo returned: 1900-01-01T00:00:00+00:00
2022-12-18 14:52:30,618 INFO [BomUploadProcessingTask] Identified 0 new components
2022-12-18 14:52:30,618 INFO [BomUploadProcessingTask] Processing CycloneDX dependency graph for project: d2df083b-e46e-496a-b29c-0962e113dc67
2022-12-18 14:52:33,627 INFO [BomUploadProcessingTask] Processed 1415 components and 0 services uploaded to project d2df083b-e46e-496a-b29c-0962e113dc67
2022-12-18 14:52:58,209 INFO [InternalAnalysisTask] Starting internal analysis task
2022-12-18 14:52:59,616 INFO [InternalAnalysisTask] Internal analysis complete
2022-12-18 14:52:59,617 INFO [OssIndexAnalysisTask] Starting Sonatype OSS Index analysis task
2022-12-18 14:53:00,566 INFO [OssIndexAnalysisTask] Sonatype OSS Index analysis complete
2022-12-18 14:53:00,567 INFO [PolicyEngine] Evaluating **1415** component(s) against applicable policies
2022-12-18 14:53:09,493 INFO [PolicyEngine] Policy analysis complete
2022-12-18 14:53:09,493 INFO [ProjectMetricsUpdateTask] Executing metrics update for project d2df083b-e46e-496a-b29c-0962e113dc67
2022-12-18 14:53:12,936 INFO [ProjectMetricsUpdateTask] Completed metrics update for project d2df083b-e46e-496a-b29c-0962e113dc67 in 00:03:443
**Observations:**
1. In both cases the same number of components was present in the BOM.
2. In both cases we are getting the same runtime errors, so they should not be affecting only the second analysis.
3. The change in component count results in a different number of findings.
### Steps to Reproduce
No special steps; I simply ran the analysis multiple times.
### Expected Behavior
1. The number of components should remain the same on every analysis if the BOM contains the same number of components.
2. The number of findings should remain the same as well.
### Dependency-Track Version
4.6.x
### Dependency-Track Distribution
Container Image
### Database Server
PostgreSQL
### Database Server Version
_No response_
### Browser
Google Chrome
### Checklist
- [X] I have read and understand the [contributing guidelines](https://github.com/DependencyTrack/dependency-track/blob/master/CONTRIBUTING.md#filing-issues)
- [X] I have checked the [existing issues](https://github.com/DependencyTrack/dependency-track/issues) for whether this defect was already reported
at alpine.event.framework.BaseEventService.lambda$publish$0(BaseEventService.java:101)
at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(Unknown Source)
at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(Unknown Source)
at java.base/java.lang.Thread.run(Unknown Source)
Caused by: java.net.SocketException: Connection reset
at java.base/sun.nio.ch.NioSocketImpl.implRead(Unknown Source)
at java.base/sun.nio.ch.NioSocketImpl.read(Unknown Source)
at java.base/sun.nio.ch.NioSocketImpl$1.read(Unknown Source)
at java.base/java.net.Socket$SocketInputStream.read(Unknown Source)
at java.base/sun.security.ssl.SSLSocketInputRecord.read(Unknown Source)
at java.base/sun.security.ssl.SSLSocketInputRecord.readHeader(Unknown Source)
at java.base/sun.security.ssl.SSLSocketInputRecord.bytesInCompletePacket(Unknown Source)
at java.base/sun.security.ssl.SSLSocketImpl.readApplicationRecord(Unknown Source)
at java.base/sun.security.ssl.SSLSocketImpl$AppInputStream.read(Unknown Source)
at org.apache.http.impl.io.SessionInputBufferImpl.streamRead(SessionInputBufferImpl.java:137)
at org.apache.http.impl.io.SessionInputBufferImpl.fillBuffer(SessionInputBufferImpl.java:153)
at org.apache.http.impl.io.SessionInputBufferImpl.readLine(SessionInputBufferImpl.java:280)
at org.apache.http.impl.conn.DefaultHttpResponseParser.parseHead(DefaultHttpResponseParser.java:138)
at org.apache.http.impl.conn.DefaultHttpResponseParser.parseHead(DefaultHttpResponseParser.java:56)
at org.apache.http.impl.io.AbstractMessageParser.parse(AbstractMessageParser.java:259)
at org.apache.http.impl.DefaultBHttpClientConnection.receiveResponseHeader(DefaultBHttpClientConnection.java:163)
at org.apache.http.impl.conn.CPoolProxy.receiveResponseHeader(CPoolProxy.java:157)
at org.apache.http.protocol.HttpRequestExecutor.doReceiveResponse(HttpRequestExecutor.java:273)
at org.apache.http.protocol.HttpRequestExecutor.execute(HttpRequestExecutor.java:125)
at org.apache.http.impl.execchain.MainClientExec.execute(MainClientExec.java:272)
at org.apache.http.impl.execchain.ProtocolExec.execute(ProtocolExec.java:186)
at org.apache.http.impl.execchain.RetryExec.execute(RetryExec.java:89)
at org.apache.http.impl.execchain.RedirectExec.execute(RedirectExec.java:110)
at org.apache.http.impl.client.InternalHttpClient.doExecute(InternalHttpClient.java:185)
at org.apache.http.impl.client.CloseableHttpClient.execute(CloseableHttpClient.java:118)
at org.apache.http.impl.client.CloseableHttpClient.execute(CloseableHttpClient.java:56)
at kong.unirest.apache.ApacheClient.request(ApacheClient.java:129)
... 10 common frames omitted
2022-12-18 14:44:16,351 WARN [NugetMetaAnalyzer] An error occurred while parsing upload time for a NuGet component - Repo returned: 1900-01-01T00:00:00+00:00
2022-12-18 14:44:16,351 WARN [NugetMetaAnalyzer] An error occurred while parsing upload time for a NuGet component - Repo returned: 1900-01-01T00:00:00+00:00
2022-12-18 14:44:19,233 INFO [BomUploadProcessingTask] Identified 1415 new components
2022-12-18 14:44:19,233 INFO [BomUploadProcessingTask] Processing CycloneDX dependency graph for project: d2df083b-e46e-496a-b29c-0962e113dc67
2022-12-18 14:44:22,684 INFO [BomUploadProcessingTask] Processed 1415 components and 0 services uploaded to project d2df083b-e46e-496a-b29c-0962e113dc67
2022-12-18 14:44:34,881 INFO [InternalAnalysisTask] Starting internal analysis task
2022-12-18 14:44:36,604 INFO [InternalAnalysisTask] Internal analysis complete
2022-12-18 14:44:36,606 INFO [OssIndexAnalysisTask] Starting Sonatype OSS Index analysis task
2022-12-18 14:44:37,812 INFO [OssIndexAnalysisTask] Sonatype OSS Index analysis complete
2022-12-18 14:44:37,813 INFO [PolicyEngine] Evaluating **1415** component(s) against applicable policies
2022-12-18 14:44:47,380 INFO [PolicyEngine] Policy analysis complete
2022-12-18 14:44:47,380 INFO [ProjectMetricsUpdateTask] Executing metrics update for project d2df083b-e46e-496a-b29c-0962e113dc67
2022-12-18 14:44:56,673 INFO [ProjectMetricsUpdateTask] Completed metrics update for project d2df083b-e46e-496a-b29c-0962e113dc67 in 00:09:293
**Logs of 2nd analysis**
2022-12-18 14:52:22,951 INFO [BomUploadProcessingTask] Processing CycloneDX BOM uploaded to project: d2df083b-e46e-496a-b29c-0962e113dc67
2022-12-18 14:52:28,998 WARN [NugetMetaAnalyzer] An error occurred while parsing upload time for a NuGet component - Repo returned: 1900-01-01T00:00:00+00:00
2022-12-18 14:52:28,998 WARN [NugetMetaAnalyzer] An error occurred while parsing upload time for a NuGet component - Repo returned: 1900-01-01T00:00:00+00:00
2022-12-18 14:52:29,006 WARN [NugetMetaAnalyzer] An error occurred while parsing upload time for a NuGet component - Repo returned: 1900-01-01T00:00:00+00:00
2022-12-18 14:52:29,006 WARN [NugetMetaAnalyzer] An error occurred while parsing upload time for a NuGet component - Repo returned: 1900-01-01T00:00:00+00:00
2022-12-18 14:52:29,313 WARN [NugetMetaAnalyzer] An error occurred while parsing upload time for a NuGet component - Repo returned: 1900-01-01T00:00:00+00:00
2022-12-18 14:52:29,313 WARN [NugetMetaAnalyzer] An error occurred while parsing upload time for a NuGet component - Repo returned: 1900-01-01T00:00:00+00:00
2022-12-18 14:52:30,386 ERROR [NpmMetaAnalyzer] Request failure
kong.unirest.UnirestException: java.net.SocketException: Connection reset
at kong.unirest.DefaultInterceptor.onFail(DefaultInterceptor.java:43)
at kong.unirest.CompoundInterceptor.lambda$onFail$2(CompoundInterceptor.java:54)
at java.base/java.util.stream.ReferencePipeline$3$1.accept(Unknown Source)
at java.base/java.util.Collections$2.tryAdvance(Unknown Source)
at java.base/java.util.stream.ReferencePipeline.forEachWithCancel(Unknown Source)
at java.base/java.util.stream.AbstractPipeline.copyIntoWithCancel(Unknown Source)
at java.base/java.util.stream.AbstractPipeline.copyInto(Unknown Source)
at java.base/java.util.stream.AbstractPipeline.wrapAndCopyInto(Unknown Source)
at java.base/java.util.stream.FindOps$FindOp.evaluateSequential(Unknown Source)
at java.base/java.util.stream.AbstractPipeline.evaluate(Unknown Source)
at java.base/java.util.stream.ReferencePipeline.findFirst(Unknown Source)
at kong.unirest.CompoundInterceptor.onFail(CompoundInterceptor.java:56)
at kong.unirest.apache.ApacheClient.request(ApacheClient.java:138)
at kong.unirest.Client.request(Client.java:57)
at kong.unirest.BaseRequest.request(BaseRequest.java:359)
at kong.unirest.BaseRequest.asJson(BaseRequest.java:244)
at org.dependencytrack.tasks.repositories.NpmMetaAnalyzer.analyze(NpmMetaAnalyzer.java:85)
at org.dependencytrack.tasks.repositories.RepositoryMetaAnalyzerTask.analyze(RepositoryMetaAnalyzerTask.java:95)
at org.dependencytrack.tasks.repositories.RepositoryMetaAnalyzerTask.inform(RepositoryMetaAnalyzerTask.java:51)
at alpine.event.framework.BaseEventService.lambda$publish$0(BaseEventService.java:101)
at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(Unknown Source)
at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(Unknown Source)
at java.base/java.lang.Thread.run(Unknown Source)
Caused by: java.net.SocketException: Connection reset
at java.base/sun.nio.ch.NioSocketImpl.implRead(Unknown Source)
at java.base/sun.nio.ch.NioSocketImpl.read(Unknown Source)
at java.base/sun.nio.ch.NioSocketImpl$1.read(Unknown Source)
at java.base/java.net.Socket$SocketInputStream.read(Unknown Source)
at java.base/sun.security.ssl.SSLSocketInputRecord.read(Unknown Source)
at java.base/sun.security.ssl.SSLSocketInputRecord.readHeader(Unknown Source)
at java.base/sun.security.ssl.SSLSocketInputRecord.bytesInCompletePacket(Unknown Source)
at java.base/sun.security.ssl.SSLSocketImpl.readApplicationRecord(Unknown Source)
at java.base/sun.security.ssl.SSLSocketImpl$AppInputStream.read(Unknown Source)
at org.apache.http.impl.io.SessionInputBufferImpl.streamRead(SessionInputBufferImpl.java:137)
at org.apache.http.impl.io.SessionInputBufferImpl.fillBuffer(SessionInputBufferImpl.java:153)
at org.apache.http.impl.io.SessionInputBufferImpl.readLine(SessionInputBufferImpl.java:280)
at org.apache.http.impl.conn.DefaultHttpResponseParser.parseHead(DefaultHttpResponseParser.java:138)
at org.apache.http.impl.conn.DefaultHttpResponseParser.parseHead(DefaultHttpResponseParser.java:56)
at org.apache.http.impl.io.AbstractMessageParser.parse(AbstractMessageParser.java:259)
at org.apache.http.impl.DefaultBHttpClientConnection.receiveResponseHeader(DefaultBHttpClientConnection.java:163)
at org.apache.http.impl.conn.CPoolProxy.receiveResponseHeader(CPoolProxy.java:157)
at org.apache.http.protocol.HttpRequestExecutor.doReceiveResponse(HttpRequestExecutor.java:273)
at org.apache.http.protocol.HttpRequestExecutor.execute(HttpRequestExecutor.java:125)
at org.apache.http.impl.execchain.MainClientExec.execute(MainClientExec.java:272)
at org.apache.http.impl.execchain.ProtocolExec.execute(ProtocolExec.java:186)
at org.apache.http.impl.execchain.RetryExec.execute(RetryExec.java:89)
at org.apache.http.impl.execchain.RedirectExec.execute(RedirectExec.java:110)
at org.apache.http.impl.client.InternalHttpClient.doExecute(InternalHttpClient.java:185)
at org.apache.http.impl.client.CloseableHttpClient.execute(CloseableHttpClient.java:118)
at org.apache.http.impl.client.CloseableHttpClient.execute(CloseableHttpClient.java:56)
at kong.unirest.apache.ApacheClient.request(ApacheClient.java:129)
... 10 common frames omitted
2022-12-18 14:52:30,387 ERROR [NpmMetaAnalyzer] Request failure
kong.unirest.UnirestException: java.net.SocketException: Connection reset
at kong.unirest.DefaultInterceptor.onFail(DefaultInterceptor.java:43)
at kong.unirest.CompoundInterceptor.lambda$onFail$2(CompoundInterceptor.java:54)
at java.base/java.util.stream.ReferencePipeline$3$1.accept(Unknown Source)
at java.base/java.util.Collections$2.tryAdvance(Unknown Source)
at java.base/java.util.stream.ReferencePipeline.forEachWithCancel(Unknown Source)
at java.base/java.util.stream.AbstractPipeline.copyIntoWithCancel(Unknown Source)
at java.base/java.util.stream.AbstractPipeline.copyInto(Unknown Source)
at java.base/java.util.stream.AbstractPipeline.wrapAndCopyInto(Unknown Source)
at java.base/java.util.stream.FindOps$FindOp.evaluateSequential(Unknown Source)
at java.base/java.util.stream.AbstractPipeline.evaluate(Unknown Source)
at java.base/java.util.stream.ReferencePipeline.findFirst(Unknown Source)
at kong.unirest.CompoundInterceptor.onFail(CompoundInterceptor.java:56)
at kong.unirest.apache.ApacheClient.request(ApacheClient.java:138)
at kong.unirest.Client.request(Client.java:57)
at kong.unirest.BaseRequest.request(BaseRequest.java:359)
at kong.unirest.BaseRequest.asJson(BaseRequest.java:244)
at org.dependencytrack.tasks.repositories.NpmMetaAnalyzer.analyze(NpmMetaAnalyzer.java:85)
at org.dependencytrack.tasks.repositories.RepositoryMetaAnalyzerTask.analyze(RepositoryMetaAnalyzerTask.java:95)
at org.dependencytrack.tasks.repositories.RepositoryMetaAnalyzerTask.inform(RepositoryMetaAnalyzerTask.java:51)
at alpine.event.framework.BaseEventService.lambda$publish$0(BaseEventService.java:101)
at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(Unknown Source)
at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(Unknown Source)
at java.base/java.lang.Thread.run(Unknown Source)
Caused by: java.net.SocketException: Connection reset
at java.base/sun.nio.ch.NioSocketImpl.implRead(Unknown Source)
at java.base/sun.nio.ch.NioSocketImpl.read(Unknown Source)
at java.base/sun.nio.ch.NioSocketImpl$1.read(Unknown Source)
at java.base/java.net.Socket$SocketInputStream.read(Unknown Source)
at java.base/sun.security.ssl.SSLSocketInputRecord.read(Unknown Source)
at java.base/sun.security.ssl.SSLSocketInputRecord.readHeader(Unknown Source)
at java.base/sun.security.ssl.SSLSocketInputRecord.bytesInCompletePacket(Unknown Source)
at java.base/sun.security.ssl.SSLSocketImpl.readApplicationRecord(Unknown Source)
at java.base/sun.security.ssl.SSLSocketImpl$AppInputStream.read(Unknown Source)
at org.apache.http.impl.io.SessionInputBufferImpl.streamRead(SessionInputBufferImpl.java:137)
at org.apache.http.impl.io.SessionInputBufferImpl.fillBuffer(SessionInputBufferImpl.java:153)
at org.apache.http.impl.io.SessionInputBufferImpl.readLine(SessionInputBufferImpl.java:280)
at org.apache.http.impl.conn.DefaultHttpResponseParser.parseHead(DefaultHttpResponseParser.java:138)
at org.apache.http.impl.conn.DefaultHttpResponseParser.parseHead(DefaultHttpResponseParser.java:56)
at org.apache.http.impl.io.AbstractMessageParser.parse(AbstractMessageParser.java:259)
at org.apache.http.impl.DefaultBHttpClientConnection.receiveResponseHeader(DefaultBHttpClientConnection.java:163)
at org.apache.http.impl.conn.CPoolProxy.receiveResponseHeader(CPoolProxy.java:157)
at org.apache.http.protocol.HttpRequestExecutor.doReceiveResponse(HttpRequestExecutor.java:273)
at org.apache.http.protocol.HttpRequestExecutor.execute(HttpRequestExecutor.java:125)
at org.apache.http.impl.execchain.MainClientExec.execute(MainClientExec.java:272)
at org.apache.http.impl.execchain.ProtocolExec.execute(ProtocolExec.java:186)
at org.apache.http.impl.execchain.RetryExec.execute(RetryExec.java:89)
at org.apache.http.impl.execchain.RedirectExec.execute(RedirectExec.java:110)
at org.apache.http.impl.client.InternalHttpClient.doExecute(InternalHttpClient.java:185)
at org.apache.http.impl.client.CloseableHttpClient.execute(CloseableHttpClient.java:118)
at org.apache.http.impl.client.CloseableHttpClient.execute(CloseableHttpClient.java:56)
at kong.unirest.apache.ApacheClient.request(ApacheClient.java:129)
... 10 common frames omitted
2022-12-18 14:52:30,390 ERROR [NpmMetaAnalyzer] Request failure
kong.unirest.UnirestException: java.net.SocketException: Connection reset
at kong.unirest.DefaultInterceptor.onFail(DefaultInterceptor.java:43)
at kong.unirest.CompoundInterceptor.lambda$onFail$2(CompoundInterceptor.java:54)
at java.base/java.util.stream.ReferencePipeline$3$1.accept(Unknown Source)
at java.base/java.util.Collections$2.tryAdvance(Unknown Source)
at java.base/java.util.stream.ReferencePipeline.forEachWithCancel(Unknown Source)
at java.base/java.util.stream.AbstractPipeline.copyIntoWithCancel(Unknown Source)
at java.base/java.util.stream.AbstractPipeline.copyInto(Unknown Source)
at java.base/java.util.stream.AbstractPipeline.wrapAndCopyInto(Unknown Source)
at java.base/java.util.stream.FindOps$FindOp.evaluateSequential(Unknown Source)
at java.base/java.util.stream.AbstractPipeline.evaluate(Unknown Source)
at java.base/java.util.stream.ReferencePipeline.findFirst(Unknown Source)
at kong.unirest.CompoundInterceptor.onFail(CompoundInterceptor.java:56)
at kong.unirest.apache.ApacheClient.request(ApacheClient.java:138)
at kong.unirest.Client.request(Client.java:57)
at kong.unirest.BaseRequest.request(BaseRequest.java:359)
at kong.unirest.BaseRequest.asJson(BaseRequest.java:244)
at org.dependencytrack.tasks.repositories.NpmMetaAnalyzer.analyze(NpmMetaAnalyzer.java:85)
at org.dependencytrack.tasks.repositories.RepositoryMetaAnalyzerTask.analyze(RepositoryMetaAnalyzerTask.java:95)
at org.dependencytrack.tasks.repositories.RepositoryMetaAnalyzerTask.inform(RepositoryMetaAnalyzerTask.java:51)
at alpine.event.framework.BaseEventService.lambda$publish$0(BaseEventService.java:101)
at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(Unknown Source)
at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(Unknown Source)
at java.base/java.lang.Thread.run(Unknown Source)
Caused by: java.net.SocketException: Connection reset
at java.base/sun.nio.ch.NioSocketImpl.implRead(Unknown Source)
at java.base/sun.nio.ch.NioSocketImpl.read(Unknown Source)
at java.base/sun.nio.ch.NioSocketImpl$1.read(Unknown Source)
at java.base/java.net.Socket$SocketInputStream.read(Unknown Source)
at java.base/sun.security.ssl.SSLSocketInputRecord.read(Unknown Source)
at java.base/sun.security.ssl.SSLSocketInputRecord.readHeader(Unknown Source)
at java.base/sun.security.ssl.SSLSocketInputRecord.bytesInCompletePacket(Unknown Source)
at java.base/sun.security.ssl.SSLSocketImpl.readApplicationRecord(Unknown Source)
at java.base/sun.security.ssl.SSLSocketImpl$AppInputStream.read(Unknown Source)
at org.apache.http.impl.io.SessionInputBufferImpl.streamRead(SessionInputBufferImpl.java:137)
at org.apache.http.impl.io.SessionInputBufferImpl.fillBuffer(SessionInputBufferImpl.java:153)
at org.apache.http.impl.io.SessionInputBufferImpl.readLine(SessionInputBufferImpl.java:280)
at org.apache.http.impl.conn.DefaultHttpResponseParser.parseHead(DefaultHttpResponseParser.java:138)
at org.apache.http.impl.conn.DefaultHttpResponseParser.parseHead(DefaultHttpResponseParser.java:56)
at org.apache.http.impl.io.AbstractMessageParser.parse(AbstractMessageParser.java:259)
at org.apache.http.impl.DefaultBHttpClientConnection.receiveResponseHeader(DefaultBHttpClientConnection.java:163)
at org.apache.http.impl.conn.CPoolProxy.receiveResponseHeader(CPoolProxy.java:157)
at org.apache.http.protocol.HttpRequestExecutor.doReceiveResponse(HttpRequestExecutor.java:273)
at org.apache.http.protocol.HttpRequestExecutor.execute(HttpRequestExecutor.java:125)
at org.apache.http.impl.execchain.MainClientExec.execute(MainClientExec.java:272)
at org.apache.http.impl.execchain.ProtocolExec.execute(ProtocolExec.java:186)
at org.apache.http.impl.execchain.RetryExec.execute(RetryExec.java:89)
at org.apache.http.impl.execchain.RedirectExec.execute(RedirectExec.java:110)
at org.apache.http.impl.client.InternalHttpClient.doExecute(InternalHttpClient.java:185)
at org.apache.http.impl.client.CloseableHttpClient.execute(CloseableHttpClient.java:118)
at org.apache.http.impl.client.CloseableHttpClient.execute(CloseableHttpClient.java:56)
at kong.unirest.apache.ApacheClient.request(ApacheClient.java:129)
... 10 common frames omitted
2022-12-18 14:52:30,561 WARN [NugetMetaAnalyzer] An error occurred while parsing upload time for a NuGet component - Repo returned: 1900-01-01T00:00:00+00:00
2022-12-18 14:52:30,561 WARN [NugetMetaAnalyzer] An error occurred while parsing upload time for a NuGet component - Repo returned: 1900-01-01T00:00:00+00:00
2022-12-18 14:52:30,618 INFO [BomUploadProcessingTask] Identified 0 new components
2022-12-18 14:52:30,618 INFO [BomUploadProcessingTask] Processing CycloneDX dependency graph for project: d2df083b-e46e-496a-b29c-0962e113dc67
2022-12-18 14:52:33,627 INFO [BomUploadProcessingTask] Processed 1415 components and 0 services uploaded to project d2df083b-e46e-496a-b29c-0962e113dc67
2022-12-18 14:52:58,209 INFO [InternalAnalysisTask] Starting internal analysis task
2022-12-18 14:52:59,616 INFO [InternalAnalysisTask] Internal analysis complete
2022-12-18 14:52:59,617 INFO [OssIndexAnalysisTask] Starting Sonatype OSS Index analysis task
2022-12-18 14:53:00,566 INFO [OssIndexAnalysisTask] Sonatype OSS Index analysis complete
2022-12-18 14:53:00,567 INFO [PolicyEngine] Evaluating **1415** component(s) against applicable policies
2022-12-18 14:53:09,493 INFO [PolicyEngine] Policy analysis complete
2022-12-18 14:53:09,493 INFO [ProjectMetricsUpdateTask] Executing metrics update for project d2df083b-e46e-496a-b29c-0962e113dc67
2022-12-18 14:53:12,936 INFO [ProjectMetricsUpdateTask] Completed metrics update for project d2df083b-e46e-496a-b29c-0962e113dc67 in 00:03:443
**Observations:**
1. In both cases, the same number of components was present in the BOM.
2. In both cases we are getting the same runtime errors, so they should not be affecting the second analysis only.
3. The change in component count results in a different number of findings.
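One way to check these observations objectively is to diff the project metrics the server reports after each run (e.g. via `GET /api/v1/metrics/project/{uuid}/current`). A minimal sketch — the field names (`components`, `critical`, `findingsTotal`) and the numbers below are assumptions for illustration, not values taken from this report:

```python
# Hypothetical helper: compare two project-metrics snapshots from
# Dependency-Track's REST API and report which counters drifted
# between two analyses of the same BOM.
def diff_metrics(first: dict, second: dict,
                 fields=("components", "critical", "findingsTotal")) -> dict:
    """Return {field: (first_value, second_value)} for every field that changed."""
    return {
        f: (first.get(f), second.get(f))
        for f in fields
        if first.get(f) != second.get(f)
    }

# Illustrative (made-up) numbers: identical BOM, yet the second run
# reports fewer components and findings.
run1 = {"components": 1415, "critical": 34, "findingsTotal": 120}
run2 = {"components": 1399, "critical": 9, "findingsTotal": 95}
print(diff_metrics(run1, run2))
```

For an identical BOM the helper should return an empty dict; any non-empty result pinpoints exactly which counters changed between the two analyses.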
### Steps to Reproduce
No special steps are required; I simply ran the analysis multiple times with the same BOM.
### Expected Behavior
1. The number of components should remain the same on every analysis if the BOM contains the same number of components.
2. The number of findings should remain the same as well.
### Dependency-Track Version
4.6.x
### Dependency-Track Distribution
Container Image
### Database Server
PostgreSQL
### Database Server Version
_No response_
### Browser
Google Chrome
### Checklist
- [X] I have read and understand the [contributing guidelines](https://github.com/DependencyTrack/dependency-track/blob/master/CONTRIBUTING.md#filing-issues)
- [X] I have checked the [existing issues](https://github.com/DependencyTrack/dependency-track/issues) for whether this defect was already reported
component repo returned warn an error occurred while parsing upload time for a nuget component repo returned warn an error occurred while parsing upload time for a nuget component repo returned warn an error occurred while parsing upload time for a nuget component repo returned warn an error occurred while parsing upload time for a nuget component repo returned error request failure kong unirest unirestexception java net socketexception connection reset at kong unirest defaultinterceptor onfail defaultinterceptor java at kong unirest compoundinterceptor lambda onfail compoundinterceptor java at java base java util stream referencepipeline accept unknown source at java base java util collections tryadvance unknown source at java base java util stream referencepipeline foreachwithcancel unknown source at java base java util stream abstractpipeline copyintowithcancel unknown source at java base java util stream abstractpipeline copyinto unknown source at java base java util stream abstractpipeline wrapandcopyinto unknown source at java base java util stream findops findop evaluatesequential unknown source at java base java util stream abstractpipeline evaluate unknown source at java base java util stream referencepipeline findfirst unknown source at kong unirest compoundinterceptor onfail compoundinterceptor java at kong unirest apache apacheclient request apacheclient java at kong unirest client request client java at kong unirest baserequest request baserequest java at kong unirest baserequest asjson baserequest java at org dependencytrack tasks repositories npmmetaanalyzer analyze npmmetaanalyzer java at org dependencytrack tasks repositories repositorymetaanalyzertask analyze repositorymetaanalyzertask java at org dependencytrack tasks repositories repositorymetaanalyzertask inform repositorymetaanalyzertask java at alpine event framework baseeventservice lambda publish baseeventservice java at java base java util concurrent threadpoolexecutor runworker unknown 
source at java base java util concurrent threadpoolexecutor worker run unknown source at java base java lang thread run unknown source caused by java net socketexception connection reset at java base sun nio ch niosocketimpl implread unknown source at java base sun nio ch niosocketimpl read unknown source at java base sun nio ch niosocketimpl read unknown source at java base java net socket socketinputstream read unknown source at java base sun security ssl sslsocketinputrecord read unknown source at java base sun security ssl sslsocketinputrecord readheader unknown source at java base sun security ssl sslsocketinputrecord bytesincompletepacket unknown source at java base sun security ssl sslsocketimpl readapplicationrecord unknown source at java base sun security ssl sslsocketimpl appinputstream read unknown source at org apache http impl io sessioninputbufferimpl streamread sessioninputbufferimpl java at org apache http impl io sessioninputbufferimpl fillbuffer sessioninputbufferimpl java at org apache http impl io sessioninputbufferimpl readline sessioninputbufferimpl java at org apache http impl conn defaulthttpresponseparser parsehead defaulthttpresponseparser java at org apache http impl conn defaulthttpresponseparser parsehead defaulthttpresponseparser java at org apache http impl io abstractmessageparser parse abstractmessageparser java at org apache http impl defaultbhttpclientconnection receiveresponseheader defaultbhttpclientconnection java at org apache http impl conn cpoolproxy receiveresponseheader cpoolproxy java at org apache http protocol httprequestexecutor doreceiveresponse httprequestexecutor java at org apache http protocol httprequestexecutor execute httprequestexecutor java at org apache http impl execchain mainclientexec execute mainclientexec java at org apache http impl execchain protocolexec execute protocolexec java at org apache http impl execchain retryexec execute retryexec java at org apache http impl execchain redirectexec execute 
redirectexec java at org apache http impl client internalhttpclient doexecute internalhttpclient java at org apache http impl client closeablehttpclient execute closeablehttpclient java at org apache http impl client closeablehttpclient execute closeablehttpclient java at kong unirest apache apacheclient request apacheclient java common frames omitted error request failure kong unirest unirestexception java net socketexception connection reset at kong unirest defaultinterceptor onfail defaultinterceptor java at kong unirest compoundinterceptor lambda onfail compoundinterceptor java at java base java util stream referencepipeline accept unknown source at java base java util collections tryadvance unknown source at java base java util stream referencepipeline foreachwithcancel unknown source at java base java util stream abstractpipeline copyintowithcancel unknown source at java base java util stream abstractpipeline copyinto unknown source at java base java util stream abstractpipeline wrapandcopyinto unknown source at java base java util stream findops findop evaluatesequential unknown source at java base java util stream abstractpipeline evaluate unknown source at java base java util stream referencepipeline findfirst unknown source at kong unirest compoundinterceptor onfail compoundinterceptor java at kong unirest apache apacheclient request apacheclient java at kong unirest client request client java at kong unirest baserequest request baserequest java at kong unirest baserequest asjson baserequest java at org dependencytrack tasks repositories npmmetaanalyzer analyze npmmetaanalyzer java at org dependencytrack tasks repositories repositorymetaanalyzertask analyze repositorymetaanalyzertask java at org dependencytrack tasks repositories repositorymetaanalyzertask inform repositorymetaanalyzertask java at alpine event framework baseeventservice lambda publish baseeventservice java at java base java util concurrent threadpoolexecutor runworker unknown source at 
java base java util concurrent threadpoolexecutor worker run unknown source at java base java lang thread run unknown source caused by java net socketexception connection reset at java base sun nio ch niosocketimpl implread unknown source at java base sun nio ch niosocketimpl read unknown source at java base sun nio ch niosocketimpl read unknown source at java base java net socket socketinputstream read unknown source at java base sun security ssl sslsocketinputrecord read unknown source at java base sun security ssl sslsocketinputrecord readheader unknown source at java base sun security ssl sslsocketinputrecord bytesincompletepacket unknown source at java base sun security ssl sslsocketimpl readapplicationrecord unknown source at java base sun security ssl sslsocketimpl appinputstream read unknown source at org apache http impl io sessioninputbufferimpl streamread sessioninputbufferimpl java at org apache http impl io sessioninputbufferimpl fillbuffer sessioninputbufferimpl java at org apache http impl io sessioninputbufferimpl readline sessioninputbufferimpl java at org apache http impl conn defaulthttpresponseparser parsehead defaulthttpresponseparser java at org apache http impl conn defaulthttpresponseparser parsehead defaulthttpresponseparser java at org apache http impl io abstractmessageparser parse abstractmessageparser java at org apache http impl defaultbhttpclientconnection receiveresponseheader defaultbhttpclientconnection java at org apache http impl conn cpoolproxy receiveresponseheader cpoolproxy java at org apache http protocol httprequestexecutor doreceiveresponse httprequestexecutor java at org apache http protocol httprequestexecutor execute httprequestexecutor java at org apache http impl execchain mainclientexec execute mainclientexec java at org apache http impl execchain protocolexec execute protocolexec java at org apache http impl execchain retryexec execute retryexec java at org apache http impl execchain redirectexec execute 
redirectexec java at org apache http impl client internalhttpclient doexecute internalhttpclient java at org apache http impl client closeablehttpclient execute closeablehttpclient java at org apache http impl client closeablehttpclient execute closeablehttpclient java at kong unirest apache apacheclient request apacheclient java common frames omitted error request failure kong unirest unirestexception java net socketexception connection reset at kong unirest defaultinterceptor onfail defaultinterceptor java at kong unirest compoundinterceptor lambda onfail compoundinterceptor java at java base java util stream referencepipeline accept unknown source at java base java util collections tryadvance unknown source at java base java util stream referencepipeline foreachwithcancel unknown source at java base java util stream abstractpipeline copyintowithcancel unknown source at java base java util stream abstractpipeline copyinto unknown source at java base java util stream abstractpipeline wrapandcopyinto unknown source at java base java util stream findops findop evaluatesequential unknown source at java base java util stream abstractpipeline evaluate unknown source at java base java util stream referencepipeline findfirst unknown source at kong unirest compoundinterceptor onfail compoundinterceptor java at kong unirest apache apacheclient request apacheclient java at kong unirest client request client java at kong unirest baserequest request baserequest java at kong unirest baserequest asjson baserequest java at org dependencytrack tasks repositories npmmetaanalyzer analyze npmmetaanalyzer java at org dependencytrack tasks repositories repositorymetaanalyzertask analyze repositorymetaanalyzertask java at org dependencytrack tasks repositories repositorymetaanalyzertask inform repositorymetaanalyzertask java at alpine event framework baseeventservice lambda publish baseeventservice java at java base java util concurrent threadpoolexecutor runworker unknown source at 
java base java util concurrent threadpoolexecutor worker run unknown source at java base java lang thread run unknown source caused by java net socketexception connection reset at java base sun nio ch niosocketimpl implread unknown source at java base sun nio ch niosocketimpl read unknown source at java base sun nio ch niosocketimpl read unknown source at java base java net socket socketinputstream read unknown source at java base sun security ssl sslsocketinputrecord read unknown source at java base sun security ssl sslsocketinputrecord readheader unknown source at java base sun security ssl sslsocketinputrecord bytesincompletepacket unknown source at java base sun security ssl sslsocketimpl readapplicationrecord unknown source at java base sun security ssl sslsocketimpl appinputstream read unknown source at org apache http impl io sessioninputbufferimpl streamread sessioninputbufferimpl java at org apache http impl io sessioninputbufferimpl fillbuffer sessioninputbufferimpl java at org apache http impl io sessioninputbufferimpl readline sessioninputbufferimpl java at org apache http impl conn defaulthttpresponseparser parsehead defaulthttpresponseparser java at org apache http impl conn defaulthttpresponseparser parsehead defaulthttpresponseparser java at org apache http impl io abstractmessageparser parse abstractmessageparser java at org apache http impl defaultbhttpclientconnection receiveresponseheader defaultbhttpclientconnection java at org apache http impl conn cpoolproxy receiveresponseheader cpoolproxy java at org apache http protocol httprequestexecutor doreceiveresponse httprequestexecutor java at org apache http protocol httprequestexecutor execute httprequestexecutor java at org apache http impl execchain mainclientexec execute mainclientexec java at org apache http impl execchain protocolexec execute protocolexec java at org apache http impl execchain retryexec execute retryexec java at org apache http impl execchain redirectexec execute 
redirectexec java at org apache http impl client internalhttpclient doexecute internalhttpclient java at org apache http impl client closeablehttpclient execute closeablehttpclient java at org apache http impl client closeablehttpclient execute closeablehttpclient java at kong unirest apache apacheclient request apacheclient java common frames omitted warn an error occurred while parsing upload time for a nuget component repo returned warn an error occurred while parsing upload time for a nuget component repo returned info identified new components info processing cyclonedx dependency graph for project info processed components and services uploaded to project info starting internal analysis task info internal analysis complete info starting sonatype oss index analysis task info sonatype oss index analysis complete info evaluating component s against applicable policies info policy analysis complete info executing metrics update for project info completed metrics update for project in observations in both cases same number of components were present in bom in both case we are are getting some runtime errors so it shouldn t be impacting only time change in components resulting in different number of findings steps to reproduce no special steps i have did analysis multiple times expected behavior number of components must remain same on every analysis is bom has same number of components number of finding should remain same as well dependency track version x dependency track distribution container image database server postgresql database server version no response browser google chrome checklist i have read and understand the i have checked the for whether this defect was already reported | 0 |
3,852 | 16,992,178,806 | IssuesEvent | 2021-06-30 22:21:50 | xanmod/linux | https://api.github.com/repos/xanmod/linux | closed | AMD GPU 100% Usage (>65W) When Idle | reported to maintainers | Problem does not occur in 5.12.12 but does occur in 5.12.13.

I have a Radeon VII
762,863 | 26,733,790,401 | IssuesEvent | 2023-01-30 07:48:38 | googleapis/google-cloud-ruby | https://api.github.com/repos/googleapis/google-cloud-ruby | closed | PubSub subscriber not processing messages after starting sometimes | type: bug api: pubsub priority: p2 :rotating_light: | We have a very similar problem to #8415, but our case is more specific: sometimes our worker (subscriber) does not process a single message after being started by autoscaling (both time-based and load-based). Most of the time the worker starts and processes messages without any problem, but once every few days or weeks a worker starts and processes no messages at all, so much so that we have a name for it now: a zombie worker.
Below are the logs and metrics captured for a recent zombie worker occurrence.
Oldest unacked message age:

Undelivered messages number:

Expired ack deadlines count:

GRPC warnings in logs:
```
W, [2021-11-11T06:15:25.531965 #1] WARN -- : bidi: read-loop failed
W, [2021-11-11T06:15:25.532056 #1] WARN -- : 4:Deadline Exceeded. debug_error_string:{"created":"@1636611325.531511420","description":"Deadline Exceeded","file":"src/core/ext/filters/deadline/deadline_filter.cc","file_line":81,"grpc_status":4} (GRPC::DeadlineExceeded)
/var/www/app/rails_shared/bundle/ruby/2.5.0/gems/grpc-1.36.0-x86_64-linux/src/ruby/lib/grpc/generic/active_call.rb:29:in `check_status'
/var/www/app/rails_shared/bundle/ruby/2.5.0/gems/grpc-1.36.0-x86_64-linux/src/ruby/lib/grpc/generic/bidi_call.rb:209:in `block in read_loop'
/var/www/app/rails_shared/bundle/ruby/2.5.0/gems/grpc-1.36.0-x86_64-linux/src/ruby/lib/grpc/generic/bidi_call.rb:195:in `loop'
/var/www/app/rails_shared/bundle/ruby/2.5.0/gems/grpc-1.36.0-x86_64-linux/src/ruby/lib/grpc/generic/bidi_call.rb:195:in `read_loop'
/var/www/app/rails_shared/bundle/ruby/2.5.0/bin/rake:1:in `each'
W, [2021-11-11T06:15:25.532766 #1] WARN -- : bidi: read-loop failed
W, [2021-11-11T06:15:25.532830 #1] WARN -- : 4:Deadline Exceeded. debug_error_string:{"created":"@1636611325.531513935","description":"Deadline Exceeded","file":"src/core/ext/filters/deadline/deadline_filter.cc","file_line":81,"grpc_status":4} (GRPC::DeadlineExceeded)
/var/www/app/rails_shared/bundle/ruby/2.5.0/gems/grpc-1.36.0-x86_64-linux/src/ruby/lib/grpc/generic/active_call.rb:29:in `check_status'
/var/www/app/rails_shared/bundle/ruby/2.5.0/gems/grpc-1.36.0-x86_64-linux/src/ruby/lib/grpc/generic/bidi_call.rb:209:in `block in read_loop'
/var/www/app/rails_shared/bundle/ruby/2.5.0/gems/grpc-1.36.0-x86_64-linux/src/ruby/lib/grpc/generic/bidi_call.rb:195:in `loop'
/var/www/app/rails_shared/bundle/ruby/2.5.0/gems/grpc-1.36.0-x86_64-linux/src/ruby/lib/grpc/generic/bidi_call.rb:195:in `read_loop'
/var/www/app/rails_shared/bundle/ruby/2.5.0/bin/rake:1:in `each'
W, [2021-11-11T06:15:25.533273 #1] WARN -- : bidi-write-loop: send close failed
W, [2021-11-11T06:15:25.534701 #1] WARN -- : call#run_batch failed somehow (GRPC::Core::CallError)
/var/www/app/rails_shared/bundle/ruby/2.5.0/gems/grpc-1.36.0-x86_64-linux/src/ruby/lib/grpc/generic/bidi_call.rb:165:in `run_batch'
/var/www/app/rails_shared/bundle/ruby/2.5.0/gems/grpc-1.36.0-x86_64-linux/src/ruby/lib/grpc/generic/bidi_call.rb:165:in `write_loop'
/var/www/app/rails_shared/bundle/ruby/2.5.0/gems/grpc-1.36.0-x86_64-linux/src/ruby/lib/grpc/generic/bidi_call.rb:75:in `block in run_on_client'
W, [2021-11-11T06:15:25.535033 #1] WARN -- : bidi-write-loop: send close failed
W, [2021-11-11T06:15:25.535089 #1] WARN -- : call#run_batch failed somehow (GRPC::Core::CallError)
/var/www/app/rails_shared/bundle/ruby/2.5.0/gems/grpc-1.36.0-x86_64-linux/src/ruby/lib/grpc/generic/bidi_call.rb:165:in `run_batch'
/var/www/app/rails_shared/bundle/ruby/2.5.0/gems/grpc-1.36.0-x86_64-linux/src/ruby/lib/grpc/generic/bidi_call.rb:165:in `write_loop'
/var/www/app/rails_shared/bundle/ruby/2.5.0/gems/grpc-1.36.0-x86_64-linux/src/ruby/lib/grpc/generic/bidi_call.rb:75:in `block in run_on_client'
W, [2021-11-11T06:30:25.532826 #1] WARN -- : bidi: read-loop failed
W, [2021-11-11T06:30:25.532899 #1] WARN -- : 4:Deadline Exceeded. debug_error_string:{"created":"@1636612225.532517910","description":"Deadline Exceeded","file":"src/core/ext/filters/deadline/deadline_filter.cc","file_line":81,"grpc_status":4} (GRPC::DeadlineExceeded)
/var/www/app/rails_shared/bundle/ruby/2.5.0/gems/grpc-1.36.0-x86_64-linux/src/ruby/lib/grpc/generic/active_call.rb:29:in `check_status'
/var/www/app/rails_shared/bundle/ruby/2.5.0/gems/grpc-1.36.0-x86_64-linux/src/ruby/lib/grpc/generic/bidi_call.rb:209:in `block in read_loop'
/var/www/app/rails_shared/bundle/ruby/2.5.0/gems/grpc-1.36.0-x86_64-linux/src/ruby/lib/grpc/generic/bidi_call.rb:195:in `loop'
/var/www/app/rails_shared/bundle/ruby/2.5.0/gems/grpc-1.36.0-x86_64-linux/src/ruby/lib/grpc/generic/bidi_call.rb:195:in `read_loop'
/var/www/app/rails_shared/bundle/ruby/2.5.0/bin/rake:1:in `each'
W, [2021-11-11T06:30:25.533315 #1] WARN -- : bidi-write-loop: send close failed
W, [2021-11-11T06:30:25.533533 #1] WARN -- : call#run_batch failed somehow (GRPC::Core::CallError)
/var/www/app/rails_shared/bundle/ruby/2.5.0/gems/grpc-1.36.0-x86_64-linux/src/ruby/lib/grpc/generic/bidi_call.rb:165:in `run_batch'
/var/www/app/rails_shared/bundle/ruby/2.5.0/gems/grpc-1.36.0-x86_64-linux/src/ruby/lib/grpc/generic/bidi_call.rb:165:in `write_loop'
/var/www/app/rails_shared/bundle/ruby/2.5.0/gems/grpc-1.36.0-x86_64-linux/src/ruby/lib/grpc/generic/bidi_call.rb:75:in `block in run_on_client'
W, [2021-11-11T06:30:25.535103 #1] WARN -- : bidi: read-loop failed
W, [2021-11-11T06:30:25.535164 #1] WARN -- : 4:Deadline Exceeded. debug_error_string:{"created":"@1636612225.534486605","description":"Deadline Exceeded","file":"src/core/ext/filters/deadline/deadline_filter.cc","file_line":81,"grpc_status":4} (GRPC::DeadlineExceeded)
/var/www/app/rails_shared/bundle/ruby/2.5.0/gems/grpc-1.36.0-x86_64-linux/src/ruby/lib/grpc/generic/active_call.rb:29:in `check_status'
/var/www/app/rails_shared/bundle/ruby/2.5.0/gems/grpc-1.36.0-x86_64-linux/src/ruby/lib/grpc/generic/bidi_call.rb:209:in `block in read_loop'
/var/www/app/rails_shared/bundle/ruby/2.5.0/gems/grpc-1.36.0-x86_64-linux/src/ruby/lib/grpc/generic/bidi_call.rb:195:in `loop'
/var/www/app/rails_shared/bundle/ruby/2.5.0/gems/grpc-1.36.0-x86_64-linux/src/ruby/lib/grpc/generic/bidi_call.rb:195:in `read_loop'
/var/www/app/rails_shared/bundle/ruby/2.5.0/bin/rake:1:in `each'
W, [2021-11-11T06:30:25.536260 #1] WARN -- : bidi-write-loop: send close failed
W, [2021-11-11T06:30:25.536758 #1] WARN -- : call#run_batch failed somehow (GRPC::Core::CallError)
/var/www/app/rails_shared/bundle/ruby/2.5.0/gems/grpc-1.36.0-x86_64-linux/src/ruby/lib/grpc/generic/bidi_call.rb:165:in `run_batch'
/var/www/app/rails_shared/bundle/ruby/2.5.0/gems/grpc-1.36.0-x86_64-linux/src/ruby/lib/grpc/generic/bidi_call.rb:165:in `write_loop'
/var/www/app/rails_shared/bundle/ruby/2.5.0/gems/grpc-1.36.0-x86_64-linux/src/ruby/lib/grpc/generic/bidi_call.rb:75:in `block in run_on_client'
```
At the time of the occurrence, there were 31 other workers running, and all of them were completing jobs except the zombie worker:

We are not sure whether this is a client (subscriber) issue or a server issue, but we are very sure it is not caused by our application: after we kill the zombie worker, the messages are re-delivered to another (normal) worker, which has no problem processing them.
During another occurrence, we managed to capture the thread status of the zombie worker’s process too:

#### Environment details
- OS: GKE ([Linux](https://cloud.google.com/kubernetes-engine/docs/concepts/node-images#cos-variants))
- Ruby version: 2.5.0
- Gem name and version: google-cloud-pubsub (2.6.1)
#### Steps to reproduce
1. Use a local copy of google-cloud-pubsub and remove `@inventory.add response.received_messages` and `register_callback rec_msg` in [stream.rb](https://github.com/googleapis/google-cloud-ruby/blob/main/google-cloud-pubsub/lib/google/cloud/pubsub/subscriber/stream.rb).
2. Start only 1 worker (subscriber) that subscribes to the topic
3. Publish a batch of 80 messages to the same topic
4. Wait for 15 minutes
The reason I removed these 2 lines of code is that our logs show no jobs completed and no errors thrown, and our metrics show no messages added to the inventory, so they were never executed anyway. And indeed, with these steps, I am able to reproduce locally the oldest unacked message age and undelivered messages metrics as well as the GRPC warnings in the logs:
```
W, [2021-11-22T15:23:51.262995 #98511] WARN -- : bidi: read-loop failed
W, [2021-11-22T15:23:51.263064 #98511] WARN -- : 4:Deadline Exceeded. debug_error_string:{"created":"@1637565831.262731000","description":"Deadline Exceeded","file":"src/core/ext/filters/deadline/deadline_filter.cc","file_line":81,"grpc_status":4} (GRPC::DeadlineExceeded)
/Users/steve/.rbenv/versions/2.5.8/lib/ruby/gems/2.5.0/gems/grpc-1.36.0-universal-darwin/src/ruby/lib/grpc/generic/active_call.rb:29:in `check_status'
/Users/steve/.rbenv/versions/2.5.8/lib/ruby/gems/2.5.0/gems/grpc-1.36.0-universal-darwin/src/ruby/lib/grpc/generic/bidi_call.rb:209:in `block in read_loop'
/Users/steve/.rbenv/versions/2.5.8/lib/ruby/gems/2.5.0/gems/grpc-1.36.0-universal-darwin/src/ruby/lib/grpc/generic/bidi_call.rb:195:in `loop'
/Users/steve/.rbenv/versions/2.5.8/lib/ruby/gems/2.5.0/gems/grpc-1.36.0-universal-darwin/src/ruby/lib/grpc/generic/bidi_call.rb:195:in `read_loop'
/Users/steve/.rbenv/versions/2.5.8/bin/rake:1:in `each'
W, [2021-11-22T15:23:51.263597 #98511] WARN -- : bidi: read-loop failed
W, [2021-11-22T15:23:51.264036 #98511] WARN -- : 4:Deadline Exceeded. debug_error_string:{"created":"@1637565831.263365000","description":"Deadline Exceeded","file":"src/core/ext/filters/deadline/deadline_filter.cc","file_line":81,"grpc_status":4} (GRPC::DeadlineExceeded)
/Users/steve/.rbenv/versions/2.5.8/lib/ruby/gems/2.5.0/gems/grpc-1.36.0-universal-darwin/src/ruby/lib/grpc/generic/active_call.rb:29:in `check_status'
/Users/steve/.rbenv/versions/2.5.8/lib/ruby/gems/2.5.0/gems/grpc-1.36.0-universal-darwin/src/ruby/lib/grpc/generic/bidi_call.rb:209:in `block in read_loop'
/Users/steve/.rbenv/versions/2.5.8/lib/ruby/gems/2.5.0/gems/grpc-1.36.0-universal-darwin/src/ruby/lib/grpc/generic/bidi_call.rb:195:in `loop'
/Users/steve/.rbenv/versions/2.5.8/lib/ruby/gems/2.5.0/gems/grpc-1.36.0-universal-darwin/src/ruby/lib/grpc/generic/bidi_call.rb:195:in `read_loop'
/Users/steve/.rbenv/versions/2.5.8/bin/rake:1:in `each'
W, [2021-11-22T15:23:51.264693 #98511] WARN -- : bidi-write-loop: send close failed
W, [2021-11-22T15:23:51.265109 #98511] WARN -- : call#run_batch failed somehow (GRPC::Core::CallError)
/Users/steve/.rbenv/versions/2.5.8/lib/ruby/gems/2.5.0/gems/grpc-1.36.0-universal-darwin/src/ruby/lib/grpc/generic/bidi_call.rb:165:in `run_batch'
/Users/steve/.rbenv/versions/2.5.8/lib/ruby/gems/2.5.0/gems/grpc-1.36.0-universal-darwin/src/ruby/lib/grpc/generic/bidi_call.rb:165:in `write_loop'
/Users/steve/.rbenv/versions/2.5.8/lib/ruby/gems/2.5.0/gems/grpc-1.36.0-universal-darwin/src/ruby/lib/grpc/generic/bidi_call.rb:75:in `block in run_on_client'
W, [2021-11-22T15:23:51.264293 #98511] WARN -- : bidi-write-loop: send close failed
W, [2021-11-22T15:23:51.265388 #98511] WARN -- : call#run_batch failed somehow (GRPC::Core::CallError)
/Users/steve/.rbenv/versions/2.5.8/lib/ruby/gems/2.5.0/gems/grpc-1.36.0-universal-darwin/src/ruby/lib/grpc/generic/bidi_call.rb:165:in `run_batch'
/Users/steve/.rbenv/versions/2.5.8/lib/ruby/gems/2.5.0/gems/grpc-1.36.0-universal-darwin/src/ruby/lib/grpc/generic/bidi_call.rb:165:in `write_loop'
/Users/steve/.rbenv/versions/2.5.8/lib/ruby/gems/2.5.0/gems/grpc-1.36.0-universal-darwin/src/ruby/lib/grpc/generic/bidi_call.rb:75:in `block in run_on_client'
```
If I do not remove `@inventory.add response.received_messages`, GRPC::Unavailable warnings are thrown sporadically for the [empty request made every 30 seconds](https://github.com/googleapis/google-cloud-ruby/blob/main/google-cloud-pubsub/lib/google/cloud/pubsub/subscriber/stream.rb#L67-L72). If I leave both lines of code in place, no GRPC warnings are thrown and there is no problem at all, even when no messages are published.
However, I am not able to reproduce the unusual expired ack deadlines metric, which leads us to believe the issue might be on the server side or in the GRPC communication in between. Is that possible?
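In the meantime, one possible mitigation is to detect the zombie state and let orchestration replace the worker. The sketch below is illustrative only (the class and method names are ours for this example, not google-cloud-pubsub API): the message callback records activity, and a monitor thread or liveness probe terminates the process after a configurable period of silence.

```ruby
# Illustrative watchdog: the listener callback calls record_activity!,
# and a monitor (thread or liveness probe) checks stalled? periodically.
class SubscriberWatchdog
  def initialize(stall_after:, clock: -> { Time.now })
    @stall_after   = stall_after  # seconds of silence tolerated
    @clock         = clock        # injectable clock, useful for testing
    @last_activity = clock.call
    @mutex         = Mutex.new
  end

  # Call from the Pub/Sub listener callback after each handled message.
  def record_activity!
    @mutex.synchronize { @last_activity = @clock.call }
  end

  # True when no message has been handled within the stall window.
  def stalled?
    @mutex.synchronize { (@clock.call - @last_activity) > @stall_after }
  end
end
```

Wired into the listener this would look like `watchdog.record_activity!` inside the callback, with a monitor loop doing something like `Process.kill("TERM", Process.pid) if watchdog.stalled?`; after the process exits, the undelivered messages go to a healthy worker, which matches the behavior described above when a zombie worker is killed manually.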
#### Code example
These are the codes I use to publish batch of 80 messages to the topic for testing:
```ruby
# Build a batch of 80 JSON messages with unique IDs.
messages = []
80.times do |n|
  message = {}
  message['id'] = "#{Time.now.to_i}#{n}"
  message['content'] = Faker::Lorem.paragraph(10)
  messages << message.to_json
end

pubsub = Google::Cloud::PubSub.new(
  project_id: 'PROJECT_ID',
  credentials: "/path/to/keyfile.json"
)
topic = pubsub.topic 'TOPIC_NAME'

# Publish all messages in a single batch.
topic.publish do |batch_publisher|
  messages.each do |message|
    batch_publisher.publish message
  end
end
```
Our worker's subscriber is operated similarly to the [example provided](https://github.com/googleapis/google-cloud-ruby/tree/main/google-cloud-pubsub#example) but with the following configuration:
```ruby
configuration = {
  deadline: 10,
  streams: 2,
  inventory: {
    max_outstanding_messages: 80,
    max_total_lease_duration: 20
  },
  threads: { callback: 8, push: 4 }
}
pubsub.subscription('TOPIC_NAME', skip_lookup: true).listen configuration do |received_message|
  process received_message
end
```
/var/www/app/rails_shared/bundle/ruby/2.5.0/bin/rake:1:in `each'
W, [2021-11-11T06:15:25.533273 #1] WARN -- : bidi-write-loop: send close failed
W, [2021-11-11T06:15:25.534701 #1] WARN -- : call#run_batch failed somehow (GRPC::Core::CallError)
/var/www/app/rails_shared/bundle/ruby/2.5.0/gems/grpc-1.36.0-x86_64-linux/src/ruby/lib/grpc/generic/bidi_call.rb:165:in `run_batch'
/var/www/app/rails_shared/bundle/ruby/2.5.0/gems/grpc-1.36.0-x86_64-linux/src/ruby/lib/grpc/generic/bidi_call.rb:165:in `write_loop'
/var/www/app/rails_shared/bundle/ruby/2.5.0/gems/grpc-1.36.0-x86_64-linux/src/ruby/lib/grpc/generic/bidi_call.rb:75:in `block in run_on_client'
W, [2021-11-11T06:15:25.535033 #1] WARN -- : bidi-write-loop: send close failed
W, [2021-11-11T06:15:25.535089 #1] WARN -- : call#run_batch failed somehow (GRPC::Core::CallError)
/var/www/app/rails_shared/bundle/ruby/2.5.0/gems/grpc-1.36.0-x86_64-linux/src/ruby/lib/grpc/generic/bidi_call.rb:165:in `run_batch'
/var/www/app/rails_shared/bundle/ruby/2.5.0/gems/grpc-1.36.0-x86_64-linux/src/ruby/lib/grpc/generic/bidi_call.rb:165:in `write_loop'
/var/www/app/rails_shared/bundle/ruby/2.5.0/gems/grpc-1.36.0-x86_64-linux/src/ruby/lib/grpc/generic/bidi_call.rb:75:in `block in run_on_client'
W, [2021-11-11T06:30:25.532826 #1] WARN -- : bidi: read-loop failed
W, [2021-11-11T06:30:25.532899 #1] WARN -- : 4:Deadline Exceeded. debug_error_string:{"created":"@1636612225.532517910","description":"Deadline Exceeded","file":"src/core/ext/filters/deadline/deadline_filter.cc","file_line":81,"grpc_status":4} (GRPC::DeadlineExceeded)
/var/www/app/rails_shared/bundle/ruby/2.5.0/gems/grpc-1.36.0-x86_64-linux/src/ruby/lib/grpc/generic/active_call.rb:29:in `check_status'
/var/www/app/rails_shared/bundle/ruby/2.5.0/gems/grpc-1.36.0-x86_64-linux/src/ruby/lib/grpc/generic/bidi_call.rb:209:in `block in read_loop'
/var/www/app/rails_shared/bundle/ruby/2.5.0/gems/grpc-1.36.0-x86_64-linux/src/ruby/lib/grpc/generic/bidi_call.rb:195:in `loop'
/var/www/app/rails_shared/bundle/ruby/2.5.0/gems/grpc-1.36.0-x86_64-linux/src/ruby/lib/grpc/generic/bidi_call.rb:195:in `read_loop'
/var/www/app/rails_shared/bundle/ruby/2.5.0/bin/rake:1:in `each'
W, [2021-11-11T06:30:25.533315 #1] WARN -- : bidi-write-loop: send close failed
W, [2021-11-11T06:30:25.533533 #1] WARN -- : call#run_batch failed somehow (GRPC::Core::CallError)
/var/www/app/rails_shared/bundle/ruby/2.5.0/gems/grpc-1.36.0-x86_64-linux/src/ruby/lib/grpc/generic/bidi_call.rb:165:in `run_batch'
/var/www/app/rails_shared/bundle/ruby/2.5.0/gems/grpc-1.36.0-x86_64-linux/src/ruby/lib/grpc/generic/bidi_call.rb:165:in `write_loop'
/var/www/app/rails_shared/bundle/ruby/2.5.0/gems/grpc-1.36.0-x86_64-linux/src/ruby/lib/grpc/generic/bidi_call.rb:75:in `block in run_on_client'
W, [2021-11-11T06:30:25.535103 #1] WARN -- : bidi: read-loop failed
W, [2021-11-11T06:30:25.535164 #1] WARN -- : 4:Deadline Exceeded. debug_error_string:{"created":"@1636612225.534486605","description":"Deadline Exceeded","file":"src/core/ext/filters/deadline/deadline_filter.cc","file_line":81,"grpc_status":4} (GRPC::DeadlineExceeded)
/var/www/app/rails_shared/bundle/ruby/2.5.0/gems/grpc-1.36.0-x86_64-linux/src/ruby/lib/grpc/generic/active_call.rb:29:in `check_status'
/var/www/app/rails_shared/bundle/ruby/2.5.0/gems/grpc-1.36.0-x86_64-linux/src/ruby/lib/grpc/generic/bidi_call.rb:209:in `block in read_loop'
/var/www/app/rails_shared/bundle/ruby/2.5.0/gems/grpc-1.36.0-x86_64-linux/src/ruby/lib/grpc/generic/bidi_call.rb:195:in `loop'
/var/www/app/rails_shared/bundle/ruby/2.5.0/gems/grpc-1.36.0-x86_64-linux/src/ruby/lib/grpc/generic/bidi_call.rb:195:in `read_loop'
/var/www/app/rails_shared/bundle/ruby/2.5.0/bin/rake:1:in `each'
W, [2021-11-11T06:30:25.536260 #1] WARN -- : bidi-write-loop: send close failed
W, [2021-11-11T06:30:25.536758 #1] WARN -- : call#run_batch failed somehow (GRPC::Core::CallError)
/var/www/app/rails_shared/bundle/ruby/2.5.0/gems/grpc-1.36.0-x86_64-linux/src/ruby/lib/grpc/generic/bidi_call.rb:165:in `run_batch'
/var/www/app/rails_shared/bundle/ruby/2.5.0/gems/grpc-1.36.0-x86_64-linux/src/ruby/lib/grpc/generic/bidi_call.rb:165:in `write_loop'
/var/www/app/rails_shared/bundle/ruby/2.5.0/gems/grpc-1.36.0-x86_64-linux/src/ruby/lib/grpc/generic/bidi_call.rb:75:in `block in run_on_client'
```
At the time of the occurrence, there were 31 other workers running, and all of them were completing jobs except the zombie worker:

We are not sure if this is a client (subscriber) issue or a server issue, but we are very sure it is not caused by our application: after killing the zombie worker, the messages were re-delivered to another (normal) worker, which had no problem processing them.
During another occurrence, we managed to capture the thread status of the zombie worker’s process too:

#### Environment details
- OS: GKE ([Linux](https://cloud.google.com/kubernetes-engine/docs/concepts/node-images#cos-variants))
- Ruby version: 2.5.0
- Gem name and version: google-cloud-pubsub (2.6.1)
#### Steps to reproduce
1. Use a local copy of google-cloud-pubsub and remove `@inventory.add response.received_messages` and `register_callback rec_msg` in [stream.rb](https://github.com/googleapis/google-cloud-ruby/blob/main/google-cloud-pubsub/lib/google/cloud/pubsub/subscriber/stream.rb).
2. Start only 1 worker (subscriber) that subscribes to a topic
3. Publish a batch of 80 messages to the same topic
4. Wait for 15 minutes
The reason I removed these two lines of code is that our logs show no jobs completed and no errors thrown, and our metrics show no messages added to the inventory, so these lines were never executed anyway. And indeed, with these steps, I am able to reproduce locally the oldest unacked message age and undelivered messages metrics as well as the GRPC warnings in the logs:
```
W, [2021-11-22T15:23:51.262995 #98511] WARN -- : bidi: read-loop failed
W, [2021-11-22T15:23:51.263064 #98511] WARN -- : 4:Deadline Exceeded. debug_error_string:{"created":"@1637565831.262731000","description":"Deadline Exceeded","file":"src/core/ext/filters/deadline/deadline_filter.cc","file_line":81,"grpc_status":4} (GRPC::DeadlineExceeded)
/Users/steve/.rbenv/versions/2.5.8/lib/ruby/gems/2.5.0/gems/grpc-1.36.0-universal-darwin/src/ruby/lib/grpc/generic/active_call.rb:29:in `check_status'
/Users/steve/.rbenv/versions/2.5.8/lib/ruby/gems/2.5.0/gems/grpc-1.36.0-universal-darwin/src/ruby/lib/grpc/generic/bidi_call.rb:209:in `block in read_loop'
/Users/steve/.rbenv/versions/2.5.8/lib/ruby/gems/2.5.0/gems/grpc-1.36.0-universal-darwin/src/ruby/lib/grpc/generic/bidi_call.rb:195:in `loop'
/Users/steve/.rbenv/versions/2.5.8/lib/ruby/gems/2.5.0/gems/grpc-1.36.0-universal-darwin/src/ruby/lib/grpc/generic/bidi_call.rb:195:in `read_loop'
/Users/steve/.rbenv/versions/2.5.8/bin/rake:1:in `each'
W, [2021-11-22T15:23:51.263597 #98511] WARN -- : bidi: read-loop failed
W, [2021-11-22T15:23:51.264036 #98511] WARN -- : 4:Deadline Exceeded. debug_error_string:{"created":"@1637565831.263365000","description":"Deadline Exceeded","file":"src/core/ext/filters/deadline/deadline_filter.cc","file_line":81,"grpc_status":4} (GRPC::DeadlineExceeded)
/Users/steve/.rbenv/versions/2.5.8/lib/ruby/gems/2.5.0/gems/grpc-1.36.0-universal-darwin/src/ruby/lib/grpc/generic/active_call.rb:29:in `check_status'
/Users/steve/.rbenv/versions/2.5.8/lib/ruby/gems/2.5.0/gems/grpc-1.36.0-universal-darwin/src/ruby/lib/grpc/generic/bidi_call.rb:209:in `block in read_loop'
/Users/steve/.rbenv/versions/2.5.8/lib/ruby/gems/2.5.0/gems/grpc-1.36.0-universal-darwin/src/ruby/lib/grpc/generic/bidi_call.rb:195:in `loop'
/Users/steve/.rbenv/versions/2.5.8/lib/ruby/gems/2.5.0/gems/grpc-1.36.0-universal-darwin/src/ruby/lib/grpc/generic/bidi_call.rb:195:in `read_loop'
/Users/steve/.rbenv/versions/2.5.8/bin/rake:1:in `each'
W, [2021-11-22T15:23:51.264693 #98511] WARN -- : bidi-write-loop: send close failed
W, [2021-11-22T15:23:51.265109 #98511] WARN -- : call#run_batch failed somehow (GRPC::Core::CallError)
/Users/steve/.rbenv/versions/2.5.8/lib/ruby/gems/2.5.0/gems/grpc-1.36.0-universal-darwin/src/ruby/lib/grpc/generic/bidi_call.rb:165:in `run_batch'
/Users/steve/.rbenv/versions/2.5.8/lib/ruby/gems/2.5.0/gems/grpc-1.36.0-universal-darwin/src/ruby/lib/grpc/generic/bidi_call.rb:165:in `write_loop'
/Users/steve/.rbenv/versions/2.5.8/lib/ruby/gems/2.5.0/gems/grpc-1.36.0-universal-darwin/src/ruby/lib/grpc/generic/bidi_call.rb:75:in `block in run_on_client'
W, [2021-11-22T15:23:51.264293 #98511] WARN -- : bidi-write-loop: send close failed
W, [2021-11-22T15:23:51.265388 #98511] WARN -- : call#run_batch failed somehow (GRPC::Core::CallError)
/Users/steve/.rbenv/versions/2.5.8/lib/ruby/gems/2.5.0/gems/grpc-1.36.0-universal-darwin/src/ruby/lib/grpc/generic/bidi_call.rb:165:in `run_batch'
/Users/steve/.rbenv/versions/2.5.8/lib/ruby/gems/2.5.0/gems/grpc-1.36.0-universal-darwin/src/ruby/lib/grpc/generic/bidi_call.rb:165:in `write_loop'
/Users/steve/.rbenv/versions/2.5.8/lib/ruby/gems/2.5.0/gems/grpc-1.36.0-universal-darwin/src/ruby/lib/grpc/generic/bidi_call.rb:75:in `block in run_on_client'
```
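For intuition on why step 1 produces exactly this failure shape, here is a toy model of the stream's read loop. It is deliberately not the real `Subscriber::Stream` internals — just an illustration of what disappears when those two lines are removed:

```ruby
# Toy model only -- NOT the google-cloud-pubsub internals. It mimics the
# shape of the read loop: each response's messages are (1) added to the
# lease inventory and (2) handed to the callback pool. With both steps
# removed (the "zombie" case), responses are consumed but nothing is
# ever leased, processed, or acked.
class ToyStream
  attr_reader :inventory, :processed

  def initialize(register: true)
    @register  = register
    @inventory = []
    @processed = []
  end

  def handle_response(received_messages)
    return unless @register                        # zombie: pull data, then drop it
    @inventory.concat(received_messages)           # ~ @inventory.add response.received_messages
    received_messages.each { |m| @processed << m } # ~ register_callback rec_msg
  end
end

zombie = ToyStream.new(register: false)
zombie.handle_response(%w[m1 m2 m3])
normal = ToyStream.new
normal.handle_response(%w[m1 m2 m3])
```

The zombie instance ends with empty inventory and processed lists while still having "received" every message, matching the observed metrics: messages outstanding on the subscription, nothing ever acked.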
If I do not remove `@inventory.add response.received_messages`, GRPC::Unavailable warnings will be thrown sporadically for the [empty request made every 30 seconds](https://github.com/googleapis/google-cloud-ruby/blob/main/google-cloud-pubsub/lib/google/cloud/pubsub/subscriber/stream.rb#L67-L72). If I do not remove both lines of code, no GRPC warnings are thrown and there is no problem at all, even when no messages are published.
However, I am not able to reproduce the unusual expired ack deadlines metric, which leads us to believe the issue might be on the server side or the GRPC communication in between. Is that possible?
#### Code example
This is the code I use to publish a batch of 80 messages to the topic for testing:
```ruby
messages = []
80.times do |n|
  message = {}
  message['id'] = "#{Time.now.to_i}#{n}"
  message['content'] = Faker::Lorem.paragraph(10)
  messages << message.to_json
end

pubsub = Google::Cloud::PubSub.new(
  project_id: 'PROJECT_ID',
  credentials: "/path/to/keyfile.json"
)
topic = pubsub.topic 'TOPIC_NAME'
topic.publish do |batch_publisher|
  messages.each do |message|
    batch_publisher.publish message
  end
end
```
Our worker's subscriber is operated similarly to the [example provided](https://github.com/googleapis/google-cloud-ruby/tree/main/google-cloud-pubsub#example) but with the following configuration:
```ruby
configuration = {
  deadline: 10,
  streams: 2,
  inventory: {
    max_outstanding_messages: 80,
    max_total_lease_duration: 20
  },
  threads: { callback: 8, push: 4 }
}

pubsub.subscription('TOPIC_NAME', skip_lookup: true).listen configuration do |received_message|
  process received_message
end
``` | non_main | pubsub subscriber not processing messages after starting sometimes we have a very similar problem to but our case is more specific in that our worker subscriber does not process a single message upon starting by autoscaling both time based and load based sometimes most of the times the worker starts and processes messages without any problem but once every few days or week the worker starts and processes no messages at all so much so that we have a name for it now zombie worker below are the logs and metrics captured for a recent zombie worker occurrence oldest unacked message age undelivered messages number expired ack deadlines count grpc warnings in logs w warn bidi read loop failed w warn deadline exceeded debug error string created description deadline exceeded file src core ext filters deadline deadline filter cc file line grpc status grpc deadlineexceeded var www app rails shared bundle ruby gems grpc linux src ruby lib grpc generic active call rb in check status var www app rails shared bundle ruby gems grpc linux src ruby lib grpc generic bidi call rb in block in read loop var www app rails shared bundle ruby gems grpc linux src ruby lib grpc generic bidi call rb in loop var www app rails shared bundle ruby gems grpc linux src ruby lib grpc generic bidi call rb in read loop var www app rails shared bundle ruby bin rake in each w warn bidi read loop failed w warn deadline exceeded debug error string created description deadline exceeded file src core ext filters deadline deadline filter cc file line grpc status grpc deadlineexceeded var www app rails shared bundle ruby gems grpc linux src ruby lib grpc generic active call rb in check status var www app rails shared bundle ruby gems grpc linux src ruby lib grpc generic bidi call rb in block in read loop var www app rails shared bundle ruby gems grpc linux src ruby lib grpc generic bidi call rb in loop var www app rails shared bundle ruby gems grpc linux src ruby lib grpc generic bidi call rb 
in read loop var www app rails shared bundle ruby bin rake in each w warn bidi write loop send close failed w warn call run batch failed somehow grpc core callerror var www app rails shared bundle ruby gems grpc linux src ruby lib grpc generic bidi call rb in run batch var www app rails shared bundle ruby gems grpc linux src ruby lib grpc generic bidi call rb in write loop var www app rails shared bundle ruby gems grpc linux src ruby lib grpc generic bidi call rb in block in run on client w warn bidi write loop send close failed w warn call run batch failed somehow grpc core callerror var www app rails shared bundle ruby gems grpc linux src ruby lib grpc generic bidi call rb in run batch var www app rails shared bundle ruby gems grpc linux src ruby lib grpc generic bidi call rb in write loop var www app rails shared bundle ruby gems grpc linux src ruby lib grpc generic bidi call rb in block in run on client w warn bidi read loop failed w warn deadline exceeded debug error string created description deadline exceeded file src core ext filters deadline deadline filter cc file line grpc status grpc deadlineexceeded var www app rails shared bundle ruby gems grpc linux src ruby lib grpc generic active call rb in check status var www app rails shared bundle ruby gems grpc linux src ruby lib grpc generic bidi call rb in block in read loop var www app rails shared bundle ruby gems grpc linux src ruby lib grpc generic bidi call rb in loop var www app rails shared bundle ruby gems grpc linux src ruby lib grpc generic bidi call rb in read loop var www app rails shared bundle ruby bin rake in each w warn bidi write loop send close failed w warn call run batch failed somehow grpc core callerror var www app rails shared bundle ruby gems grpc linux src ruby lib grpc generic bidi call rb in run batch var www app rails shared bundle ruby gems grpc linux src ruby lib grpc generic bidi call rb in write loop var www app rails shared bundle ruby gems grpc linux src ruby lib grpc 
generic bidi call rb in block in run on client w warn bidi read loop failed w warn deadline exceeded debug error string created description deadline exceeded file src core ext filters deadline deadline filter cc file line grpc status grpc deadlineexceeded var www app rails shared bundle ruby gems grpc linux src ruby lib grpc generic active call rb in check status var www app rails shared bundle ruby gems grpc linux src ruby lib grpc generic bidi call rb in block in read loop var www app rails shared bundle ruby gems grpc linux src ruby lib grpc generic bidi call rb in loop var www app rails shared bundle ruby gems grpc linux src ruby lib grpc generic bidi call rb in read loop var www app rails shared bundle ruby bin rake in each w warn bidi write loop send close failed w warn call run batch failed somehow grpc core callerror var www app rails shared bundle ruby gems grpc linux src ruby lib grpc generic bidi call rb in run batch var www app rails shared bundle ruby gems grpc linux src ruby lib grpc generic bidi call rb in write loop var www app rails shared bundle ruby gems grpc linux src ruby lib grpc generic bidi call rb in block in run on client at the time of the occurrence there were other workers running and all are completing jobs except the zombie worker we are not sure if this is a client subscriber issue or server issue but we are very sure it is not caused by our application because after killing the zombie worker the messages are re delivered to another normal worker and it has no problem processing the messages during another occurrence we managed to captured the thread status of the zombie worker’s process too environment details os gke ruby version gem name and version google cloud pubsub steps to reproduce use local copy of google cloud pubsub remove inventory add response received messages and register callback rec msg in start only worker subscriber that subscribe to a topic publish batch of messages to same topic wait for minutes the reason i 
removed these lines of codes is our logs show no jobs completed and no errors thrown and our metrics show no messages added to inventory so they were never executed anyway and indeed with these steps i am able to reproduce locally the oldest unacked message age and undelivered messages metrics as well as the grpc warnings in the logs w warn bidi read loop failed w warn deadline exceeded debug error string created description deadline exceeded file src core ext filters deadline deadline filter cc file line grpc status grpc deadlineexceeded users steve rbenv versions lib ruby gems gems grpc universal darwin src ruby lib grpc generic active call rb in check status users steve rbenv versions lib ruby gems gems grpc universal darwin src ruby lib grpc generic bidi call rb in block in read loop users steve rbenv versions lib ruby gems gems grpc universal darwin src ruby lib grpc generic bidi call rb in loop users steve rbenv versions lib ruby gems gems grpc universal darwin src ruby lib grpc generic bidi call rb in read loop users steve rbenv versions bin rake in each w warn bidi read loop failed w warn deadline exceeded debug error string created description deadline exceeded file src core ext filters deadline deadline filter cc file line grpc status grpc deadlineexceeded users steve rbenv versions lib ruby gems gems grpc universal darwin src ruby lib grpc generic active call rb in check status users steve rbenv versions lib ruby gems gems grpc universal darwin src ruby lib grpc generic bidi call rb in block in read loop users steve rbenv versions lib ruby gems gems grpc universal darwin src ruby lib grpc generic bidi call rb in loop users steve rbenv versions lib ruby gems gems grpc universal darwin src ruby lib grpc generic bidi call rb in read loop users steve rbenv versions bin rake in each w warn bidi write loop send close failed w warn call run batch failed somehow grpc core callerror users steve rbenv versions lib ruby gems gems grpc universal darwin src ruby lib 
grpc generic bidi call rb in run batch users steve rbenv versions lib ruby gems gems grpc universal darwin src ruby lib grpc generic bidi call rb in write loop users steve rbenv versions lib ruby gems gems grpc universal darwin src ruby lib grpc generic bidi call rb in block in run on client w warn bidi write loop send close failed w warn call run batch failed somehow grpc core callerror users steve rbenv versions lib ruby gems gems grpc universal darwin src ruby lib grpc generic bidi call rb in run batch users steve rbenv versions lib ruby gems gems grpc universal darwin src ruby lib grpc generic bidi call rb in write loop users steve rbenv versions lib ruby gems gems grpc universal darwin src ruby lib grpc generic bidi call rb in block in run on client if i do not remove inventory add response received messages grpc unavailable warnings will be thrown sporadically for the if i do not remove both lines of codes no grpc warnings are thrown and there is no problem at all even when no messages are published however i am not able to reproduce the unusual expired ack deadlines metric which leads us to believe the issue might be on the server side or the grpc communication in between is that possible code example these are the codes i use to publish batch of messages to the topic for testing ruby messages times do n message message time now to i n message faker lorem paragraph messages message to json end pubsub google cloud pubsub new project id project id credentials path to keyfile json topic pubsub topic topic name topic publish do batch publisher messages each do message batch publisher publish message end end our worker s subscriber is operated similarly to the but with the following configuration ruby configuration deadline streams inventory max outstanding messages max total lease duration threads callback push pubsub subscription topic name skip lookup true listen configuration do received message process received message end | 0 |
791,209 | 27,855,750,513 | IssuesEvent | 2023-03-20 22:43:47 | firelab/windninja | https://api.github.com/repos/firelab/windninja | closed | Segfault in lcp client | bug priority:high severity:segfault component:core | The lcp client can segfault depending on lcp server responses. We need to prevent a double free on the object returned from `CPLHTTPFetch`. | 1.0 | Segfault in lcp client - The lcp client can segfault depending on lcp server responses. We need to prevent a double free on the object returned from `CPLHTTPFetch`. | non_main | segfault in lcp client the lcp client can segfault depending on lcp server responses we need to prevent a double free on the object returned from cplhttpfetch | 0 |
3,119 | 11,914,840,101 | IssuesEvent | 2020-03-31 14:10:08 | cloud-gov/product | https://api.github.com/repos/cloud-gov/product | closed | Setup a Multi-AZ CloudFoundry Instance within Single Region | contractor-2-troubleshooting contractor-3-maintainability operations | In order to assess the technical complexity of deploying new CF Instances, we want to setup and deploy a new instance of CloudFoundry for prototyping use.
### Additional Items
- [ ] An approximate number of days / hours for each staff member is documented within issue at time of completion. | True | Setup a Multi-AZ CloudFoundry Instance within Single Region - In order to assess the technical complexity of deploying new CF Instances, we want to setup and deploy a new instance of CloudFoundry for prototyping use.
### Additional Items
- [ ] An approximate number of days / hours for each staff member is documented within issue at time of completion. | main | setup a multi az cloudfoundry instance within single region in order to assess the technical complexity of deploying new cf instances we want to setup and deploy a new instance of cloudfoundry for prototyping use additional items an approximate number of days hours for each staff member is documented within issue at time of completion | 1 |
279,097 | 24,198,891,549 | IssuesEvent | 2022-09-24 09:07:21 | jmakhack/myanimelist-cli | https://api.github.com/repos/jmakhack/myanimelist-cli | opened | [TASK] Fix unit testing issues for mya.c | help wanted up for grabs tests C hacktoberfest | ## Task Context
The work to start adding unit tests to this project began in #13.
A pull request was opened here: https://github.com/jmakhack/myanimelist-cli/pull/23
However, though a lot of work went into adding these unit tests, the work was never completed. There are currently issues with getting the tests to run and the project to build successfully.
This task is to work off the branch and code already present on that pull request and fix the issues so that:
1. The project code can compile and build successfully
2. The unit tests added can be successfully run
| 1.0 | [TASK] Fix unit testing issues for mya.c - ## Task Context
The work to start adding unit tests to this project began in #13.
A pull request was opened here: https://github.com/jmakhack/myanimelist-cli/pull/23
However, though a lot of work went into adding these unit tests, the work was never completed. There are currently issues with getting the tests to run and the project to build successfully.
This task is to work off the branch and code already present on that pull request and fix the issues so that:
1. The project code can compile and build successfully
2. The unit tests added can be successfully run
| non_main | fix unit testing issues for mya c task context the work to start adding unit tests to this project began in a pull request was opened here however though a lot of work went into adding these unit tests the work was never completed there are currently issues with getting the tests to run and the project to build successfully this task is to work off the branch and code already present on that pull request and fix the issues so that the project code can compile and build successfully the unit tests added can be successfully run | 0 |
117,521 | 4,717,193,093 | IssuesEvent | 2016-10-16 13:53:11 | dotKom/onlineweb4 | https://api.github.com/repos/dotKom/onlineweb4 | closed | Article titles (and maybe event titles) with long words overlap adjacent article/event. | Package: Article Priority: Medium Status: Available Type: UI | This one might be tricky, I haven't really looked into the CSS. But the ideal state would be CSS that breaks the event/article title word for word first, and if a single word overflows the container, it should allow the browser to break the word as well, at the point the next character will overflow the container. This is especially noticeable on mobile / tablet device widths.
Correct "orddeling" will be impossible, as the system would need a grammar ruleset, but it will at least prevent the titles from overlapping one another. | 1.0 | Article titles (and maybe event titles) with long words overlap adjacent article/event. - This one might be tricky, i havent really looked into the CSS. But the ideal state would be CSS that breaks the event/article title word for word first, and if a single word overflows the container, it should allow the browser to break the word as well, at the point the next character will overflow the container. This is especially noticable on mobile / tablet device widths.
Correct "orddeling" will be impossible, as the system would need a grammar ruleset, but it will at least prevent the titles from overlapping one another. | non_main | article titles and maybe event titles with long words overlap adjacent article event this one might be tricky i havent really looked into the css but the ideal state would be css that breaks the event article title word for word first and if a single word overflows the container it should allow the browser to break the word as well at the point the next character will overflow the container this is especially noticable on mobile tablet device widths correct orddeling will be impossible as the system would need a grammar ruleset but it will at least prevent the titles from overlapping one another | 0 |
714 | 4,308,245,924 | IssuesEvent | 2016-07-21 12:17:36 | duckduckgo/zeroclickinfo-goodies | https://api.github.com/repos/duckduckgo/zeroclickinfo-goodies | closed | JavaScript Minifier: Code refactor + Design Adjustments | Low-Hanging Fruit Maintainer Input Requested | Lots of improvements to be made:
### Perl
- Better query validation
- Use `startend` trigger + add more triggers, or keep with `any` but use regex to evaluate the query is valid. We should ignore certain queries like "gulp minify js"
- Add triggers for 'JSON'
- Use `title` property to render Title instead of using `<h5>` in template
- Add a `subtitle` explaining how to use this?
### JS
- Refactor code to use `DDG.build_async` and only display IA after PrettyDiff has loaded
- Look at WhereAmI Goodie for `build_async` example
- Setup default display properties of elements using CSS, instead of modifying via jQuery after render
### CSS & Handlebars
- use `is-hidden` to hide elements by default
- add `tx--##`, `tx-clr--`, and `bg-clr--` classes instead of hardcoded CSS colors to prevent breaking DDG Themes
- use our `btn` class to style the `<button>`
- More details on DDG CSS classes here: **https://duckduckgo.com/styleguide**
- Add CSS for Mobile screens
- We should probably stack the windows always?
- set `font-family: 'monospace'` for `<textareas>`
/cc @MariagraziaAlastra
------
IA Page: http://duck.co/ia/view/js_minify
[Maintainer](http://docs.duckduckhack.com/maintaining/guidelines.html): @sahildua2305 | True | JavaScript Minifier: Code refactor + Design Adjustments - Lots of improvements to be made:
### Perl
- Better query validation
- Use `startend` trigger + add more triggers, or keep with `any` but use regex to evaluate the query is valid. We should ignore certain queries like "gulp minify js"
- Add triggers for 'JSON'
- Use `title` property to render Title instead of using `<h5>` in template
- Add a `subtitle` explaining how to use this?
### JS
- Refactor code to use `DDG.build_async` and only display IA after PrettyDiff has loaded
- Look at WhereAmI Goodie for `build_async` example
- Setup default display properties of elements using CSS, instead of modifying via jQuery after render
### CSS & Handlebars
- use `is-hidden` to hide elements by default
- add `tx--##`, `tx-clr--`, and `bg-clr--` classes instead of hardcoded CSS colors to prevent breaking DDG Themes
- use our `btn` class to style the `<button>`
- More details on DDG CSS classes here: **https://duckduckgo.com/styleguide**
- Add CSS for Mobile screens
- We should probably stack the windows always?
- set `font-family: 'monospace'` for `<textareas>`
/cc @MariagraziaAlastra
------
IA Page: http://duck.co/ia/view/js_minify
[Maintainer](http://docs.duckduckhack.com/maintaining/guidelines.html): @sahildua2305 | main | javascript minifier code refactor design adjustments lot s of improvements to be made perl better query validation use startend trigger add more triggers or keep with any but use regex to evaluate the query is valid we should ignore certain queries like gulp minify js add triggers for json use title property to render title instead of using in template add a subtitle explaining how to use this js refactor code to use ddg build async and only display ia after prettydiff has loaded look at whereami goodie for build async example setup default display properties of elements using css instead of modifying via jquery after render css handlebars use is hidden to hide elements by default add tx tx clr and bg clr classes instead of hardcoded css colors to prevent breaking ddg themes use our btn class to style the more details on ddg css classes here add css for mobile screens we should probably stack the windows always set font family monospace for cc mariagraziaalastra ia page | 1 |
184,974 | 6,717,848,176 | IssuesEvent | 2017-10-15 03:40:51 | RagtagOpen/nomad | https://api.github.com/repos/RagtagOpen/nomad | opened | Privacy & Terms: incorrect link in last paragraph | priority qa requested | The last paragraph has an incorrect type of link for an email address
Should be similar to:
<a href="mailto:nomad-help@ragtag.org">
```
MISC. Users complying with prior written licenses may access Nomad thereby until authorization is terminated. Otherwise, this is the exclusive and entire agreement between us. If a TOU term is unenforceable, other terms are unaffected. If TOU translations conflict with the English version, English controls. See Privacy Policy for how we collect, use and share data. If you feel content infringes your rights email nomad-help@ragtag.org.
```
```
<p>MISC. Users complying with prior written licenses may access Nomad thereby until authorization is terminated. Otherwise, this is the exclusive and entire agreement between us. If a TOU term is unenforceable, other terms are unaffected. If TOU translations conflict with the English version, English controls. See Privacy Policy for how we collect, use and share data. If you feel content infringes your rights email <a href="nomad-help@ragtag.org">nomad-help@ragtag.org</a>.</p>
```
<img width="1534" alt="screen shot 2017-10-14 at 11 40 28 pm" src="https://user-images.githubusercontent.com/17053096/31581399-141fa17a-b139-11e7-9846-4763888ae3bc.png">
| 1.0 | Privacy & Terms: incorrect link in last paragraph - The last paragraph has an incorrect type of link for an email address
Should be similar to:
<a href="mailto:nomad-help@ragtag.org">
```
MISC. Users complying with prior written licenses may access Nomad thereby until authorization is terminated. Otherwise, this is the exclusive and entire agreement between us. If a TOU term is unenforceable, other terms are unaffected. If TOU translations conflict with the English version, English controls. See Privacy Policy for how we collect, use and share data. If you feel content infringes your rights email nomad-help@ragtag.org.
```
```
<p>MISC. Users complying with prior written licenses may access Nomad thereby until authorization is terminated. Otherwise, this is the exclusive and entire agreement between us. If a TOU term is unenforceable, other terms are unaffected. If TOU translations conflict with the English version, English controls. See Privacy Policy for how we collect, use and share data. If you feel content infringes your rights email <a href="nomad-help@ragtag.org">nomad-help@ragtag.org</a>.</p>
```
<img width="1534" alt="screen shot 2017-10-14 at 11 40 28 pm" src="https://user-images.githubusercontent.com/17053096/31581399-141fa17a-b139-11e7-9846-4763888ae3bc.png">
| non_main | privacy terms incorrect link in last paragraph the last paragraph has an incorrect type of link for an email address should be similar to misc users complying with prior written licenses may access nomad thereby until authorization is terminated otherwise this is the exclusive and entire agreement between us if a tou term is unenforceable other terms are unaffected if tou translations conflict with the english version english controls see privacy policy for how we collect use and share data if you feel content infringes your rights email nomad help ragtag org misc users complying with prior written licenses may access nomad thereby until authorization is terminated otherwise this is the exclusive and entire agreement between us if a tou term is unenforceable other terms are unaffected if tou translations conflict with the english version english controls see privacy policy for how we collect use and share data if you feel content infringes your rights email nomad help ragtag org img width alt screen shot at pm src | 0 |
3,950 | 17,910,025,123 | IssuesEvent | 2021-09-09 03:04:20 | tgstation/tgstation-server | https://api.github.com/repos/tgstation/tgstation-server | closed | Allow updating to version 5 | Maintainability Issue Ready | Now that we know V5 isn't going to be a total rewrite, we should allow an automatic update to V5 ONLY if .NET 6 can be found on the system. | True | Allow updating to version 5 - Now that we know V5 isn't going to be a total rewrite, we should allow an automatic update to V5 ONLY if .NET 6 can be found on the system. | main | allow updating to version now that we know isn t going to be a total rewrite we should allow an automatic update to only if net can be found on the system | 1 |
417,081 | 28,110,096,423 | IssuesEvent | 2023-03-31 06:19:55 | zuohui48/ped | https://api.github.com/repos/zuohui48/ped | opened | Command Summary link does not redirect | type.DocumentationBug severity.VeryLow | 
The Command Summary link in the screenshot above does not redirect me to the command summary section in the UG.
<!--session: 1680242607259-ad0b569e-6232-426d-a1a5-7bc431cac4f0-->
<!--Version: Web v3.4.7--> | 1.0 | Command Summary link does not redirect - 
The Command Summary link in the screenshot above does not redirect me to the command summary section in the UG.
<!--session: 1680242607259-ad0b569e-6232-426d-a1a5-7bc431cac4f0-->
<!--Version: Web v3.4.7--> | non_main | command summary link does not redirect the command summary link in the screenshot above does not redirect me to the command summary section in the ug | 0 |
707,080 | 24,294,201,011 | IssuesEvent | 2022-09-29 08:46:23 | AY2223S1-CS2103-W14-1/tp | https://api.github.com/repos/AY2223S1-CS2103-W14-1/tp | closed | Update README contents | type.Task priority.Medium | As we transition to making the documentation fully ready for Condonery, it is necessary to update the README. | 1.0 | Update README contents - As we transition to making the documentation fully ready for Condonery, it is necessary to update the README. | non_main | update readme contents as we transition to making the documentation fully ready for condonery it is necessary to update the readme | 0 |
3,401 | 13,181,786,240 | IssuesEvent | 2020-08-12 14:49:43 | duo-labs/cloudmapper | https://api.github.com/repos/duo-labs/cloudmapper | closed | Cant filter on tags | map unmaintained_functionality | Running: python cloudmapper.py prepare --account main --tags tagkey=tagvalue
I'm running this within a venv and it ends up building data with 0 nodes and then erroring out
(File "_pyjq.pyx", line 209, in _pyjq.Script.all (_pyjq.c:2561)
_pyjq.ScriptRuntimeError: Cannot iterate over null (null)) - presumably because there are no nodes.
The tag is used though so it should populate some resources - i have tried multiple ways of formatting the command with the same results.
| True | Cant filter on tags - Running: python cloudmapper.py prepare --account main --tags tagkey=tagvalue
I'm running this within a venv and it ends up building data with 0 nodes and then erroring out
(File "_pyjq.pyx", line 209, in _pyjq.Script.all (_pyjq.c:2561)
_pyjq.ScriptRuntimeError: Cannot iterate over null (null)) - presumably because there are no nodes.
The tag is used though so it should populate some resources - i have tried multiple ways of formatting the command with the same results.
| main | cant filter on tags running python cloudmapper py prepare account main tags tagkey tagvalue i m running this within a venv and it ends up building data with nodes and then erroring out file pyjq pyx line in pyjq script all pyjq c pyjq scriptruntimeerror cannot iterate over null null presumably because there are no nodes the tag is used though so it should populate some resources i have tried multiple ways of formatting the command with the same results | 1 |
3,538 | 13,924,948,864 | IssuesEvent | 2020-10-21 16:11:24 | grey-software/LinkedIn-Focus | https://api.github.com/repos/grey-software/LinkedIn-Focus | closed | 🚀 Feature Request: Add Grey Software sticker & Project logo | Domain: User Experience Role: Maintainer Type: Enhancement hacktoberfest-accepted | ### Problem Overview 👁️🗨️
Users should be able to see the Grey Software sticker and the LinkedIn-Focus logo on the README.md file.
### What would you like? 🧰
We want the Grey Software sticker and the LinkedIn-Focus logo as the header on the README.md file. Below is an example. You would also need to add the Grey Software sticker next to it.

### What alternatives have you considered? 🔍
N/A
### Additional details ℹ️
The Linked-In Focus logo image can be found in this repository under the **src** folder. The file name is **icon.png**. Below is the image of the Grey Software sticker that should be used.

| True | 🚀 Feature Request: Add Grey Software sticker & Project logo - ### Problem Overview 👁️🗨️
Users should be able to see the Grey Software sticker and the LinkedIn-Focus logo on the README.md file.
### What would you like? 🧰
We want the Grey Software sticker and the LinkedIn-Focus logo as the header on the README.md file. Below is an example. You would also need to add the Grey Software sticker next to it.

### What alternatives have you considered? 🔍
N/A
### Additional details ℹ️
The Linked-In Focus logo image can be found in this repository under the **src** folder. The file name is **icon.png**. Below is the image of the Grey Software sticker that should be used.

| main | 🚀 feature request add grey software sticker project logo problem overview 👁️🗨️ users should be able to see the grey software sticker and the linkedin focus logo on the readme md file what would you like 🧰 we want the grey software sticker and the linkedin focus logo as the header on the readme md file below is an example you would also need to add the grey software sticker next to it what alternatives have you considered 🔍 n a additional details ℹ️ the linked in focus logo image can be found in this repository under the src folder the file name is icon png below is the image of the grey software sticker that should be used | 1 |
440,305 | 30,740,622,314 | IssuesEvent | 2023-07-28 11:10:17 | SAP/terraform-provider-btp | https://api.github.com/repos/SAP/terraform-provider-btp | closed | [FEATURE] Setup of automated release notes | bug documentation | After release 0.1.0 we added the configuration to automatically create release notes based on the PRs that are part of the release (see <https://github.com/SAP/terraform-provider-btp/blob/main/.github/release.yml>).
With release 0.2.0 we learned that there are still some gaps, as the texts are not picked up.
See also <https://docs.github.com/en/repositories/releasing-projects-on-github/automatically-generated-release-notes>
As far as I see we do not have the option to trigger automated release notes via GitHub in the `goreleaser.yaml`. This seems to rely on a own mechanism (see <https://goreleaser.com/customization/changelog/>). Using the automated GitHub feature probably makes it necessary to leverage the API in a dedicated step <https://docs.github.com/en/rest/releases/releases?apiVersion=2022-11-28#generate-release-notes-content-for-a-release>
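That dedicated API step could be sketched like this (repo and tag names are placeholders; the request is only built here, not sent):

```python
import json

# Build the request for GitHub's "generate release notes" endpoint:
# POST /repos/{owner}/{repo}/releases/generate-notes (API version 2022-11-28).
def generate_notes_request(owner, repo, tag_name, previous_tag_name=None):
    url = f"https://api.github.com/repos/{owner}/{repo}/releases/generate-notes"
    payload = {"tag_name": tag_name}
    if previous_tag_name:
        payload["previous_tag_name"] = previous_tag_name
    return url, json.dumps(payload)

url, body = generate_notes_request("SAP", "terraform-provider-btp", "v0.2.0", "v0.1.0")
```

The returned body would then be POSTed with an `Authorization: Bearer <token>` header in a dedicated release step.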
An even better alternative would be to contribute to goreleaser and enhance the GitHub client of the project (see <https://github.com/goreleaser/goreleaser/blob/main/internal/client/github.go>) | 1.0 | [FEATURE] Setup of automated release notes - After release 0.1.0 we added the configuration to automatically create release notes based on the PRs that are part of the release (see <https://github.com/SAP/terraform-provider-btp/blob/main/.github/release.yml>).
With release 0.2.0 we learned that there are still some gaps, as the texts are not picked up.
See also <https://docs.github.com/en/repositories/releasing-projects-on-github/automatically-generated-release-notes>
As far as I see we do not have the option to trigger automated release notes via GitHub in the `goreleaser.yaml`. This seems to rely on a own mechanism (see <https://goreleaser.com/customization/changelog/>). Using the automated GitHub feature probably makes it necessary to leverage the API in a dedicated step <https://docs.github.com/en/rest/releases/releases?apiVersion=2022-11-28#generate-release-notes-content-for-a-release>
A even better alternative would be to contribute to goreleaser and enhance the GitHub client of the project (see <https://github.com/goreleaser/goreleaser/blob/main/internal/client/github.go>) | non_main | setup of automated release notes ater release we added the configuration to automatically create release notes based on the prs that are part of the release see with released we learned that there is still some gaps as the texts are not picked up see also as far as i see we do not have the option to trigger automated release notes via github in the goreleaser yaml this seems to rely on a own mechanism see using the automated github feature probably makes it necessary to leverage the api in a dedicated step a even better alternative would be to contribute to goreleaser and enhance the github client of the project see | 0 |
1,581 | 6,572,347,505 | IssuesEvent | 2017-09-11 01:36:03 | ansible/ansible-modules-extras | https://api.github.com/repos/ansible/ansible-modules-extras | closed | Expect module not working module failure parsed false | affects_2.1 bug_report waiting_on_maintainer | <!--- Verify first that your issue/request is not already reported in GitHub -->
##### ISSUE TYPE
<!--- Pick one below and delete the rest: -->
- Bug Report
- expect module
##### COMPONENT NAME
Expect module
##### ANSIBLE VERSION
```
ansible 2.1.1.0
config file = /etc/ansible/ansible.cfg
configured module search path = Default w/o overrides
```
##### CONFIGURATION
source: RHEL 6.8 pexpect 4.2
target: RHEL 6.8 pexpect 4.2
##### OS / ENVIRONMENT
RHEL 6.8
##### SUMMARY
Module failure related to the python not sure
##### STEPS TO REPRODUCE
<!---
For bugs, show exactly how to reproduce the problem.
For new features, show how the feature would be used.
-->
<!--- Paste example playbooks or commands between quotes below -->
```
- name: configuring stamp pdf batch
expect:
command: ./install.sh
responses:
'This will install stamppdf on your system. Do you want to continue[y/N]?': y
'--More--': \r
'(?i)Do you agree with this copyright\? \[y/N\]': y
'AP_FONT_DIR [/usr/local/fonts]': y
'Now you must enter a valid registration number': '{{ stamp_pdf_key }}'
echo: yes
args:
chdir: '{{ dest_dir }}/StampPDFBatch_60_L26_64'
become: yes
```
<!--- You can also paste gist.github.com links for larger files -->
##### EXPECTED RESULTS
play to work fine
##### ACTUAL RESULTS
```
fatal: [10.135.232.213]: FAILED! => {"changed": false, "failed": true,
"invocation": {"module_name": "expect"}, "module_stderr": "",
"module_stdout": "Traceback (most recent call last):\r\n File
\"/tmp/ansible_V_modk/ansible_module_expect.py\", line 230, in <module>\r\n
main()\r\n File \"/tmp/ansible_V_modk/ansible_module_expect.py\", line 199,
in main\r\n events=events, cwd=chdir, echo=echo)\r\n File
\"/usr/lib/python2.6/site-packages/pexpect/__init__.py\", line 225, in
runu\r\n env=env, _spawn=spawnu, **kwargs)\r\n File
\"/usr/lib/python2.6/site-packages/pexpect/__init__.py\", line 246, in
_run\r\n index = child.expect(patterns)\r\n File
\"/usr/lib/python2.6/site-packages/pexpect/__init__.py\", line 1451, in
expect\r\n timeout, searchwindowsize)\r\n File
\"/usr/lib/python2.6/site-packages/pexpect/__init__.py\", line 1466, in
expect_list\r\n timeout, searchwindowsize)\r\n File
\"/usr/lib/python2.6/site-packages/pexpect/__init__.py\", line 1535, in
expect_loop\r\n c = self.read_nonblocking(self.maxread, timeout)\r\n
File \"/usr/lib/python2.6/site-packages/pexpect/__init__.py\", line 984, in
read_nonblocking\r\n s = self._coerce_read_string(s)\r\n File
\"/usr/lib/python2.6/site-packages/pexpect/__init__.py\", line 1797, in
_coerce_read_string\r\n return self._decoder.decode(s, final=False)\r\n
File \"/usr/lib64/python2.6/codecs.py\", line 296, in decode\r\n
(result, consumed) = self._buffer_decode(data, self.errors,
final)\r\nUnicodeDecodeError: 'utf8' codec can't decode byte 0xd2 in
position 1166: invalid continuation byte\r\n", "msg": "MODULE FAILURE",
"parsed": false}
```
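The root cause in the traceback above: pexpect's unicode spawn decodes the child's output as UTF-8, and byte 0xd2 opens a 2-byte sequence that never completes ("invalid continuation byte"). A minimal reproduction of just the decode step (the byte string is illustrative, not the installer's real output; note that pexpect 4's `spawn` also accepts `codec_errors='replace'` to tolerate such bytes):

```python
# 0xd2 starts a 2-byte UTF-8 sequence; the following ASCII byte is not a
# valid continuation byte, so a strict decode raises UnicodeDecodeError.
raw = b"Registrati\xd2n"   # illustrative bytes only

try:
    raw.decode("utf-8")
except UnicodeDecodeError as exc:
    print("strict decode fails:", exc.reason)   # invalid continuation byte

# A lenient error handler replaces the bad byte instead of raising:
print(raw.decode("utf-8", errors="replace"))
```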
| True | Expect module not working module failure parsed false - <!--- Verify first that your issue/request is not already reported in GitHub -->
##### ISSUE TYPE
<!--- Pick one below and delete the rest: -->
- Bug Report
- expect module
##### COMPONENT NAME
Expect module
##### ANSIBLE VERSION
```
ansible 2.1.1.0
config file = /etc/ansible/ansible.cfg
configured module search path = Default w/o overrides
```
##### CONFIGURATION
source: RHEL 6.8 pexpect 4.2
target: RHEL 6.8 pexpect 4.2
##### OS / ENVIRONMENT
RHEL 6.8
##### SUMMARY
Module failure related to the python not sure
##### STEPS TO REPRODUCE
<!---
For bugs, show exactly how to reproduce the problem.
For new features, show how the feature would be used.
-->
<!--- Paste example playbooks or commands between quotes below -->
```
- name: configuring stamp pdf batch
expect:
command: ./install.sh
responses:
'This will install stamppdf on your system. Do you want to continue[y/N]?': y
'--More--': \r
'(?i)Do you agree with this copyright\? \[y/N\]': y
'AP_FONT_DIR [/usr/local/fonts]': y
'Now you must enter a valid registration number': '{{ stamp_pdf_key }}'
echo: yes
args:
chdir: '{{ dest_dir }}/StampPDFBatch_60_L26_64'
become: yes
```
<!--- You can also paste gist.github.com links for larger files -->
##### EXPECTED RESULTS
play to work fine
##### ACTUAL RESULTS
```
fatal: [10.135.232.213]: FAILED! => {"changed": false, "failed": true,
"invocation": {"module_name": "expect"}, "module_stderr": "",
"module_stdout": "Traceback (most recent call last):\r\n File
\"/tmp/ansible_V_modk/ansible_module_expect.py\", line 230, in <module>\r\n
main()\r\n File \"/tmp/ansible_V_modk/ansible_module_expect.py\", line 199,
in main\r\n events=events, cwd=chdir, echo=echo)\r\n File
\"/usr/lib/python2.6/site-packages/pexpect/__init__.py\", line 225, in
runu\r\n env=env, _spawn=spawnu, **kwargs)\r\n File
\"/usr/lib/python2.6/site-packages/pexpect/__init__.py\", line 246, in
_run\r\n index = child.expect(patterns)\r\n File
\"/usr/lib/python2.6/site-packages/pexpect/__init__.py\", line 1451, in
expect\r\n timeout, searchwindowsize)\r\n File
\"/usr/lib/python2.6/site-packages/pexpect/__init__.py\", line 1466, in
expect_list\r\n timeout, searchwindowsize)\r\n File
\"/usr/lib/python2.6/site-packages/pexpect/__init__.py\", line 1535, in
expect_loop\r\n c = self.read_nonblocking(self.maxread, timeout)\r\n
File \"/usr/lib/python2.6/site-packages/pexpect/__init__.py\", line 984, in
read_nonblocking\r\n s = self._coerce_read_string(s)\r\n File
\"/usr/lib/python2.6/site-packages/pexpect/__init__.py\", line 1797, in
_coerce_read_string\r\n return self._decoder.decode(s, final=False)\r\n
File \"/usr/lib64/python2.6/codecs.py\", line 296, in decode\r\n
(result, consumed) = self._buffer_decode(data, self.errors,
final)\r\nUnicodeDecodeError: 'utf8' codec can't decode byte 0xd2 in
position 1166: invalid continuation byte\r\n", "msg": "MODULE FAILURE",
"parsed": false}
```
| main | expect module not working module failure parsed false issue type bug report expect module component name expect module ansible version ansible config file etc ansible ansible cfg configured module search path default w o overrides configuration source rhel pexpect target rhel pexpect os environment rhel summary module failure related to the python not sure steps to reproduce for bugs show exactly how to reproduce the problem for new features show how the feature would be used name configuring stamp pdf batch expect command install sh responses this will install stamppdf on your system do you want to continue y more r i do you agree with this copyright y ap font dir y now you must enter a valid registration number stamp pdf key echo yes args chdir dest dir stamppdfbatch become yes expected results play to work fine actual results fatal failed changed false failed true invocation module name expect module stderr module stdout traceback most recent call last r n file tmp ansible v modk ansible module expect py line in r n main r n file tmp ansible v modk ansible module expect py line in main r n events events cwd chdir echo echo r n file usr lib site packages pexpect init py line in runu r n env env spawn spawnu kwargs r n file usr lib site packages pexpect init py line in run r n index child expect patterns r n file usr lib site packages pexpect init py line in expect r n timeout searchwindowsize r n file usr lib site packages pexpect init py line in expect list r n timeout searchwindowsize r n file usr lib site packages pexpect init py line in expect loop r n c self read nonblocking self maxread timeout r n file usr lib site packages pexpect init py line in read nonblocking r n s self coerce read string s r n file usr lib site packages pexpect init py line in coerce read string r n return self decoder decode s final false r n file usr codecs py line in decode r n result consumed self buffer decode data self errors final r nunicodedecodeerror codec can t decode byte in position invalid continuation byte r n msg module failure parsed false | 1 |
3,844 | 16,826,604,054 | IssuesEvent | 2021-06-17 19:30:06 | ansible/ansible | https://api.github.com/repos/ansible/ansible | closed | use ansible_docker with mounts options, always return "invalid character" on path | affects_2.9 bot_closed bug cloud deprecated docker module needs_maintainer support:community | ### Summary
Following this example: https://github.com/ansible/ansible/issues/67698
When using the mounts option, docker always returns a path error.
mounts:
- source: /home/centos
target: /tmp
FAILED! => {"changed": false, "msg": "Error creating container: 400 Client Error: Bad Request (\"create /home/centos: \"/home/centos\" includes invalid characters for a local volume name, only \"[a-zA-Z0-9][a-zA-Z0-9_.-]\" are allowed. If you intended to pass a host directory, use absolute path\")"}
### Issue Type
Bug Report
### Component Name
docker
### Ansible Version
```console
$ ansible --version
ansible 2.9.21
config file = /root/ansible/ansible.cfg
configured module search path = [u'/root/.ansible/plugins/modules', u'/usr/share/ansible/plugins/modules']
ansible python module location = /usr/lib/python2.7/site-packages/ansible
executable location = /bin/ansible
python version = 2.7.5 (default, Aug 7 2019, 00:51:29) [GCC 4.8.5 20150623 (Red Hat 4.8.5-39)]
But:
using: ansible_python_interpreter=/usr/bin/python3
python36-docker-pycreds-0.2.1-2.el7.noarch
python36-docker-2.6.1-3.el7.noarch
```
### Configuration
```console
$ ANSIBLE_SSH_ARGS(/root/ansible/ansible.cfg) = -F ./ssh.cfg -o ControlMaster=auto -o ControlPersist=30m
ANSIBLE_SSH_CONTROL_PATH(/root/ansible/ansible.cfg) = ~/.ssh/ansible-%%r@%%h:%%p
HOST_KEY_CHECKING(/root/ansible/ansible.cfg) = False
```
### OS / Environment
CentOS Linux release 7.7.1908 (Core)
### Steps to Reproduce
install ansible / python 3 by yum, and set inventory with python3 executable path
and run playbook such as under
---
- name: simple
hosts: some.machine
become: yes
tasks:
#
# web server
#
- name: Start simple nginx docker container
docker_container:
name: nginx
image: nginx
state: started
mounts:
- source: /home/centos
target: /tmp
### Expected Results
container runs, and the directory will be mapped
tested with many paths
### Actual Results
```console
FAILED! => {"changed": false, "msg": "Error creating container: 400 Client Error: Bad Request (\"create /home/centos: \"/home/centos\" includes invalid characters for a local volume name, only \"[a-zA-Z0-9][a-zA-Z0-9_.-]\" are allowed. If you intended to pass a host directory, use absolute path\")"}
```
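A likely explanation, judging from the error text above: without an explicit `type: bind`, the mounts entry is treated as a *named volume*, and volume names must match the character class quoted in the message; absolute host paths can never satisfy it. A small sketch of that distinction (the regex is taken from the error message itself; the usual fix is to declare `type: bind` on the mount, or to use the older `volumes:` syntax):

```python
import re

# Character class quoted in the daemon's error message for local volume names.
VOLUME_NAME = re.compile(r"^[a-zA-Z0-9][a-zA-Z0-9_.-]*$")

def mount_source_kind(source):
    """Classify a mounts 'source' the way the error message suggests."""
    if source.startswith("/"):
        return "host path (needs type: bind)"
    if VOLUME_NAME.match(source):
        return "named volume"
    return "invalid"

print(mount_source_kind("/home/centos"))  # host path (needs type: bind)
print(mount_source_kind("mydata"))        # named volume
```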
### Code of Conduct
- [X] I agree to follow the Ansible Code of Conduct | True | use ansible_docker with mounts options, always return "invalid character" on path - ### Summary
Following this example: https://github.com/ansible/ansible/issues/67698
When using the mounts option, docker always returns a path error.
mounts:
- source: /home/centos
target: /tmp
FAILED! => {"changed": false, "msg": "Error creating container: 400 Client Error: Bad Request (\"create /home/centos: \"/home/centos\" includes invalid characters for a local volume name, only \"[a-zA-Z0-9][a-zA-Z0-9_.-]\" are allowed. If you intended to pass a host directory, use absolute path\")"}
### Issue Type
Bug Report
### Component Name
docker
### Ansible Version
```console
$ ansible --version
ansible 2.9.21
config file = /root/ansible/ansible.cfg
configured module search path = [u'/root/.ansible/plugins/modules', u'/usr/share/ansible/plugins/modules']
ansible python module location = /usr/lib/python2.7/site-packages/ansible
executable location = /bin/ansible
python version = 2.7.5 (default, Aug 7 2019, 00:51:29) [GCC 4.8.5 20150623 (Red Hat 4.8.5-39)]
But:
using: ansible_python_interpreter=/usr/bin/python3
python36-docker-pycreds-0.2.1-2.el7.noarch
python36-docker-2.6.1-3.el7.noarch
```
### Configuration
```console
$ ANSIBLE_SSH_ARGS(/root/ansible/ansible.cfg) = -F ./ssh.cfg -o ControlMaster=auto -o ControlPersist=30m
ANSIBLE_SSH_CONTROL_PATH(/root/ansible/ansible.cfg) = ~/.ssh/ansible-%%r@%%h:%%p
HOST_KEY_CHECKING(/root/ansible/ansible.cfg) = False
```
### OS / Environment
CentOS Linux release 7.7.1908 (Core)
### Steps to Reproduce
install ansible / python 3 by yum, and set inventory with python3 executable path
and run playbook such as under
---
- name: simple
hosts: some.machine
become: yes
tasks:
#
# web server
#
- name: Start simple nginx docker container
docker_container:
name: nginx
image: nginx
state: started
mounts:
- source: /home/centos
target: /tmp
### Expected Results
container runs, and the directory will be mapped
tested with many paths
### Actual Results
```console
FAILED! => {"changed": false, "msg": "Error creating container: 400 Client Error: Bad Request (\"create /home/centos: \"/home/centos\" includes invalid characters for a local volume name, only \"[a-zA-Z0-9][a-zA-Z0-9_.-]\" are allowed. If you intended to pass a host directory, use absolute path\")"}
```
### Code of Conduct
- [X] I agree to follow the Ansible Code of Conduct | main | use ansible docker with mounts options always return invalid character on path summary following this exemple when use mounts option docker always return path trouble mounts source home centos target tmp failed changed false msg error creating container client error bad request create home centos home centos includes invalid characters for a local volume name only are allowed if you intended to pass a host directory use absolute path issue type bug report component name docker ansible version console ansible version ansible config file root ansible ansible cfg configured module search path ansible python module location usr lib site packages ansible executable location bin ansible python version default aug but using ansible python interpreter usr bin docker pycreds noarch docker noarch configuration console ansible ssh args root ansible ansible cfg f ssh cfg o controlmaster auto o controlpersist ansible ssh control path root ansible ansible cfg ssh ansible r h p host key checking root ansible ansible cfg false os environment centos linux release core steps to reproduce install ansible python by yum and set invetory with executable path and run playbook such as under name simple hosts some machine become yes tasks web server name start simple nginx docker container docker container name nginx image nginx state started mounts source home centos target tmp expected results container run and directory will mapped tested with many paths actual results console failed changed false msg error creating container client error bad request create home centos home centos includes invalid characters for a local volume name only are allowed if you intended to pass a host directory use absolute path code of conduct i agree to follow the ansible code of conduct | 1 |
61,969 | 17,023,821,591 | IssuesEvent | 2021-07-03 04:01:56 | tomhughes/trac-tickets | https://api.github.com/repos/tomhughes/trac-tickets | closed | OpenID login failure | Component: website Priority: minor Resolution: fixed Type: defect | **[Submitted to the original trac issue database at 10.47pm, Monday, 10th September 2012]**
I have previously logged into osm.org using an OpenID account provided by launchpad.net. At present, however, this fails with the error:
"Sorry, could not log in with those details."
Login with my username / password works as expected. | 1.0 | OpenID login failure - **[Submitted to the original trac issue database at 10.47pm, Monday, 10th September 2012]**
I have previously logged into osm.org using an OpenID account provided by launchpad.net. At present, however, this fails with the error:
"Sorry, could not log in with those details."
Login with my username / password works as expected. | non_main | openid login failure i have previously logged into osm org using an openid account provided by launchpad net at present however this fails with the error sorry could not log in with those details login with my username password works as expected | 0 |
654 | 4,165,953,723 | IssuesEvent | 2016-06-19 20:56:40 | caskroom/homebrew-cask | https://api.github.com/repos/caskroom/homebrew-cask | closed | Verify sling `url` (need maintainer in the US) | awaiting maintainer feedback cask | As part of https://github.com/caskroom/homebrew-cask/issues/19450, I’m trying to verify [`sling`](https://github.com/caskroom/homebrew-cask/blob/master/Casks/sling.rb).
However, I can’t:

Need a maintainer in the US to please do that. | True | Verify sling `url` (need maintainer in the US) - As part of https://github.com/caskroom/homebrew-cask/issues/19450, I’m trying to verify [`sling`](https://github.com/caskroom/homebrew-cask/blob/master/Casks/sling.rb).
However, I can’t:

Need a maintainer in the US to please do that. | main | verify sling url need maintainer in the us as part of i’m trying to verify however i can’t need a maintainer in the us to please do that | 1 |
2,221 | 7,837,794,730 | IssuesEvent | 2018-06-18 07:56:16 | Subsurface-divelog/subsurface | https://api.github.com/repos/Subsurface-divelog/subsurface | closed | Confused about SAC rate answer | dive-computer dive-details needs-maintainer-feedback possible-bug | <!-- Lines like this one are comments and will not be shown in the final output. -->
<!-- If you are a collaborator, please add labels and assign other collaborators for a review. -->
### Describe the issue:
<!-- Replace [ ] with [x] to select options. -->
- [x] Bug?
- [ ] Change request
- [ ] New feature request
- [ ] Discussion request
### Issue long description:
I've recently gotten a Perdix AI and imported my first dive, and the SAC value doesn't match what I've seen from another application that I imported the same dive to... Based on the start and end pressure numbers from the equipment tab I get this calculation: 3130-1648 = 1482 PSI consumed; 1482 / 25 minutes = 59.28 PSI/min; Average Depth = 2 ATA, so 59.28/2 = 29.64 PSI/min @ 1 ATA; Working pressure = 3442 PSI; 100/3442 = .02905 * 29.64 = 0.861 cuft/min.
This value matches the result returned by the other application. Yet Subsurface returns 0.77cuft/min.
Also I find it strange that the starting and ending pressures in the graph are 3156 and 1644 as opposed to the 3130-1648 shown in the equipment tab.
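The reporter's arithmetic can be restated as a small function to check the figure (units assumed from the description: psi, minutes, ATA, and a 100 cuft tank at 3442 psi working pressure):

```python
def sac_cuft_per_min(start_psi, end_psi, minutes, avg_ata,
                     tank_cuft=100.0, working_psi=3442.0):
    """Surface air consumption in cuft/min from the numbers quoted above."""
    surface_psi_per_min = (start_psi - end_psi) / minutes / avg_ata
    return surface_psi_per_min * tank_cuft / working_psi

print(round(sac_cuft_per_min(3130, 1648, 25, 2), 3))  # 0.861
```

This reproduces the 0.861 cuft/min figure from the calculation, not the 0.77 cuft/min that Subsurface displays.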
### Operating system:
<!-- What OS are you running, including OS version and the language that you are running in -->
MacOS 10.13
<!-- What device are you using? -->
Apple MBP
<!-- Only answer this question if you have tried: Does the same happen on another OS? -->
Yes, shows same value in Linux, Windows, Android and MacOS.
### Subsurface version:
<!-- What version of Subsurface are you running? -->
4.7.8
<!-- Does the same happen on another Subsurface version? -->
<!-- Are you using official release, test build, or compiled yourself? -->
<!-- Provide Git hash if your are building Subsurface yourself. -->
### Steps to reproduce:
<!-- Provide reproduction steps separated with new lines - 1), 2), 3)... -->
### Current behavior:
<!-- What is the current behavior? -->
0.77cuft/min
### Expected behavior:
<!-- What is the expected behavior? -->
0.86cuft/min
### Additional information:
<!-- If a simple dive log file can reproduce the issue, please attach that to the report. -->
<!-- With dive computer download issues consider adding Subsurface log file and Subsurface dumpfile to the report. -->
### Mentions:
<!-- Mention users that you want to review your issue with @<user-name>. Leave empty if not sure. -->
| True | Confused about SAC rate answer - <!-- Lines like this one are comments and will not be shown in the final output. -->
<!-- If you are a collaborator, please add labels and assign other collaborators for a review. -->
### Describe the issue:
<!-- Replace [ ] with [x] to select options. -->
- [x] Bug?
- [ ] Change request
- [ ] New feature request
- [ ] Discussion request
### Issue long description:
I've recently gotten a Perdix AI and imported my first dive, and the SAC value doesn't match what I've seen from another application that I imported the same dive to. Based on the start and end pressure numbers from the equipment tab I get this calculation: 3130-1648 = 1482 PSI consumed; 1482/25 minutes = 59.28 PSI/min; average depth = 2 ATA, so 59.28/2 = 29.64 PSI/min @ 1 ATA; working pressure = 3442 PSI; 100/3442 = .02905, and .02905 * 29.64 = 0.861 cuft/min.
This value matches the result returned by the other application. Yet Subsurface returns 0.77cuft/min.
Also I find it strange that the starting and ending pressures in the graph are 3156 and 1644 as opposed to the 3130-1648 shown in the equipment tab.
### Operating system:
<!-- What OS are you running, including OS version and the language that you are running in -->
MacOS 10.13
<!-- What device are you using? -->
Apple MBP
<!-- Only answer this question if you have tried: Does the same happen on another OS? -->
Yes, shows same value in Linux, Windows, Android and MacOS.
### Subsurface version:
<!-- What version of Subsurface are you running? -->
4.7.8
<!-- Does the same happen on another Subsurface version? -->
<!-- Are you using official release, test build, or compiled yourself? -->
<!-- Provide Git hash if your are building Subsurface yourself. -->
### Steps to reproduce:
<!-- Provide reproduction steps separated with new lines - 1), 2), 3)... -->
### Current behavior:
<!-- What is the current behavior? -->
0.77cuft/min
### Expected behavior:
<!-- What is the expected behavior? -->
0.86cuft/min
### Additional information:
<!-- If a simple dive log file can reproduce the issue, please attach that to the report. -->
<!-- With dive computer download issues consider adding Subsurface log file and Subsurface dumpfile to the report. -->
### Mentions:
<!-- Mention users that you want to review your issue with @<user-name>. Leave empty if not sure. -->
| main | confused about sac rate answer describe the issue bug change request new feature request discussion request issue long description i ve recently gotten a perdix ai and imported my first dive and the sac value doesn t match that of what i ve seen from another application that i imported the same dive to based on the start and end pressure numbers from the equipment tab i get this calculation psi consumed min average depth so min working pressure min this value matches the result returned by the other application yet subsurface returns min also i find it strange that the starting and ending pressures in the graph are and as opposed to the shown in the equipment tab operating system macos apple mbp yes shows same value in linux windows android and macos subsurface version steps to reproduce current behavior min expected behavior min additional information mentions leave empty if not sure | 1 |
4,556 | 23,726,783,832 | IssuesEvent | 2022-08-30 20:25:42 | aws/aws-sam-cli | https://api.github.com/repos/aws/aws-sam-cli | closed | API gateway debugger attach refused | stage/needs-investigation area/cdk maintainer/need-followup | <!-- Make sure we don't have an existing Issue that reports the bug you are seeing (both open and closed).
If you do find an existing Issue, re-open or add a comment to that Issue instead of creating a new one. -->
### Description:
I have defined lambda functions in cdk(python) stack using the _lambda.DockerImageFunction construct. This lambda function is fronted by an api-gateway.
I'm able to invoke this function locally via the API gateway. However, when I try this in debug mode I get an error saying:
"Visual Studio Code
connect ECONNREFUSED 127.0.0.1:5858 "
### Steps to reproduce:
1/ clone this public repo: https://github.com/johanbjorn/helloworld
2/ cd helloworld
3/ code . (to start vscode)
4/ python3 -m venv .venv (from bash terminal in vscode, from {workspace} folder)
5/ source .venv/bin/activate
6/ python3 -m pip install -r requirements.txt
7/ sam-beta-cdk build --use-container
8/ sam-beta-cdk local start-api --debug-port 5858
9/ optional non-debug call to api: curl http://127.0.0.1:3000/hello
10/ start debugging: debug icon in the leftmost vertical menu -> select 'Debug(via API GW) hello world' from dropdown
11/ click play icon
12/ error message
### Observed result:
vscode dialog box saying:
"Visual Studio Code
connect ECONNREFUSED 127.0.0.1:5858 "
### Expected result:
Debugger should attach to process and start debugging session
### Additional environment details (Ex: Windows, Mac, Amazon Linux etc)
1. OS: Ubuntu: 20.04 via WSL 2
2. `sam --version`: sam-beta-cdk --version : SAM CLI, version 1.29.0.dev202108311500
3. AWS region: eu-central-1
VSCode: 1.61.2
Python: 3.8.10
node -v: v16.3.0
npm -v: 8.1.0
Docker Desktop 4.1.1 (69879)
[System.Environment]::OSVersion.Version
Major Minor Build Revision
10 0 19042 0
| True | API gateway debugger attach refused - <!-- Make sure we don't have an existing Issue that reports the bug you are seeing (both open and closed).
If you do find an existing Issue, re-open or add a comment to that Issue instead of creating a new one. -->
### Description:
I have defined lambda functions in cdk(python) stack using the _lambda.DockerImageFunction construct. This lambda function is fronted by an api-gateway.
I'm able to invoke this function locally via the API gateway. However, when I try this in debug mode I get an error saying:
"Visual Studio Code
connect ECONNREFUSED 127.0.0.1:5858 "
### Steps to reproduce:
1/ clone this public repo: https://github.com/johanbjorn/helloworld
2/ cd helloworld
3/ code . (to start vscode)
4/ python3 -m venv .venv (from bash terminal in vscode, from {workspace} folder)
5/ source .venv/bin/activate
6/ python3 -m pip install -r requirements.txt
7/ sam-beta-cdk build --use-container
8/ sam-beta-cdk local start-api --debug-port 5858
9/ optional non-debug call to api: curl http://127.0.0.1:3000/hello
10/ start debugging: debug icon in the leftmost vertical menu -> select 'Debug(via API GW) hello world' from dropdown
11/ click play icon
12/ error message
### Observed result:
vscode dialog box saying:
"Visual Studio Code
connect ECONNREFUSED 127.0.0.1:5858 "
### Expected result:
Debugger should attach to process and start debugging session
### Additional environment details (Ex: Windows, Mac, Amazon Linux etc)
1. OS: Ubuntu: 20.04 via WSL 2
2. `sam --version`: sam-beta-cdk --version : SAM CLI, version 1.29.0.dev202108311500
3. AWS region: eu-central-1
VSCode: 1.61.2
Python: 3.8.10
node -v: v16.3.0
npm -v: 8.1.0
Docker Desktop 4.1.1 (69879)
[System.Environment]::OSVersion.Version
Major Minor Build Revision
10 0 19042 0
| main | api gateway debugger attach refused make sure we don t have an existing issue that reports the bug you are seeing both open and closed if you do find an existing issue re open or add a comment to that issue instead of creating a new one description i have defined lambda functions in cdk python stack using the lambda dockerimagefunction construct this lambda function is fronted by an api gateway i m able to invoke this function locally via the api gateway however when i try this in debug model i get an error saying visual studio code connect econnrefused steps to reproduce clone this public repo cd helloworld code to start vscode m venv venv from bash terminal in vscode from workspace folder source venv bin activate m pip install r requirements txt sam beta cdk build use container sam beta cdk local start api debug port optioal non debug call to api curl start debugging debug icon leftmost vertical menu select debug via api gw hello world from dropdown click play icon error message observed result vscode diaglog box saying visual studio code connect econnrefused expected result debugger should attach to process and start debugging session additional environment details ex windows mac amazon linux etc os ubuntu via wsl sam version sam beta cdk version sam cli version aws region eu central vscode python node v npm v docker desktop osversion version major minor build revision | 1 |
398,161 | 11,738,539,213 | IssuesEvent | 2020-03-11 16:13:55 | MilyMilo/educat | https://api.github.com/repos/MilyMilo/educat | closed | Fix users list width | good first issue priority: low scope: front-end type: bug | The table should span the full width of the container. Currently it looks like this:

| 1.0 | Fix users list width - The table should span the full width of the container. Currently it looks like this:

| non_main | fix users list width the table should span the full width of the container currently it looks like this | 0
3,691 | 15,057,754,820 | IssuesEvent | 2021-02-03 22:12:18 | IITIDIDX597/sp_2021_team2 | https://api.github.com/repos/IITIDIDX597/sp_2021_team2 | opened | Supplier logistics | Maintaining Inventory Story | As a supplier
I want one daily (frequency tbd) delivery to customer
So that I am not wasting resources by shipping several individual items in same time frame. | True | Supplier logistics - As a supplier
I want one daily (frequency tbd) delivery to customer
So that I am not wasting resources by shipping several individual items in same time frame. | main | supplier logistics as a supplier i want one daily frequency tbd delivery to customer so that i am not wasting resources by shipping several individual items in same time frame | 1 |
5,849 | 31,154,081,575 | IssuesEvent | 2023-08-16 12:02:29 | centerofci/mathesar | https://api.github.com/repos/centerofci/mathesar | opened | Release 0.1.3 | restricted: maintainers status: started work: release | ## 2023-08-16
```[tasklist]
### Tasks
- [ ] Cut 0.1.3 release branch, freeze code
- [ ] Update version number in all places in the new branch
- [ ] Make an image from the branch with tag `0.1.3`, push to Dockerhub
- [ ] Test installation with the new image
- [ ] Test upgrade
```
| True | Release 0.1.3 - ## 2023-08-16
```[tasklist]
### Tasks
- [ ] Cut 0.1.3 release branch, freeze code
- [ ] Update version number in all places in the new branch
- [ ] Make an image from the branch with tag `0.1.3`, push to Dockerhub
- [ ] Test installation with the new image
- [ ] Test upgrade
```
| main | release tasks cut release branch freeze code update version number in all places in the new branch make an image from the branch with tag push to dockerhub test installation with the new image test upgrade | 1 |
442,819 | 12,751,147,221 | IssuesEvent | 2020-06-27 09:13:08 | IlchCMS/Ilch-2.0 | https://api.github.com/repos/IlchCMS/Ilch-2.0 | closed | Ereignisse: "Cannot declare class" | Priority: High Status: Completed Type: Bug | http://www.ilch.de/forum-showposts-53996-p1.html#391916
> Cannot declare class Modules\Article\Config\Config, because the name is already in use in Ilch-v2.1.2\application\modules\article\config\config.php on line 132
1. Ilch 2.1.2 installed
2. DiscordNotifier installed and configured
3. Update to 2.1.3 attempted (error).
http://www.ilch.de/forum-showposts-54176-p1.html#391879
https://github.com/IlchCMS/Ilch-2.0/blob/v2.1.4/application/libraries/Ilch/Transfer.php#L377 | 1.0 | Ereignisse: "Cannot declare class" - http://www.ilch.de/forum-showposts-53996-p1.html#391916
> Cannot declare class Modules\Article\Config\Config, because the name is already in use in Ilch-v2.1.2\application\modules\article\config\config.php on line 132
1. Ilch 2.1.2 installed
2. DiscordNotifier installed and configured
3. Update to 2.1.3 attempted (error).
http://www.ilch.de/forum-showposts-54176-p1.html#391879
https://github.com/IlchCMS/Ilch-2.0/blob/v2.1.4/application/libraries/Ilch/Transfer.php#L377 | non_main | ereignisse cannot declare class cannot declare class modules article config config because the name is already in use in ilch application modules article config config php on line ilch installed discordnotifier installed and configured update to attempted error | 0
873 | 2,651,336,317 | IssuesEvent | 2015-03-16 10:42:06 | ivankravets/platformio | https://api.github.com/repos/ivankravets/platformio | closed | Serial doesn't work for mbed framework | bug builder | Example
```c
#include "mbed.h"
Serial pc(USBTX, USBRX); // tx, rx
int main() {
pc.printf("Hello World!\n\r");
while(1) {
pc.putc(pc.getc() + 1); // echo each input character back, incremented by one
}
}
```
```shell
(develop)➜ mbed-blink git:(develop) ✗ platformio run -e nucleo_f401re -t upload
[Fri Mar 13 23:04:44 2015] Processing nucleo_f401re (platform: ststm32, board: nucleo_f401re, framework: mbed)
----------------------------------------------------------------------------------------------------------------------------------------------------------------
arm-none-eabi-g++ -o .pioenvs/nucleo_f401re/src/main.o -c -std=gnu++98 -fno-rtti -fdata-sections -ffunction-sections -Wno-unused-parameter -fno-exceptions -Wextra -fno-delete-null-pointer-checks -fmessage-length=0 -mthumb -Wno-missing-field-initializers -c -fno-builtin -O2 -mfpu=fpv4-sp-d16 -fomit-frame-pointer -Wall -mfloat-abi=softfp -MMD -mcpu=cortex-m4 -DTARGET_NUCLEO_F401RE -DTARGET_M4 -DTARGET_CORTEX_M -DTARGET_STM -DTARGET_STM32F4 -DTARGET_STM32F401RE -DTOOLCHAIN_GCC_ARM -DTOOLCHAIN_GCC -D__CORTEX_M4 -DARM_MATH_CM4 -D__FPU_PRESENT=1 -DMBED_BUILD_TIMESTAMP=1426262453.27 -D__MBED__=1 -DTARGET_FF_ARDUINO -DTARGET_FF_MORPHO -DPLATFORMIO=010200 -I.pioenvs/nucleo_f401re/FrameworkMbedInc248832578 -I.pioenvs/nucleo_f401re/FrameworkMbedInc-213564679 -I.pioenvs/nucleo_f401re/FrameworkMbedInc-250686161 -I.pioenvs/nucleo_f401re/FrameworkMbedInc1266984661 -I.pioenvs/nucleo_f401re/FrameworkMbedInc2097404638 -I.pioenvs/nucleo_f401re/FrameworkMbedInc-1116590620 -I.pioenvs/nucleo_f401re/FrameworkMbedInc170856846 -I.pioenvs/nucleo_f401re/FrameworkMbedInc175333816 .pioenvs/nucleo_f401re/src/main.cpp
arm-none-eabi-ar rcs .pioenvs/nucleo_f401re/libFrameworkMbed.a @/var/folders/nj/nt38kxwd2xxfl5l42np_w7qw0000gn/T/pioarargs-f8d149b23e80c5a2dfe687bd465da31f
arm-none-eabi-ranlib .pioenvs/nucleo_f401re/libFrameworkMbed.a
arm-none-eabi-g++ -o .pioenvs/nucleo_f401re/firmware.elf -Wl,--gc-sections -Wl,--wrap,main -mcpu=cortex-m4 -mthumb -mfpu=fpv4-sp-d16 -mfloat-abi=softfp --specs=nano.specs -T /Users/ikravets/.platformio/packages/framework-mbed/variant/NUCLEO_F401RE/mbed/TARGET_NUCLEO_F401RE/TOOLCHAIN_GCC_ARM/STM32F401XE.ld .pioenvs/nucleo_f401re/src/main.o -L/Users/ikravets/.platformio/packages/framework-mbed/variant/NUCLEO_F401RE/mbed/TARGET_NUCLEO_F401RE/TOOLCHAIN_GCC_ARM -L.pioenvs/nucleo_f401re -Wl,--start-group -lc -lgcc -lm -lmbed -lstdc++ -lsupc++ -lnosys .pioenvs/nucleo_f401re/libFrameworkMbed.a -Wl,--end-group
.pioenvs/nucleo_f401re/libFrameworkMbed.a(retarget.o): In function `__gnu_cxx::__verbose_terminate_handler()':
retarget.cpp:(.text._ZN9__gnu_cxx27__verbose_terminate_handlerEv+0x0): multiple definition of `__gnu_cxx::__verbose_terminate_handler()'
/Users/ikravets/.platformio/packages/toolchain-gccarmnoneeabi/bin/../lib/gcc/arm-none-eabi/4.8.3/../../../../arm-none-eabi/lib/armv7e-m/softfp/libstdc++_s.a(vterminate.o):vterminate.cc:(.text._ZN9__gnu_cxx27__verbose_terminate_handlerEv+0x0): first defined here
/Users/ikravets/.platformio/packages/toolchain-gccarmnoneeabi/bin/../lib/gcc/arm-none-eabi/4.8.3/../../../../arm-none-eabi/lib/armv7e-m/softfp/libc_s.a(lib_a-signalr.o): In function `_kill_r':
signalr.c:(.text._kill_r+0xe): undefined reference to `_kill'
/Users/ikravets/.platformio/packages/toolchain-gccarmnoneeabi/bin/../lib/gcc/arm-none-eabi/4.8.3/../../../../arm-none-eabi/lib/armv7e-m/softfp/libc_s.a(lib_a-signalr.o): In function `_getpid_r':
signalr.c:(.text._getpid_r+0x0): undefined reference to `_getpid'
collect2: error: ld returned 1 exit status
scons: *** [.pioenvs/nucleo_f401re/firmware.elf] Error 1
================================================================= [ ERROR ] Took 1.35 seconds =================================================================
``` | 1.0 | Serial doesn't work for mbed framework - Example
```c
#include "mbed.h"
Serial pc(USBTX, USBRX); // tx, rx
int main() {
pc.printf("Hello World!\n\r");
while(1) {
pc.putc(pc.getc() + 1); // echo each input character back, incremented by one
}
}
```
```shell
(develop)➜ mbed-blink git:(develop) ✗ platformio run -e nucleo_f401re -t upload
[Fri Mar 13 23:04:44 2015] Processing nucleo_f401re (platform: ststm32, board: nucleo_f401re, framework: mbed)
----------------------------------------------------------------------------------------------------------------------------------------------------------------
arm-none-eabi-g++ -o .pioenvs/nucleo_f401re/src/main.o -c -std=gnu++98 -fno-rtti -fdata-sections -ffunction-sections -Wno-unused-parameter -fno-exceptions -Wextra -fno-delete-null-pointer-checks -fmessage-length=0 -mthumb -Wno-missing-field-initializers -c -fno-builtin -O2 -mfpu=fpv4-sp-d16 -fomit-frame-pointer -Wall -mfloat-abi=softfp -MMD -mcpu=cortex-m4 -DTARGET_NUCLEO_F401RE -DTARGET_M4 -DTARGET_CORTEX_M -DTARGET_STM -DTARGET_STM32F4 -DTARGET_STM32F401RE -DTOOLCHAIN_GCC_ARM -DTOOLCHAIN_GCC -D__CORTEX_M4 -DARM_MATH_CM4 -D__FPU_PRESENT=1 -DMBED_BUILD_TIMESTAMP=1426262453.27 -D__MBED__=1 -DTARGET_FF_ARDUINO -DTARGET_FF_MORPHO -DPLATFORMIO=010200 -I.pioenvs/nucleo_f401re/FrameworkMbedInc248832578 -I.pioenvs/nucleo_f401re/FrameworkMbedInc-213564679 -I.pioenvs/nucleo_f401re/FrameworkMbedInc-250686161 -I.pioenvs/nucleo_f401re/FrameworkMbedInc1266984661 -I.pioenvs/nucleo_f401re/FrameworkMbedInc2097404638 -I.pioenvs/nucleo_f401re/FrameworkMbedInc-1116590620 -I.pioenvs/nucleo_f401re/FrameworkMbedInc170856846 -I.pioenvs/nucleo_f401re/FrameworkMbedInc175333816 .pioenvs/nucleo_f401re/src/main.cpp
arm-none-eabi-ar rcs .pioenvs/nucleo_f401re/libFrameworkMbed.a @/var/folders/nj/nt38kxwd2xxfl5l42np_w7qw0000gn/T/pioarargs-f8d149b23e80c5a2dfe687bd465da31f
arm-none-eabi-ranlib .pioenvs/nucleo_f401re/libFrameworkMbed.a
arm-none-eabi-g++ -o .pioenvs/nucleo_f401re/firmware.elf -Wl,--gc-sections -Wl,--wrap,main -mcpu=cortex-m4 -mthumb -mfpu=fpv4-sp-d16 -mfloat-abi=softfp --specs=nano.specs -T /Users/ikravets/.platformio/packages/framework-mbed/variant/NUCLEO_F401RE/mbed/TARGET_NUCLEO_F401RE/TOOLCHAIN_GCC_ARM/STM32F401XE.ld .pioenvs/nucleo_f401re/src/main.o -L/Users/ikravets/.platformio/packages/framework-mbed/variant/NUCLEO_F401RE/mbed/TARGET_NUCLEO_F401RE/TOOLCHAIN_GCC_ARM -L.pioenvs/nucleo_f401re -Wl,--start-group -lc -lgcc -lm -lmbed -lstdc++ -lsupc++ -lnosys .pioenvs/nucleo_f401re/libFrameworkMbed.a -Wl,--end-group
.pioenvs/nucleo_f401re/libFrameworkMbed.a(retarget.o): In function `__gnu_cxx::__verbose_terminate_handler()':
retarget.cpp:(.text._ZN9__gnu_cxx27__verbose_terminate_handlerEv+0x0): multiple definition of `__gnu_cxx::__verbose_terminate_handler()'
/Users/ikravets/.platformio/packages/toolchain-gccarmnoneeabi/bin/../lib/gcc/arm-none-eabi/4.8.3/../../../../arm-none-eabi/lib/armv7e-m/softfp/libstdc++_s.a(vterminate.o):vterminate.cc:(.text._ZN9__gnu_cxx27__verbose_terminate_handlerEv+0x0): first defined here
/Users/ikravets/.platformio/packages/toolchain-gccarmnoneeabi/bin/../lib/gcc/arm-none-eabi/4.8.3/../../../../arm-none-eabi/lib/armv7e-m/softfp/libc_s.a(lib_a-signalr.o): In function `_kill_r':
signalr.c:(.text._kill_r+0xe): undefined reference to `_kill'
/Users/ikravets/.platformio/packages/toolchain-gccarmnoneeabi/bin/../lib/gcc/arm-none-eabi/4.8.3/../../../../arm-none-eabi/lib/armv7e-m/softfp/libc_s.a(lib_a-signalr.o): In function `_getpid_r':
signalr.c:(.text._getpid_r+0x0): undefined reference to `_getpid'
collect2: error: ld returned 1 exit status
scons: *** [.pioenvs/nucleo_f401re/firmware.elf] Error 1
================================================================= [ ERROR ] Took 1.35 seconds =================================================================
``` | non_main | serial doesn t work for mbed framework example c include mbed h serial pc usbtx usbrx tx rx int main pc printf hello world n r while pc putc pc getc echo input back to terminal shell develop ➜ mbed blink git develop ✗ platformio run e nucleo t upload processing nucleo platform board nucleo framework mbed arm none eabi g o pioenvs nucleo src main o c std gnu fno rtti fdata sections ffunction sections wno unused parameter fno exceptions wextra fno delete null pointer checks fmessage length mthumb wno missing field initializers c fno builtin mfpu sp fomit frame pointer wall mfloat abi softfp mmd mcpu cortex dtarget nucleo dtarget dtarget cortex m dtarget stm dtarget dtarget dtoolchain gcc arm dtoolchain gcc d cortex darm math d fpu present dmbed build timestamp d mbed dtarget ff arduino dtarget ff morpho dplatformio i pioenvs nucleo i pioenvs nucleo frameworkmbedinc i pioenvs nucleo frameworkmbedinc i pioenvs nucleo i pioenvs nucleo i pioenvs nucleo frameworkmbedinc i pioenvs nucleo i pioenvs nucleo pioenvs nucleo src main cpp arm none eabi ar rcs pioenvs nucleo libframeworkmbed a var folders nj t pioarargs arm none eabi ranlib pioenvs nucleo libframeworkmbed a arm none eabi g o pioenvs nucleo firmware elf wl gc sections wl wrap main mcpu cortex mthumb mfpu sp mfloat abi softfp specs nano specs t users ikravets platformio packages framework mbed variant nucleo mbed target nucleo toolchain gcc arm ld pioenvs nucleo src main o l users ikravets platformio packages framework mbed variant nucleo mbed target nucleo toolchain gcc arm l pioenvs nucleo wl start group lc lgcc lm lmbed lstdc lsupc lnosys pioenvs nucleo libframeworkmbed a wl end group pioenvs nucleo libframeworkmbed a retarget o in function gnu cxx verbose terminate handler retarget cpp text gnu verbose terminate handlerev multiple definition of gnu cxx verbose terminate handler users ikravets platformio packages toolchain gccarmnoneeabi bin lib gcc arm none eabi arm none eabi lib m softfp 
libstdc s a vterminate o vterminate cc text gnu verbose terminate handlerev first defined here users ikravets platformio packages toolchain gccarmnoneeabi bin lib gcc arm none eabi arm none eabi lib m softfp libc s a lib a signalr o in function kill r signalr c text kill r undefined reference to kill users ikravets platformio packages toolchain gccarmnoneeabi bin lib gcc arm none eabi arm none eabi lib m softfp libc s a lib a signalr o in function getpid r signalr c text getpid r undefined reference to getpid error ld returned exit status scons error took seconds | 0 |
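For context on the two `undefined reference` lines in the linker output above: newlib's signal support (`lib_a-signalr.o`) expects the platform to supply `_kill` and `_getpid` syscall stubs. A minimal sketch of such stubs follows — this is a common bare-metal workaround in general, not a confirmed fix for this particular PlatformIO/mbed link setup, and it does not address the separate `__verbose_terminate_handler` multiple-definition error, which is more likely a link-order/specs issue:

```c
#include <errno.h>

/* Minimal newlib syscall stubs: on bare metal there are no processes or
 * signals, so _kill fails with EINVAL and _getpid reports a single fake PID. */
int _kill(int pid, int sig)
{
    (void)pid;
    (void)sig;
    errno = EINVAL; /* no signal delivery possible */
    return -1;
}

int _getpid(void)
{
    return 1; /* the firmware is the only "process" */
}
```

Compiling such stubs into the firmware (or letting the framework provide them) typically resolves the `_kill`/`_getpid` references; whether PlatformIO's mbed package should already be doing this is the open question here.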
268,821 | 28,862,570,729 | IssuesEvent | 2023-05-05 00:22:35 | Carbonable/carbon-protocol | https://api.github.com/repos/Carbonable/carbon-protocol | reopened | Upgrade OZ dep to 0.6.0 including ERC-1155 | Type: Security Readiness: Ready for dev Complexity: Obvious Component: Smart Contract Epic: Badges | Blocked: ArgentX Account ID is not compatible, we are waiting for ArgentX answer (Discord) | True | Upgrade OZ dep to 0.6.0 including ERC-1155 - Blocked: ArgentX Account ID is not compatible, we are waiting for ArgentX answer (Discord) | non_main | upgrade oz dep to including erc blocked argentx account id is not compatible we are waiting for argentx answer discord | 0 |
170,883 | 6,473,846,967 | IssuesEvent | 2017-08-17 16:44:34 | tonypesdm/CAMS | https://api.github.com/repos/tonypesdm/CAMS | opened | CW 2.3.2: Data display/submission issues | dev: bug Priority 1 | - My Reported Faults/Track a Fault: disparity between the status of an issue in CAMS and that shown on the web to the customer. (Flagged 23/06/2017)
- Past activity record: showing the date that a volunteer work record was submitted to the system rather than the actual date the volunteer work was carried out. (Flagged 27/07/2017)
- Please check the activity log to see if it times out, then fails to submit data that the user has entered.
| 1.0 | CW 2.3.2: Data display/submission issues - - My Reported Faults/Track a Fault: disparity between the status of an issue in CAMS and that shown on the web to the customer. (Flagged 23/06/2017)
- Past activity record: showing the date that a volunteer work record was submitted to the system rather than the actual date the volunteer work was carried out. (Flagged 27/07/2017)
- Please check the activity log to see if it times out, then fails to submit data that the user has entered.
| non_main | cw data display submission issues my reported faults track a fault disparity between the status of an issue in cams and that shown on the web to the customer flagged past activity record showing the date that a volunteer work record was submitted to the system rather than the actual date the volunteer work was carried out flagged please check the activity log to see if it times out then fails to submit data that the user has entered | 0 |
96,188 | 12,102,607,746 | IssuesEvent | 2020-04-20 16:58:30 | MozillaFoundation/Design | https://api.github.com/repos/MozillaFoundation/Design | opened | Earth Day Instagram Story | design | ### Intro
Creating Instagram story for Earth Day 2020.
### Links
[Content is here](https://docs.google.com/document/d/1-m3FtBvtvJsbwkz6ntcFb5-gIkHAfjFyYpLTlsCBxM0/edit).
cc: @audreymofo
| 1.0 | Earth Day Instagram Story - ### Intro
Creating Instagram story for Earth Day 2020.
### Links
[Content is here](https://docs.google.com/document/d/1-m3FtBvtvJsbwkz6ntcFb5-gIkHAfjFyYpLTlsCBxM0/edit).
cc: @audreymofo
| non_main | earth day instagram story intro creating instagram story for earth day links cc audreymofo | 0 |
708,474 | 24,342,629,535 | IssuesEvent | 2022-10-01 22:35:04 | frickjack/littleware | https://api.github.com/repos/frickjack/littleware | closed | REST API implementation | Type-Enhancement Priority-Medium auto-migrated Milestone-0 | ```
Implement a REST version of the littleware client APIs
```
Original issue reported on code.google.com by `reuben.p...@gmail.com` on 2 Sep 2009 at 7:08
| 1.0 | REST API implementation - ```
Implement a REST version of the littleware client APIs
```
Original issue reported on code.google.com by `reuben.p...@gmail.com` on 2 Sep 2009 at 7:08
| non_main | rest api implementation implement a rest version of the littleware client apis original issue reported on code google com by reuben p gmail com on sep at | 0 |
203,461 | 15,369,586,819 | IssuesEvent | 2021-03-02 07:35:30 | lutraconsulting/input-manual-tests | https://api.github.com/repos/lutraconsulting/input-manual-tests | closed | Test Execution InputApp 0.8.0 (android) | test execution | ## Test plan for Input manual testing
| Test environment | Value |
|---|---|
| Input Version: | e479b4f9628fb71f1daf8b9937ac4db7562b55a9 arm64-v8a |
| Mergin Version: | |
| Mergin URL: <> | |
| QGIS Version: | |
| Mergin plugin Version: | |
| Mobile OS: | Android arm64-v8a |
| Date of Execution: | 03/02/2021 |
---
### Test Cases
- [x] ( #2 ) TC 01: Mergin & Projects Manipulation
G10: crash after creating a new feature
- [x] ( #3 ) TC 02: Sync & Project Status
C: can't sync projects created with wizard.
- [x] ( #4 ) TC 03: Map Canvas
- [x] ( #5 ) TC 04: Recording
- [x] ( #6 ) TC 05: Forms
A4: Value relation widget - can't set value other than the first from available values
- [x] ( #7 ) TC 06: Data Providers
missing projects for tests D, F
- [x] ( #8 ) TC 07: Translations
My Android only allows for a single language setup. So multiple languages tests were skipped.
- [x] ( #18 ) TC 08: System Specifics
A: Not giving the permissions makes the app unresponsive
- [x] ( #19 ) TC 09: Welcome Screen & Project
A3: "Start here!" project on a fresh install users get missing proj datum shift grids
- [x] ( #24 ) TC 10: Proj Tests
A6 missing datum shift grid
---
| Test Execution Outcome | |
|---|---|
| Issues Created During Testing: | https://github.com/lutraconsulting/input/issues/1126 https://github.com/lutraconsulting/input/issues/1127 https://github.com/lutraconsulting/input/issues/1128 https://github.com/lutraconsulting/input/issues/1129 https://github.com/lutraconsulting/input/issues/1130 https://github.com/lutraconsulting/input/issues/1131 |
**Bugs Created** | 1.0 | Test Execution InputApp 0.8.0 (android) - ## Test plan for Input manual testing
| Test environment | Value |
|---|---|
| Input Version: | e479b4f9628fb71f1daf8b9937ac4db7562b55a9 arm64-v8a |
| Mergin Version: | |
| Mergin URL: <> | |
| QGIS Version: | |
| Mergin plugin Version: | |
| Mobile OS: | Android arm64-v8a |
| Date of Execution: | 03/02/2021 |
---
### Test Cases
- [x] ( #2 ) TC 01: Mergin & Projects Manipulation
G10: crash after creating a new feature
- [x] ( #3 ) TC 02: Sync & Project Status
C: can't sync projects created with wizard.
- [x] ( #4 ) TC 03: Map Canvas
- [x] ( #5 ) TC 04: Recording
- [x] ( #6 ) TC 05: Forms
A4: Value relation widget - can't set value other than the first from available values
- [x] ( #7 ) TC 06: Data Providers
missing projects for tests D, F
- [x] ( #8 ) TC 07: Translations
My Android only allows for a single language setup. So multiple languages tests were skipped.
- [x] ( #18 ) TC 08: System Specifics
A: Not giving the permissions makes the app unresponsive
- [x] ( #19 ) TC 09: Welcome Screen & Project
A3: "Start here!" project on a fresh install users get missing proj datum shift grids
- [x] ( #24 ) TC 10: Proj Tests
A6 missing datum shift grid
---
| Test Execution Outcome | |
|---|---|
| Issues Created During Testing: | https://github.com/lutraconsulting/input/issues/1126 https://github.com/lutraconsulting/input/issues/1127 https://github.com/lutraconsulting/input/issues/1128 https://github.com/lutraconsulting/input/issues/1129 https://github.com/lutraconsulting/input/issues/1130 https://github.com/lutraconsulting/input/issues/1131 |
**Bugs Created** | non_main | test execution inputapp android test plan for input manual testing test environment value input version mergin version mergin url qgis version mergin plugin version mobile os android date of execution test cases tc mergin projects manipulation crash after creating anew feature tc sync project status c can t sync projects created with wizard tc map canvas tc recording tc forms value relation widget can t set value other than the first from available values tc data providers missing projects fro tests d f tc translations my android only allows for a single language setup so multiple languages tests were skipped tc system specifics a not giving the permissions makes the app unresponsible tc welcome screen project start here project on a fresh install users get missing proj datum shift grids tc proj tests missing datum shift grid test execution outcome issues created during testing bugs created | 0 |
2,591 | 8,815,246,404 | IssuesEvent | 2018-12-29 16:10:56 | dgets/nightMiner | https://api.github.com/repos/dgets/nightMiner | opened | Redundancy in init_early_blockade() and scan_for_enemy_shipyards() | enhancement maintainability | Go through these and clean up the shit a bit. | True | Redundancy in init_early_blockade() and scan_for_enemy_shipyards() - Go through these and clean up the shit a bit. | main | redundancy in init early blockade and scan for enemy shipyards go through these and clean up the shit a bit | 1 |
552 | 3,995,586,308 | IssuesEvent | 2016-05-10 15:56:57 | duckduckgo/zeroclickinfo-spice | https://api.github.com/repos/duckduckgo/zeroclickinfo-spice | closed | XKCD: Make Prev&Next buttons not open in a new window with 'New window' On. | Maintainer Input Requested Suggestion | It'd be nice if the Prev & Next buttons wouldn't open another query /in a new tab/ when the setting 'New window' is On (true), in order to make browsing of xkcd comics more comfortable. The Explain link should follow the current behaviour though.
------
IA Page: http://duck.co/ia/view/xkcd | True | XKCD: Make Prev&Next buttons not open in a new window with 'New window' On. - It'd be nice if the Prev & Next buttons wouldn't open another query /in a new tab/ when the setting 'New window' is On (true), in order to make browsing of xkcd comics more comfortable. The Explain link should follow the current behaviour though.
------
IA Page: http://duck.co/ia/view/xkcd | main | xkcd make prev next buttons not open in a new window with new window on it d be nice if the prev next buttons wouldn t open another query in a new tab when the setting new window is on true in order to make browsing of xkcd comics more comfortable the explain link should follow the current behaviour though ia page | 1 |
247 | 2,993,765,939 | IssuesEvent | 2015-07-22 07:15:58 | ACP3/cms | https://api.github.com/repos/ACP3/cms | closed | Automate git subtree pushes | Maintainability | Git subtree pushes do take a considerably long time, especially when we have to trigger them manually.
Therefore we have to develop some tools, so that subtree pushes can be done automatically.
| True | Automate git subtree pushes - Git subtree pushes do take a considerably long time, especially when we have to trigger them manually.
Therefore we have to develop some tools, so that subtree pushes can be done automatically.
| main | automate git subtree pushes git subtree pushes do take a considerably long time especially when we have to trigger when manually therefore we have to develop some tools so that subtree pushes can be done automatically | 1 |
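The ACP3 issue above asks for tooling so subtree pushes no longer need to be triggered by hand. A minimal sketch of such a tool, assuming a hypothetical list of module prefixes and remotes (not ACP3's real layout), is to generate the `git subtree push` invocations from a config list; shown here as a dry run that only builds the command strings:

```javascript
// Hypothetical module list — a real tool would read this from a config file.
const modules = [
  { prefix: 'installation/modules/ACP3/Core', remote: 'git@github.com:ACP3/core.git' },
  { prefix: 'installation/modules/ACP3/News', remote: 'git@github.com:ACP3/news.git' },
];

// Dry run: emit one `git subtree push` command per module; a real tool would
// hand each string to child_process.execSync instead of printing it.
const commands = modules.map(
  (m) => `git subtree push --prefix=${m.prefix} ${m.remote} master`
);
console.log(commands.join('\n'));
```

Each generated line is a standard `git subtree push --prefix=<path> <remote> <branch>` call, so the loop replaces exactly the manual step the issue complains about.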
335 | 3,144,889,501 | IssuesEvent | 2015-09-14 15:26:14 | USGS-EROS/espa | https://api.github.com/repos/USGS-EROS/espa | opened | Cleanup and simplify warp.py | enhancement Maintainability | The warping code base is too complex and needs to be simplified for future enhancements. | True | Cleanup and simplify warp.py - The warping code base is too complex and needs to be simplified for future enhancements. | main | cleanup and simplify warp py the warping code base is too complex and needs to be simplified for future enhancements | 1 |
396,273 | 27,109,899,545 | IssuesEvent | 2023-02-15 14:40:13 | sbt/sbt | https://api.github.com/repos/sbt/sbt | closed | Arrow keys do not work on Windows without -Dinput.encoding=Cp1252 | Bug documentation uncategorized | In Play we had some problems with sbt 0.13's command history on Windows. This is because we set the `-Dfile.encoding=UTF-8` system property, which interfered with JLine.
```
>set JAVA_OPTS=-Dfile.encoding=UTF-8
>sbt
[info] Set current project to project (in build file:/C:/p/ansi/ansi2/project/)
> �
```
The fix required reading a bit of JLine 2 source code, but was pretty easy: add another property `-Dinput.encoding=Cp1252`.
Here's the Play issue: https://github.com/playframework/playframework/pull/1620
@jsuereth suggested that this fix might not work properly in all countries.
Arrow keys do not work on Windows without -Dinput.encoding=Cp1252 - In Play we had some problems with sbt 0.13's command history on Windows. This is because we set the `-Dfile.encoding=UTF-8` system property, which interfered with JLine.
```
>set JAVA_OPTS=-Dfile.encoding=UTF-8
>sbt
[info] Set current project to project (in build file:/C:/p/ansi/ansi2/project/)
> �
```
The fix required reading a bit of JLine 2 source code, but was pretty easy: add another property `-Dinput.encoding=Cp1252`.
Here's the Play issue: https://github.com/playframework/playframework/pull/1620
@jsuereth suggested that this fix might not work properly in all countries.
| non_main | arrow keys do not work on windows without dinput encoding in play we had some problems with sbt s command history on windows this is because we set the dfile encoding utf system property which interfered with jline set java opts dfile encoding utf sbt set current project to project in build file c p ansi project � the fix required reading a bit of jline source code but was pretty easy add another property dinput encoding here s the play issue jsuereth suggested that this fix might not work properly in all countries | 0 |
112,239 | 17,080,656,405 | IssuesEvent | 2021-07-08 04:21:26 | MohamedElashri/Zotero-Docker | https://api.github.com/repos/MohamedElashri/Zotero-Docker | closed | CVE-2019-10744 (High) detected in multiple libraries | security vulnerability | ## CVE-2019-10744 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Libraries - <b>lodash-4.11.1.tgz</b>, <b>lodash-4.12.0.tgz</b>, <b>lodash-3.10.1.tgz</b></p></summary>
<p>
<details><summary><b>lodash-4.11.1.tgz</b></p></summary>
<p>Lodash modular utilities.</p>
<p>Library home page: <a href="https://registry.npmjs.org/lodash/-/lodash-4.11.1.tgz">https://registry.npmjs.org/lodash/-/lodash-4.11.1.tgz</a></p>
<p>Path to dependency file: Zotero-Docker/client/zotero-build/xpi/package.json</p>
<p>Path to vulnerable library: Zotero-Docker/client/zotero-build/xpi/node_modules/lodash/package.json</p>
<p>
Dependency Hierarchy:
- jpm-1.2.2.tgz (Root Library)
- :x: **lodash-4.11.1.tgz** (Vulnerable Library)
</details>
<details><summary><b>lodash-4.12.0.tgz</b></p></summary>
<p>Lodash modular utilities.</p>
<p>Library home page: <a href="https://registry.npmjs.org/lodash/-/lodash-4.12.0.tgz">https://registry.npmjs.org/lodash/-/lodash-4.12.0.tgz</a></p>
<p>Path to dependency file: Zotero-Docker/client/zotero-build/xpi/package.json</p>
<p>Path to vulnerable library: Zotero-Docker/client/zotero-build/xpi/node_modules/firefox-profile/node_modules/lodash/package.json</p>
<p>
Dependency Hierarchy:
- jpm-1.2.2.tgz (Root Library)
- firefox-profile-0.4.0.tgz
- :x: **lodash-4.12.0.tgz** (Vulnerable Library)
</details>
<details><summary><b>lodash-3.10.1.tgz</b></p></summary>
<p>The modern build of lodash modular utilities.</p>
<p>Library home page: <a href="https://registry.npmjs.org/lodash/-/lodash-3.10.1.tgz">https://registry.npmjs.org/lodash/-/lodash-3.10.1.tgz</a></p>
<p>Path to dependency file: Zotero-Docker/client/zotero-build/xpi/package.json</p>
<p>Path to vulnerable library: Zotero-Docker/client/zotero-build/xpi/node_modules/fx-runner/node_modules/lodash/package.json</p>
<p>
Dependency Hierarchy:
- jpm-1.2.2.tgz (Root Library)
- fx-runner-1.0.5.tgz
- :x: **lodash-3.10.1.tgz** (Vulnerable Library)
</details>
<p>Found in HEAD commit: <a href="https://github.com/MohamedElashri/Zotero-Docker/commit/6f83bd2bfd0767b41690936c81cfaee93c63820e">6f83bd2bfd0767b41690936c81cfaee93c63820e</a></p>
<p>Found in base branch: <b>main</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
Versions of lodash lower than 4.17.12 are vulnerable to Prototype Pollution. The function defaultsDeep could be tricked into adding or modifying properties of Object.prototype using a constructor payload.
<p>Publish Date: 2019-07-26
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2019-10744>CVE-2019-10744</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>9.1</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: High
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://github.com/advisories/GHSA-jf85-cpcp-j695">https://github.com/advisories/GHSA-jf85-cpcp-j695</a></p>
<p>Release Date: 2019-07-08</p>
<p>Fix Resolution: lodash-4.17.12, lodash-amd-4.17.12, lodash-es-4.17.12, lodash.defaultsdeep-4.6.1, lodash.merge- 4.6.2, lodash.mergewith-4.6.2, lodash.template-4.5.0</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github) | True | CVE-2019-10744 (High) detected in multiple libraries - ## CVE-2019-10744 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Libraries - <b>lodash-4.11.1.tgz</b>, <b>lodash-4.12.0.tgz</b>, <b>lodash-3.10.1.tgz</b></p></summary>
<p>
<details><summary><b>lodash-4.11.1.tgz</b></p></summary>
<p>Lodash modular utilities.</p>
<p>Library home page: <a href="https://registry.npmjs.org/lodash/-/lodash-4.11.1.tgz">https://registry.npmjs.org/lodash/-/lodash-4.11.1.tgz</a></p>
<p>Path to dependency file: Zotero-Docker/client/zotero-build/xpi/package.json</p>
<p>Path to vulnerable library: Zotero-Docker/client/zotero-build/xpi/node_modules/lodash/package.json</p>
<p>
Dependency Hierarchy:
- jpm-1.2.2.tgz (Root Library)
- :x: **lodash-4.11.1.tgz** (Vulnerable Library)
</details>
<details><summary><b>lodash-4.12.0.tgz</b></p></summary>
<p>Lodash modular utilities.</p>
<p>Library home page: <a href="https://registry.npmjs.org/lodash/-/lodash-4.12.0.tgz">https://registry.npmjs.org/lodash/-/lodash-4.12.0.tgz</a></p>
<p>Path to dependency file: Zotero-Docker/client/zotero-build/xpi/package.json</p>
<p>Path to vulnerable library: Zotero-Docker/client/zotero-build/xpi/node_modules/firefox-profile/node_modules/lodash/package.json</p>
<p>
Dependency Hierarchy:
- jpm-1.2.2.tgz (Root Library)
- firefox-profile-0.4.0.tgz
- :x: **lodash-4.12.0.tgz** (Vulnerable Library)
</details>
<details><summary><b>lodash-3.10.1.tgz</b></p></summary>
<p>The modern build of lodash modular utilities.</p>
<p>Library home page: <a href="https://registry.npmjs.org/lodash/-/lodash-3.10.1.tgz">https://registry.npmjs.org/lodash/-/lodash-3.10.1.tgz</a></p>
<p>Path to dependency file: Zotero-Docker/client/zotero-build/xpi/package.json</p>
<p>Path to vulnerable library: Zotero-Docker/client/zotero-build/xpi/node_modules/fx-runner/node_modules/lodash/package.json</p>
<p>
Dependency Hierarchy:
- jpm-1.2.2.tgz (Root Library)
- fx-runner-1.0.5.tgz
- :x: **lodash-3.10.1.tgz** (Vulnerable Library)
</details>
<p>Found in HEAD commit: <a href="https://github.com/MohamedElashri/Zotero-Docker/commit/6f83bd2bfd0767b41690936c81cfaee93c63820e">6f83bd2bfd0767b41690936c81cfaee93c63820e</a></p>
<p>Found in base branch: <b>main</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
Versions of lodash lower than 4.17.12 are vulnerable to Prototype Pollution. The function defaultsDeep could be tricked into adding or modifying properties of Object.prototype using a constructor payload.
<p>Publish Date: 2019-07-26
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2019-10744>CVE-2019-10744</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>9.1</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: High
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://github.com/advisories/GHSA-jf85-cpcp-j695">https://github.com/advisories/GHSA-jf85-cpcp-j695</a></p>
<p>Release Date: 2019-07-08</p>
<p>Fix Resolution: lodash-4.17.12, lodash-amd-4.17.12, lodash-es-4.17.12, lodash.defaultsdeep-4.6.1, lodash.merge- 4.6.2, lodash.mergewith-4.6.2, lodash.template-4.5.0</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github) | non_main | cve high detected in multiple libraries cve high severity vulnerability vulnerable libraries lodash tgz lodash tgz lodash tgz lodash tgz lodash modular utilities library home page a href path to dependency file zotero docker client zotero build xpi package json path to vulnerable library zotero docker client zotero build xpi node modules lodash package json dependency hierarchy jpm tgz root library x lodash tgz vulnerable library lodash tgz lodash modular utilities library home page a href path to dependency file zotero docker client zotero build xpi package json path to vulnerable library zotero docker client zotero build xpi node modules firefox profile node modules lodash package json dependency hierarchy jpm tgz root library firefox profile tgz x lodash tgz vulnerable library lodash tgz the modern build of lodash modular utilities library home page a href path to dependency file zotero docker client zotero build xpi package json path to vulnerable library zotero docker client zotero build xpi node modules fx runner node modules lodash package json dependency hierarchy jpm tgz root library fx runner tgz x lodash tgz vulnerable library found in head commit a href found in base branch main vulnerability details versions of lodash lower than are vulnerable to prototype pollution the function defaultsdeep could be tricked into adding or modifying properties of object prototype using a constructor payload publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction none scope unchanged impact metrics confidentiality impact none integrity impact high availability impact high for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution lodash lodash amd lodash es 
lodash defaultsdeep lodash merge lodash mergewith lodash template step up your open source security game with whitesource | 0 |
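The CVE row above describes `defaultsDeep` being tricked into writing onto `Object.prototype`. An illustration of that vulnerability class, using a toy deep merge and the commonly cited `__proto__` variant of the payload rather than lodash's own code or the exact constructor payload:

```javascript
// A naive "defaults deep" merge that recurses into every key of the source.
function naiveDefaultsDeep(target, source) {
  for (const key of Object.keys(source)) {
    const value = source[key];
    if (value !== null && typeof value === 'object') {
      if (target[key] === null || typeof target[key] !== 'object') target[key] = {};
      // When key is "__proto__", target[key] resolves to Object.prototype,
      // so the recursion writes onto the shared prototype.
      naiveDefaultsDeep(target[key], value);
    } else {
      target[key] = value;
    }
  }
  return target;
}

// JSON.parse defines "__proto__" as an ordinary own property on the payload,
// so Object.keys() hands it to the merge like any other key.
const payload = JSON.parse('{"__proto__": {"polluted": true}}');
naiveDefaultsDeep({}, payload);
console.log({}.polluted); // every fresh object now sees the injected property
```

The fixed lodash releases listed in the row (4.17.12 and the per-method packages) guard against merging through these special keys.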
5,156 | 26,265,830,529 | IssuesEvent | 2023-01-06 12:26:55 | kkkkan/CsvToQrConverter | https://api.github.com/repos/kkkkan/CsvToQrConverter | closed | [File format changer] Support uploading multiple files | enhancement maintainer's-memo | Along with this, it would be good to also change the color and placement of the note "If the download does not start, please allow 'multiple file downloads' in your browser settings."
This should probably take care of it:
https://www.wabiapp.com/WabiSampleSource/html5/input_file_multi_select.html
https://rennnosukesann.hatenablog.com/entry/2018/05/01/235631 | True | [File format changer] Support uploading multiple files - Along with this, it would be good to also change the color and placement of the note "If the download does not start, please allow 'multiple file downloads' in your browser settings."
This should probably take care of it:
https://www.wabiapp.com/WabiSampleSource/html5/input_file_multi_select.html
https://rennnosukesann.hatenablog.com/entry/2018/05/01/235631 | main | support uploading multiple files along with this it would be good to also change the color and placement of the note if the download does not start please allow multiple file downloads in your browser settings this should probably take care of it | 1
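The pages linked in the row above cover HTML multi-file selection, which comes down to the `multiple` attribute on `<input type="file">`. A sketch of the markup (the `id` and `accept` values are illustrative placeholders), rendered as a string so it runs under node without a DOM; in the page itself the picked files are read from `input.files`:

```javascript
// Build the <input> tag; boolean attributes like `multiple` are emitted bare.
const attrs = { type: 'file', id: 'csv-files', accept: '.csv', multiple: '' };
const inputTag =
  '<input ' +
  Object.entries(attrs)
    .map(([name, value]) => (value === '' ? name : `${name}="${value}"`))
    .join(' ') +
  '>';
console.log(inputTag); // <input type="file" id="csv-files" accept=".csv" multiple>
```

Without `multiple`, the browser's file picker only allows a single selection, which matches the behavior the issue wants to move away from.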
5,130 | 26,143,196,966 | IssuesEvent | 2022-12-29 22:09:10 | Homebrew/homebrew-cask | https://api.github.com/repos/Homebrew/homebrew-cask | closed | “Nova.app” is damaged and can’t be opened | awaiting maintainer feedback | ### Verification
- [X] I understand that [if I ignore these instructions, my issue may be closed without review](https://github.com/Homebrew/homebrew-cask/blob/master/doc/faq/closing_issues_without_review.md).
- [X] I have retried my command with `--force`.
- [X] I ran `brew update-reset && brew update` and retried my command.
- [X] I ran `brew doctor`, fixed as many issues as possible and retried my command.
- [X] I have checked the instructions for [reporting bugs](https://github.com/Homebrew/homebrew-cask#reporting-bugs).
- [X] I made doubly sure this is not a [checksum does not match](https://docs.brew.sh/Common-Issues#cask-checksum-does-not-match) error.
### Description of issue
(ref #134265 - copied description and some info from that issue since it was closed and locked, `brew doctor` and `brew tap` output are from my environment)
Freshly installed Nova fails with the app-is-broken dialog below, which according to the help opened on clicking the ? icon states the app doesn't match its code signature. Uninstalling, deleting the download from cache and reinstalling results in the same again.

I ran `codesign --verify --verbose ~/Downloads/Nova.app` for the version downloaded manually from the Nova website:
```
/Users/cweagans/Downloads/Nova.app: valid on disk
/Users/cweagans/Downloads/Nova.app: satisfies its Designated Requirement
```
Here's what it looks like for the Cask-installed version (`codesign --verify --verbose /usr/local/Caskroom/nova/10.6/Nova.app`):
```
/usr/local/Caskroom/nova/10.6/Nova.app: a sealed resource is missing or invalid
file modified: /Applications/Nova.app/Contents/Resources/nova_completions.txt
```
That appears to be a file referenced in the formula, but I'm not sure exactly what that line does. This appears to be the issue though:
```
cweagans@nibbler> ls -alh /Applications/Nova.app/Contents/Resources/nova_completions.txt
lrwxr-xr-x 1 cweagans admin 41B Dec 19 13:41 /Applications/Nova.app/Contents/Resources/nova_completions.txt -> /usr/local/share/zsh/site-functions/_nova
cweagans@nibbler> ls -alh ~/Downloads/Nova.app/Contents/Resources/nova_completions.txt
-rw-r--r--@ 1 cweagans staff 606B Dec 14 21:03 /Users/cweagans/Downloads/Nova.app/Contents/Resources/nova_completions.txt
```
Fixing the permissions on the symlink manually didn't seem to help. Perhaps it would be better to change the direction of the link so that `/usr/local/share/zsh/site-functions/_nova` is a symlink to `/Applications/Nova.app/Contents/Resources/nova_completions.txt`? Or maybe @logancollins has some idea (apologies for the direct ping - you were the only one on the Panic org who identified themselves as a Nova engineer)?
### Command that failed
`nova` or `open -a Nova`
### Output of command with `--verbose --debug`
```shell
Nova has no verbose or debug flags.
```
### Output of `brew doctor --verbose`
```shell
==> Cask Environment Variables:
BUNDLE_PATH
CHRUBY_VERSION
GEM_HOME
GEM_PATH
HOMEBREW_CASK_OPTS
LC_ALL
PATH
RBENV_VERSION
RUBYLIB
RUBYOPT
RUBYPATH
SHELL
==> $LOAD_PATHS
/usr/local/Homebrew/Library/Homebrew/vendor/bundle/ruby/2.6.0/gems/warning-1.3.0/lib
/usr/local/Homebrew/Library/Homebrew/vendor/bundle/ruby/2.6.0/gems/tapioca-0.7.3/lib
/usr/local/Homebrew/Library/Homebrew/vendor/bundle/ruby/2.6.0/gems/yard-sorbet-0.6.1/lib
/usr/local/Homebrew/Library/Homebrew/vendor/bundle/ruby/2.6.0/gems/yard-0.9.28/lib
/usr/local/Homebrew/Library/Homebrew/vendor/bundle/ruby/2.6.0/gems/spoom-1.1.11/lib
/usr/local/Homebrew/Library/Homebrew/vendor/bundle/ruby/2.6.0/gems/thor-1.2.1/lib
/usr/local/Homebrew/Library/Homebrew/vendor/bundle/ruby/2.6.0/gems/sorbet-static-and-runtime-0.5.10461/lib
/usr/local/Homebrew/Library/Homebrew/vendor/bundle/ruby/2.6.0/gems/sorbet-0.5.10461/lib
/usr/local/Homebrew/Library/Homebrew/vendor/bundle/ruby/2.6.0/gems/sorbet-static-0.5.10461-universal-darwin-21/lib
/usr/local/Homebrew/Library/Homebrew/vendor/bundle/ruby/2.6.0/gems/simplecov-cobertura-2.1.0/lib
/usr/local/Homebrew/Library/Homebrew/vendor/bundle/ruby/2.6.0/gems/simplecov-0.21.2/lib
/usr/local/Homebrew/Library/Homebrew/vendor/bundle/ruby/2.6.0/gems/simplecov_json_formatter-0.1.4/lib
/usr/local/Homebrew/Library/Homebrew/vendor/bundle/ruby/2.6.0/gems/simplecov-html-0.12.3/lib
/usr/local/Homebrew/Library/Homebrew/vendor/bundle/ruby/2.6.0/gems/ruby-macho-3.0.0/lib
/usr/local/Homebrew/Library/Homebrew/vendor/bundle/ruby/2.6.0/gems/rubocop-sorbet-0.6.11/lib
/usr/local/Homebrew/Library/Homebrew/vendor/bundle/ruby/2.6.0/gems/rubocop-rspec-2.15.0/lib
/usr/local/Homebrew/Library/Homebrew/vendor/bundle/ruby/2.6.0/gems/rubocop-rails-2.17.3/lib
/usr/local/Homebrew/Library/Homebrew/vendor/bundle/ruby/2.6.0/gems/rubocop-performance-1.15.1/lib
/usr/local/Homebrew/Library/Homebrew/vendor/bundle/ruby/2.6.0/gems/rubocop-1.35.1/lib
/usr/local/Homebrew/Library/Homebrew/vendor/bundle/ruby/2.6.0/gems/unicode-display_width-2.3.0/lib
/usr/local/Homebrew/Library/Homebrew/vendor/bundle/ruby/2.6.0/gems/ruby-progressbar-1.11.0/lib
/usr/local/Homebrew/Library/Homebrew/vendor/bundle/ruby/2.6.0/gems/rubocop-ast-1.24.0/lib
/usr/local/Homebrew/Library/Homebrew/vendor/bundle/ruby/2.6.0/gems/rspec_junit_formatter-0.6.0/lib
/usr/local/Homebrew/Library/Homebrew/vendor/bundle/ruby/2.6.0/gems/rspec-wait-0.0.9/lib
/usr/local/Homebrew/Library/Homebrew/vendor/bundle/ruby/2.6.0/gems/rspec-sorbet-1.9.1/lib
/usr/local/Homebrew/Library/Homebrew/vendor/bundle/ruby/2.6.0/gems/rspec-retry-0.6.2/lib
/usr/local/Homebrew/Library/Homebrew/vendor/bundle/ruby/2.6.0/gems/rspec-its-1.3.0/lib
/usr/local/Homebrew/Library/Homebrew/vendor/bundle/ruby/2.6.0/gems/rspec-github-2.4.0/lib
/usr/local/Homebrew/Library/Homebrew/vendor/bundle/ruby/2.6.0/gems/rspec-3.12.0/lib
/usr/local/Homebrew/Library/Homebrew/vendor/bundle/ruby/2.6.0/gems/rspec-mocks-3.12.0/lib
/usr/local/Homebrew/Library/Homebrew/vendor/bundle/ruby/2.6.0/gems/rspec-expectations-3.12.0/lib
/usr/local/Homebrew/Library/Homebrew/vendor/bundle/ruby/2.6.0/gems/rspec-core-3.12.0/lib
/usr/local/Homebrew/Library/Homebrew/vendor/bundle/ruby/2.6.0/gems/rspec-support-3.12.0/lib
/usr/local/Homebrew/Library/Homebrew/vendor/bundle/ruby/2.6.0/gems/ronn-0.7.3/lib
/usr/local/Homebrew/Library/Homebrew/vendor/bundle/ruby/2.6.0/gems/rexml-3.2.5/lib
/usr/local/Homebrew/Library/Homebrew/vendor/bundle/ruby/2.6.0/gems/rdiscount-2.2.7/lib
/usr/local/Homebrew/Library/Homebrew/vendor/bundle/ruby/2.6.0/extensions/universal-darwin-21/2.6.0/rdiscount-2.2.7
/usr/local/Homebrew/Library/Homebrew/vendor/bundle/ruby/2.6.0/gems/rbi-0.0.14/lib
/usr/local/Homebrew/Library/Homebrew/vendor/bundle/ruby/2.6.0/gems/unparser-0.6.4/lib
/usr/local/Homebrew/Library/Homebrew/vendor/bundle/ruby/2.6.0/gems/rack-3.0.2/lib
/usr/local/Homebrew/Library/Homebrew/vendor/bundle/ruby/2.6.0/gems/pry-0.14.1/lib
/usr/local/Homebrew/Library/Homebrew/vendor/bundle/ruby/2.6.0/gems/plist-3.6.0/lib
/usr/local/Homebrew/Library/Homebrew/vendor/bundle/ruby/2.6.0/gems/patchelf-1.4.0/lib
/usr/local/Homebrew/Library/Homebrew/vendor/bundle/ruby/2.6.0/gems/parlour-8.0.0/lib
/usr/local/Homebrew/Library/Homebrew/vendor/bundle/ruby/2.6.0/gems/sorbet-runtime-0.5.10461/lib
/usr/local/Homebrew/Library/Homebrew/vendor/bundle/ruby/2.6.0/gems/rainbow-3.1.1/lib
/usr/local/Homebrew/Library/Homebrew/vendor/bundle/ruby/2.6.0/gems/parser-3.1.3.0/lib
/usr/local/Homebrew/Library/Homebrew/vendor/bundle/ruby/2.6.0/gems/parallel_tests-3.13.0/lib
/usr/local/Homebrew/Library/Homebrew/vendor/bundle/ruby/2.6.0/gems/parallel-1.22.1/lib
/usr/local/Homebrew/Library/Homebrew/vendor/bundle/ruby/2.6.0/gems/mustache-1.1.1/lib
/usr/local/Homebrew/Library/Homebrew/vendor/bundle/ruby/2.6.0/gems/method_source-1.0.0/lib
/usr/local/Homebrew/Library/Homebrew/vendor/bundle/ruby/2.6.0/gems/mechanize-2.8.5/lib
/usr/local/Homebrew/Library/Homebrew/vendor/bundle/ruby/2.6.0/gems/webrobots-0.1.2/lib
/usr/local/Homebrew/Library/Homebrew/vendor/bundle/ruby/2.6.0/gems/webrick-1.7.0/lib
/usr/local/Homebrew/Library/Homebrew/vendor/bundle/ruby/2.6.0/gems/rubyntlm-0.6.3/lib
/usr/local/Homebrew/Library/Homebrew/vendor/bundle/ruby/2.6.0/gems/nokogiri-1.13.10-x86_64-darwin/lib
/usr/local/Homebrew/Library/Homebrew/vendor/bundle/ruby/2.6.0/gems/racc-1.6.1/lib
/usr/local/Homebrew/Library/Homebrew/vendor/bundle/ruby/2.6.0/extensions/universal-darwin-21/2.6.0/racc-1.6.1
/usr/local/Homebrew/Library/Homebrew/vendor/bundle/ruby/2.6.0/gems/net-http-persistent-4.0.1/lib
/usr/local/Homebrew/Library/Homebrew/vendor/bundle/ruby/2.6.0/gems/net-http-digest_auth-1.4.1/lib
/usr/local/Homebrew/Library/Homebrew/vendor/bundle/ruby/2.6.0/gems/mime-types-3.4.1/lib
/usr/local/Homebrew/Library/Homebrew/vendor/bundle/ruby/2.6.0/gems/mime-types-data-3.2022.0105/lib
/usr/local/Homebrew/Library/Homebrew/vendor/bundle/ruby/2.6.0/gems/json_schemer-0.2.24/lib
/usr/local/Homebrew/Library/Homebrew/vendor/bundle/ruby/2.6.0/gems/uri_template-0.7.0/lib
/usr/local/Homebrew/Library/Homebrew/vendor/bundle/ruby/2.6.0/gems/json-2.6.3/lib
/usr/local/Homebrew/Library/Homebrew/vendor/bundle/ruby/2.6.0/extensions/universal-darwin-21/2.6.0/json-2.6.3
/usr/local/Homebrew/Library/Homebrew/vendor/bundle/ruby/2.6.0/gems/http-cookie-1.0.5/lib
/usr/local/Homebrew/Library/Homebrew/vendor/bundle/ruby/2.6.0/gems/hpricot-0.8.6/lib
/usr/local/Homebrew/Library/Homebrew/vendor/bundle/ruby/2.6.0/extensions/universal-darwin-21/2.6.0/hpricot-0.8.6
/usr/local/Homebrew/Library/Homebrew/vendor/bundle/ruby/2.6.0/gems/hana-1.3.7/lib
/usr/local/Homebrew/Library/Homebrew/vendor/bundle/ruby/2.6.0/gems/elftools-1.2.0/lib
/usr/local/Homebrew/Library/Homebrew/vendor/bundle/ruby/2.6.0/gems/ecma-re-validator-0.4.0/lib
/usr/local/Homebrew/Library/Homebrew/vendor/bundle/ruby/2.6.0/gems/regexp_parser-2.6.1/lib
/usr/local/Homebrew/Library/Homebrew/vendor/bundle/ruby/2.6.0/gems/domain_name-0.5.20190701/lib
/usr/local/Homebrew/Library/Homebrew/vendor/bundle/ruby/2.6.0/gems/unf-0.1.4/lib
/usr/local/Homebrew/Library/Homebrew/vendor/bundle/ruby/2.6.0/gems/unf_ext-0.0.8.2/lib
/usr/local/Homebrew/Library/Homebrew/vendor/bundle/ruby/2.6.0/extensions/universal-darwin-21/2.6.0/unf_ext-0.0.8.2
/usr/local/Homebrew/Library/Homebrew/vendor/bundle/ruby/2.6.0/gems/docile-1.4.0/lib
/usr/local/Homebrew/Library/Homebrew/vendor/bundle/ruby/2.6.0/gems/diff-lcs-1.5.0/lib
/usr/local/Homebrew/Library/Homebrew/vendor/bundle/ruby/2.6.0/gems/did_you_mean-1.6.2/lib
/usr/local/Homebrew/Library/Homebrew/vendor/bundle/ruby/2.6.0/gems/connection_pool-2.3.0/lib
/usr/local/Homebrew/Library/Homebrew/vendor/bundle/ruby/2.6.0/gems/commander-4.6.0/lib
/usr/local/Homebrew/Library/Homebrew/vendor/bundle/ruby/2.6.0/gems/highline-2.0.3/lib
/usr/local/Homebrew/Library/Homebrew/vendor/bundle/ruby/2.6.0/gems/coderay-1.1.3/lib
/usr/local/Homebrew/Library/Homebrew/vendor/bundle/ruby/2.6.0/gems/byebug-11.1.3/lib
/usr/local/Homebrew/Library/Homebrew/vendor/bundle/ruby/2.6.0/extensions/universal-darwin-21/2.6.0/byebug-11.1.3
/usr/local/Homebrew/Library/Homebrew/vendor/bundle/ruby/2.6.0/gems/bootsnap-1.15.0/lib
/usr/local/Homebrew/Library/Homebrew/vendor/bundle/ruby/2.6.0/extensions/universal-darwin-21/2.6.0/bootsnap-1.15.0
/usr/local/Homebrew/Library/Homebrew/vendor/bundle/ruby/2.6.0/gems/msgpack-1.6.0/lib
/usr/local/Homebrew/Library/Homebrew/vendor/bundle/ruby/2.6.0/extensions/universal-darwin-21/2.6.0/msgpack-1.6.0
/usr/local/Homebrew/Library/Homebrew/vendor/bundle/ruby/2.6.0/gems/bindata-2.4.14/lib
/usr/local/Homebrew/Library/Homebrew/vendor/bundle/ruby/2.6.0/gems/ast-2.4.2/lib
/usr/local/Homebrew/Library/Homebrew/vendor/bundle/ruby/2.6.0/gems/addressable-2.8.1/lib
/usr/local/Homebrew/Library/Homebrew/vendor/bundle/ruby/2.6.0/gems/public_suffix-5.0.1/lib
/usr/local/Homebrew/Library/Homebrew/vendor/bundle/ruby/2.6.0/gems/activesupport-6.1.7/lib
/usr/local/Homebrew/Library/Homebrew/vendor/bundle/ruby/2.6.0/gems/zeitwerk-2.6.6/lib
/usr/local/Homebrew/Library/Homebrew/vendor/bundle/ruby/2.6.0/gems/tzinfo-2.0.5/lib
/usr/local/Homebrew/Library/Homebrew/vendor/bundle/ruby/2.6.0/gems/minitest-5.16.3/lib
/usr/local/Homebrew/Library/Homebrew/vendor/bundle/ruby/2.6.0/gems/i18n-1.12.0/lib
/usr/local/Homebrew/Library/Homebrew/vendor/bundle/ruby/2.6.0/gems/concurrent-ruby-1.1.10/lib/concurrent-ruby
/System/Library/Frameworks/Ruby.framework/Versions/2.6/usr/lib/ruby/vendor_ruby/2.6.0
/System/Library/Frameworks/Ruby.framework/Versions/2.6/usr/lib/ruby/vendor_ruby/2.6.0/x86_64-darwin22
/System/Library/Frameworks/Ruby.framework/Versions/2.6/usr/lib/ruby/vendor_ruby/2.6.0/universal-darwin22
/System/Library/Frameworks/Ruby.framework/Versions/2.6/usr/lib/ruby/vendor_ruby
/System/Library/Frameworks/Ruby.framework/Versions/2.6/usr/lib/ruby/2.6.0
/System/Library/Frameworks/Ruby.framework/Versions/2.6/usr/lib/ruby/2.6.0/x86_64-darwin22
/System/Library/Frameworks/Ruby.framework/Versions/2.6/usr/lib/ruby/2.6.0/universal-darwin22
/usr/local/Homebrew/Library/Homebrew
==> Homebrew Version
3.6.15
==> macOS
13.1
==> SIP
Enabled
==> Homebrew Cask Staging Location
/usr/local/Caskroom
==> Homebrew Cask Taps:
/usr/local/Homebrew/Library/Taps/homebrew/homebrew-cask (4136 casks)
Your system is ready to brew.
```
### Output of `brew tap`
```shell
homebrew/bundle
homebrew/cask
homebrew/core
```
| True | “Nova.app” is damaged and can’t be opened - ### Verification
- [X] I understand that [if I ignore these instructions, my issue may be closed without review](https://github.com/Homebrew/homebrew-cask/blob/master/doc/faq/closing_issues_without_review.md).
- [X] I have retried my command with `--force`.
- [X] I ran `brew update-reset && brew update` and retried my command.
- [X] I ran `brew doctor`, fixed as many issues as possible and retried my command.
- [X] I have checked the instructions for [reporting bugs](https://github.com/Homebrew/homebrew-cask#reporting-bugs).
- [X] I made doubly sure this is not a [checksum does not match](https://docs.brew.sh/Common-Issues#cask-checksum-does-not-match) error.
### Description of issue
(ref #134265 - copied description and some info from that issue since it was closed and locked, `brew doctor` and `brew tap` output are from my environment)
Freshly installed Nova fails with the app-is-broken dialog below, which according to the help opened on clicking the ? icon states the app doesn't match its code signature. Uninstalling, deleting the download from cache and reinstalling results in the same again.

I ran `codesign --verify --verbose ~/Downloads/Nova.app` for the version downloaded manually from the Nova website:
```
/Users/cweagans/Downloads/Nova.app: valid on disk
/Users/cweagans/Downloads/Nova.app: satisfies its Designated Requirement
```
Here's what it looks like for the Cask-installed version (`codesign --verify --verbose /usr/local/Caskroom/nova/10.6/Nova.app`):
```
/usr/local/Caskroom/nova/10.6/Nova.app: a sealed resource is missing or invalid
file modified: /Applications/Nova.app/Contents/Resources/nova_completions.txt
```
That appears to be a file referenced in the formula, but I'm not sure exactly what that line does. This appears to be the issue though:
```
cweagans@nibbler> ls -alh /Applications/Nova.app/Contents/Resources/nova_completions.txt
lrwxr-xr-x 1 cweagans admin 41B Dec 19 13:41 /Applications/Nova.app/Contents/Resources/nova_completions.txt -> /usr/local/share/zsh/site-functions/_nova
cweagans@nibbler> ls -alh ~/Downloads/Nova.app/Contents/Resources/nova_completions.txt
-rw-r--r--@ 1 cweagans staff 606B Dec 14 21:03 /Users/cweagans/Downloads/Nova.app/Contents/Resources/nova_completions.txt
```
Fixing the permissions on the symlink manually didn't seem to help. Perhaps it would be better to change the direction of the link so that `/usr/local/share/zsh/site-functions/_nova` is a symlink to `/Applications/Nova.app/Contents/Resources/nova_completions.txt`? Or maybe @logancollins has some idea (apologies for the direct ping - you were the only one on the Panic org who identified themselves as a Nova engineer)?
### Command that failed
`nova` or `open -a Nova`
### Output of command with `--verbose --debug`
```shell
Nova has no verbose or debug flags.
```
### Output of `brew doctor --verbose`
```shell
==> Cask Environment Variables:
BUNDLE_PATH
CHRUBY_VERSION
GEM_HOME
GEM_PATH
HOMEBREW_CASK_OPTS
LC_ALL
PATH
RBENV_VERSION
RUBYLIB
RUBYOPT
RUBYPATH
SHELL
==> $LOAD_PATHS
/usr/local/Homebrew/Library/Homebrew/vendor/bundle/ruby/2.6.0/gems/warning-1.3.0/lib
/usr/local/Homebrew/Library/Homebrew/vendor/bundle/ruby/2.6.0/gems/tapioca-0.7.3/lib
/usr/local/Homebrew/Library/Homebrew/vendor/bundle/ruby/2.6.0/gems/yard-sorbet-0.6.1/lib
/usr/local/Homebrew/Library/Homebrew/vendor/bundle/ruby/2.6.0/gems/yard-0.9.28/lib
/usr/local/Homebrew/Library/Homebrew/vendor/bundle/ruby/2.6.0/gems/spoom-1.1.11/lib
/usr/local/Homebrew/Library/Homebrew/vendor/bundle/ruby/2.6.0/gems/thor-1.2.1/lib
/usr/local/Homebrew/Library/Homebrew/vendor/bundle/ruby/2.6.0/gems/sorbet-static-and-runtime-0.5.10461/lib
/usr/local/Homebrew/Library/Homebrew/vendor/bundle/ruby/2.6.0/gems/sorbet-0.5.10461/lib
/usr/local/Homebrew/Library/Homebrew/vendor/bundle/ruby/2.6.0/gems/sorbet-static-0.5.10461-universal-darwin-21/lib
/usr/local/Homebrew/Library/Homebrew/vendor/bundle/ruby/2.6.0/gems/simplecov-cobertura-2.1.0/lib
/usr/local/Homebrew/Library/Homebrew/vendor/bundle/ruby/2.6.0/gems/simplecov-0.21.2/lib
/usr/local/Homebrew/Library/Homebrew/vendor/bundle/ruby/2.6.0/gems/simplecov_json_formatter-0.1.4/lib
/usr/local/Homebrew/Library/Homebrew/vendor/bundle/ruby/2.6.0/gems/simplecov-html-0.12.3/lib
/usr/local/Homebrew/Library/Homebrew/vendor/bundle/ruby/2.6.0/gems/ruby-macho-3.0.0/lib
/usr/local/Homebrew/Library/Homebrew/vendor/bundle/ruby/2.6.0/gems/rubocop-sorbet-0.6.11/lib
/usr/local/Homebrew/Library/Homebrew/vendor/bundle/ruby/2.6.0/gems/rubocop-rspec-2.15.0/lib
/usr/local/Homebrew/Library/Homebrew/vendor/bundle/ruby/2.6.0/gems/rubocop-rails-2.17.3/lib
/usr/local/Homebrew/Library/Homebrew/vendor/bundle/ruby/2.6.0/gems/rubocop-performance-1.15.1/lib
/usr/local/Homebrew/Library/Homebrew/vendor/bundle/ruby/2.6.0/gems/rubocop-1.35.1/lib
/usr/local/Homebrew/Library/Homebrew/vendor/bundle/ruby/2.6.0/gems/unicode-display_width-2.3.0/lib
/usr/local/Homebrew/Library/Homebrew/vendor/bundle/ruby/2.6.0/gems/ruby-progressbar-1.11.0/lib
/usr/local/Homebrew/Library/Homebrew/vendor/bundle/ruby/2.6.0/gems/rubocop-ast-1.24.0/lib
/usr/local/Homebrew/Library/Homebrew/vendor/bundle/ruby/2.6.0/gems/rspec_junit_formatter-0.6.0/lib
/usr/local/Homebrew/Library/Homebrew/vendor/bundle/ruby/2.6.0/gems/rspec-wait-0.0.9/lib
/usr/local/Homebrew/Library/Homebrew/vendor/bundle/ruby/2.6.0/gems/rspec-sorbet-1.9.1/lib
/usr/local/Homebrew/Library/Homebrew/vendor/bundle/ruby/2.6.0/gems/rspec-retry-0.6.2/lib
/usr/local/Homebrew/Library/Homebrew/vendor/bundle/ruby/2.6.0/gems/rspec-its-1.3.0/lib
/usr/local/Homebrew/Library/Homebrew/vendor/bundle/ruby/2.6.0/gems/rspec-github-2.4.0/lib
/usr/local/Homebrew/Library/Homebrew/vendor/bundle/ruby/2.6.0/gems/rspec-3.12.0/lib
/usr/local/Homebrew/Library/Homebrew/vendor/bundle/ruby/2.6.0/gems/rspec-mocks-3.12.0/lib
/usr/local/Homebrew/Library/Homebrew/vendor/bundle/ruby/2.6.0/gems/rspec-expectations-3.12.0/lib
/usr/local/Homebrew/Library/Homebrew/vendor/bundle/ruby/2.6.0/gems/rspec-core-3.12.0/lib
/usr/local/Homebrew/Library/Homebrew/vendor/bundle/ruby/2.6.0/gems/rspec-support-3.12.0/lib
/usr/local/Homebrew/Library/Homebrew/vendor/bundle/ruby/2.6.0/gems/ronn-0.7.3/lib
/usr/local/Homebrew/Library/Homebrew/vendor/bundle/ruby/2.6.0/gems/rexml-3.2.5/lib
/usr/local/Homebrew/Library/Homebrew/vendor/bundle/ruby/2.6.0/gems/rdiscount-2.2.7/lib
/usr/local/Homebrew/Library/Homebrew/vendor/bundle/ruby/2.6.0/extensions/universal-darwin-21/2.6.0/rdiscount-2.2.7
/usr/local/Homebrew/Library/Homebrew/vendor/bundle/ruby/2.6.0/gems/rbi-0.0.14/lib
/usr/local/Homebrew/Library/Homebrew/vendor/bundle/ruby/2.6.0/gems/unparser-0.6.4/lib
/usr/local/Homebrew/Library/Homebrew/vendor/bundle/ruby/2.6.0/gems/rack-3.0.2/lib
/usr/local/Homebrew/Library/Homebrew/vendor/bundle/ruby/2.6.0/gems/pry-0.14.1/lib
/usr/local/Homebrew/Library/Homebrew/vendor/bundle/ruby/2.6.0/gems/plist-3.6.0/lib
/usr/local/Homebrew/Library/Homebrew/vendor/bundle/ruby/2.6.0/gems/patchelf-1.4.0/lib
/usr/local/Homebrew/Library/Homebrew/vendor/bundle/ruby/2.6.0/gems/parlour-8.0.0/lib
/usr/local/Homebrew/Library/Homebrew/vendor/bundle/ruby/2.6.0/gems/sorbet-runtime-0.5.10461/lib
/usr/local/Homebrew/Library/Homebrew/vendor/bundle/ruby/2.6.0/gems/rainbow-3.1.1/lib
/usr/local/Homebrew/Library/Homebrew/vendor/bundle/ruby/2.6.0/gems/parser-3.1.3.0/lib
/usr/local/Homebrew/Library/Homebrew/vendor/bundle/ruby/2.6.0/gems/parallel_tests-3.13.0/lib
/usr/local/Homebrew/Library/Homebrew/vendor/bundle/ruby/2.6.0/gems/parallel-1.22.1/lib
/usr/local/Homebrew/Library/Homebrew/vendor/bundle/ruby/2.6.0/gems/mustache-1.1.1/lib
/usr/local/Homebrew/Library/Homebrew/vendor/bundle/ruby/2.6.0/gems/method_source-1.0.0/lib
/usr/local/Homebrew/Library/Homebrew/vendor/bundle/ruby/2.6.0/gems/mechanize-2.8.5/lib
/usr/local/Homebrew/Library/Homebrew/vendor/bundle/ruby/2.6.0/gems/webrobots-0.1.2/lib
/usr/local/Homebrew/Library/Homebrew/vendor/bundle/ruby/2.6.0/gems/webrick-1.7.0/lib
/usr/local/Homebrew/Library/Homebrew/vendor/bundle/ruby/2.6.0/gems/rubyntlm-0.6.3/lib
/usr/local/Homebrew/Library/Homebrew/vendor/bundle/ruby/2.6.0/gems/nokogiri-1.13.10-x86_64-darwin/lib
/usr/local/Homebrew/Library/Homebrew/vendor/bundle/ruby/2.6.0/gems/racc-1.6.1/lib
/usr/local/Homebrew/Library/Homebrew/vendor/bundle/ruby/2.6.0/extensions/universal-darwin-21/2.6.0/racc-1.6.1
/usr/local/Homebrew/Library/Homebrew/vendor/bundle/ruby/2.6.0/gems/net-http-persistent-4.0.1/lib
/usr/local/Homebrew/Library/Homebrew/vendor/bundle/ruby/2.6.0/gems/net-http-digest_auth-1.4.1/lib
/usr/local/Homebrew/Library/Homebrew/vendor/bundle/ruby/2.6.0/gems/mime-types-3.4.1/lib
/usr/local/Homebrew/Library/Homebrew/vendor/bundle/ruby/2.6.0/gems/mime-types-data-3.2022.0105/lib
/usr/local/Homebrew/Library/Homebrew/vendor/bundle/ruby/2.6.0/gems/json_schemer-0.2.24/lib
/usr/local/Homebrew/Library/Homebrew/vendor/bundle/ruby/2.6.0/gems/uri_template-0.7.0/lib
/usr/local/Homebrew/Library/Homebrew/vendor/bundle/ruby/2.6.0/gems/json-2.6.3/lib
/usr/local/Homebrew/Library/Homebrew/vendor/bundle/ruby/2.6.0/extensions/universal-darwin-21/2.6.0/json-2.6.3
/usr/local/Homebrew/Library/Homebrew/vendor/bundle/ruby/2.6.0/gems/http-cookie-1.0.5/lib
/usr/local/Homebrew/Library/Homebrew/vendor/bundle/ruby/2.6.0/gems/hpricot-0.8.6/lib
/usr/local/Homebrew/Library/Homebrew/vendor/bundle/ruby/2.6.0/extensions/universal-darwin-21/2.6.0/hpricot-0.8.6
/usr/local/Homebrew/Library/Homebrew/vendor/bundle/ruby/2.6.0/gems/hana-1.3.7/lib
/usr/local/Homebrew/Library/Homebrew/vendor/bundle/ruby/2.6.0/gems/elftools-1.2.0/lib
/usr/local/Homebrew/Library/Homebrew/vendor/bundle/ruby/2.6.0/gems/ecma-re-validator-0.4.0/lib
/usr/local/Homebrew/Library/Homebrew/vendor/bundle/ruby/2.6.0/gems/regexp_parser-2.6.1/lib
/usr/local/Homebrew/Library/Homebrew/vendor/bundle/ruby/2.6.0/gems/domain_name-0.5.20190701/lib
/usr/local/Homebrew/Library/Homebrew/vendor/bundle/ruby/2.6.0/gems/unf-0.1.4/lib
/usr/local/Homebrew/Library/Homebrew/vendor/bundle/ruby/2.6.0/gems/unf_ext-0.0.8.2/lib
/usr/local/Homebrew/Library/Homebrew/vendor/bundle/ruby/2.6.0/extensions/universal-darwin-21/2.6.0/unf_ext-0.0.8.2
/usr/local/Homebrew/Library/Homebrew/vendor/bundle/ruby/2.6.0/gems/docile-1.4.0/lib
/usr/local/Homebrew/Library/Homebrew/vendor/bundle/ruby/2.6.0/gems/diff-lcs-1.5.0/lib
/usr/local/Homebrew/Library/Homebrew/vendor/bundle/ruby/2.6.0/gems/did_you_mean-1.6.2/lib
/usr/local/Homebrew/Library/Homebrew/vendor/bundle/ruby/2.6.0/gems/connection_pool-2.3.0/lib
/usr/local/Homebrew/Library/Homebrew/vendor/bundle/ruby/2.6.0/gems/commander-4.6.0/lib
/usr/local/Homebrew/Library/Homebrew/vendor/bundle/ruby/2.6.0/gems/highline-2.0.3/lib
/usr/local/Homebrew/Library/Homebrew/vendor/bundle/ruby/2.6.0/gems/coderay-1.1.3/lib
/usr/local/Homebrew/Library/Homebrew/vendor/bundle/ruby/2.6.0/gems/byebug-11.1.3/lib
/usr/local/Homebrew/Library/Homebrew/vendor/bundle/ruby/2.6.0/extensions/universal-darwin-21/2.6.0/byebug-11.1.3
/usr/local/Homebrew/Library/Homebrew/vendor/bundle/ruby/2.6.0/gems/bootsnap-1.15.0/lib
/usr/local/Homebrew/Library/Homebrew/vendor/bundle/ruby/2.6.0/extensions/universal-darwin-21/2.6.0/bootsnap-1.15.0
/usr/local/Homebrew/Library/Homebrew/vendor/bundle/ruby/2.6.0/gems/msgpack-1.6.0/lib
/usr/local/Homebrew/Library/Homebrew/vendor/bundle/ruby/2.6.0/extensions/universal-darwin-21/2.6.0/msgpack-1.6.0
/usr/local/Homebrew/Library/Homebrew/vendor/bundle/ruby/2.6.0/gems/bindata-2.4.14/lib
/usr/local/Homebrew/Library/Homebrew/vendor/bundle/ruby/2.6.0/gems/ast-2.4.2/lib
/usr/local/Homebrew/Library/Homebrew/vendor/bundle/ruby/2.6.0/gems/addressable-2.8.1/lib
/usr/local/Homebrew/Library/Homebrew/vendor/bundle/ruby/2.6.0/gems/public_suffix-5.0.1/lib
/usr/local/Homebrew/Library/Homebrew/vendor/bundle/ruby/2.6.0/gems/activesupport-6.1.7/lib
/usr/local/Homebrew/Library/Homebrew/vendor/bundle/ruby/2.6.0/gems/zeitwerk-2.6.6/lib
/usr/local/Homebrew/Library/Homebrew/vendor/bundle/ruby/2.6.0/gems/tzinfo-2.0.5/lib
/usr/local/Homebrew/Library/Homebrew/vendor/bundle/ruby/2.6.0/gems/minitest-5.16.3/lib
/usr/local/Homebrew/Library/Homebrew/vendor/bundle/ruby/2.6.0/gems/i18n-1.12.0/lib
/usr/local/Homebrew/Library/Homebrew/vendor/bundle/ruby/2.6.0/gems/concurrent-ruby-1.1.10/lib/concurrent-ruby
/System/Library/Frameworks/Ruby.framework/Versions/2.6/usr/lib/ruby/vendor_ruby/2.6.0
/System/Library/Frameworks/Ruby.framework/Versions/2.6/usr/lib/ruby/vendor_ruby/2.6.0/x86_64-darwin22
/System/Library/Frameworks/Ruby.framework/Versions/2.6/usr/lib/ruby/vendor_ruby/2.6.0/universal-darwin22
/System/Library/Frameworks/Ruby.framework/Versions/2.6/usr/lib/ruby/vendor_ruby
/System/Library/Frameworks/Ruby.framework/Versions/2.6/usr/lib/ruby/2.6.0
/System/Library/Frameworks/Ruby.framework/Versions/2.6/usr/lib/ruby/2.6.0/x86_64-darwin22
/System/Library/Frameworks/Ruby.framework/Versions/2.6/usr/lib/ruby/2.6.0/universal-darwin22
/usr/local/Homebrew/Library/Homebrew
==> Homebrew Version
3.6.15
==> macOS
13.1
==> SIP
Enabled
==> Homebrew Cask Staging Location
/usr/local/Caskroom
==> Homebrew Cask Taps:
/usr/local/Homebrew/Library/Taps/homebrew/homebrew-cask (4136 casks)
Your system is ready to brew.
```
### Output of `brew tap`
```shell
homebrew/bundle
homebrew/cask
homebrew/core
```
| main | “nova app” is damaged and can’t be opened verification i understand that i have retried my command with force i ran brew update reset brew update and retried my command i ran brew doctor fixed as many issues as possible and retried my command i have checked the instructions for i made doubly sure this is not a error description of issue ref copied description and some info from that issue since it was closed and locked brew doctor and brew tap output are from my environment freshly installed nova fails with the app is broken dialog below which according to the help opened on clicking the icon states the app doesn t match it s code signature uninstalling deleting the download from cache and reinstalling results in same again i ran codesign verify verbose downloads nova app for the version downloaded manually from the nova website users cweagans downloads nova app valid on disk users cweagans downloads nova app satisfies its designated requirement here s what it looks like for the cask installed version codesign verify verbose usr local caskroom nova nova app usr local caskroom nova nova app a sealed resource is missing or invalid file modified applications nova app contents resources nova completions txt that appears to be a file referenced in the formula but i m not sure exactly what that line does this appears to be the issue though cweagans nibbler ls alh applications nova app contents resources nova completions txt lrwxr xr x cweagans admin dec applications nova app contents resources nova completions txt usr local share zsh site functions nova cweagans nibbler ls alh downloads nova app contents resources nova completions txt rw r r cweagans staff dec users cweagans downloads nova app contents resources nova completions txt fixing the permissions on the symlink manually didn t seem to help perhaps it would be better to change the direction of the link so that usr local share zsh site functions nova is a symlink to applications nova app contents 
resources nova completions txt or maybe logancollins has some idea apologies for the direct ping you were the only one on the panic org who identified themselves as a nova engineer command that failed nova or open a nova output of command with verbose debug shell nova has no verbose or debug flags output of brew doctor verbose shell cask environment variables bundle path chruby version gem home gem path homebrew cask opts lc all path rbenv version rubylib rubyopt rubypath shell load paths usr local homebrew library homebrew vendor bundle ruby gems warning lib usr local homebrew library homebrew vendor bundle ruby gems tapioca lib usr local homebrew library homebrew vendor bundle ruby gems yard sorbet lib usr local homebrew library homebrew vendor bundle ruby gems yard lib usr local homebrew library homebrew vendor bundle ruby gems spoom lib usr local homebrew library homebrew vendor bundle ruby gems thor lib usr local homebrew library homebrew vendor bundle ruby gems sorbet static and runtime lib usr local homebrew library homebrew vendor bundle ruby gems sorbet lib usr local homebrew library homebrew vendor bundle ruby gems sorbet static universal darwin lib usr local homebrew library homebrew vendor bundle ruby gems simplecov cobertura lib usr local homebrew library homebrew vendor bundle ruby gems simplecov lib usr local homebrew library homebrew vendor bundle ruby gems simplecov json formatter lib usr local homebrew library homebrew vendor bundle ruby gems simplecov html lib usr local homebrew library homebrew vendor bundle ruby gems ruby macho lib usr local homebrew library homebrew vendor bundle ruby gems rubocop sorbet lib usr local homebrew library homebrew vendor bundle ruby gems rubocop rspec lib usr local homebrew library homebrew vendor bundle ruby gems rubocop rails lib usr local homebrew library homebrew vendor bundle ruby gems rubocop performance lib usr local homebrew library homebrew vendor bundle ruby gems rubocop lib usr local homebrew library 
homebrew vendor bundle ruby gems unicode display width lib usr local homebrew library homebrew vendor bundle ruby gems ruby progressbar lib usr local homebrew library homebrew vendor bundle ruby gems rubocop ast lib usr local homebrew library homebrew vendor bundle ruby gems rspec junit formatter lib usr local homebrew library homebrew vendor bundle ruby gems rspec wait lib usr local homebrew library homebrew vendor bundle ruby gems rspec sorbet lib usr local homebrew library homebrew vendor bundle ruby gems rspec retry lib usr local homebrew library homebrew vendor bundle ruby gems rspec its lib usr local homebrew library homebrew vendor bundle ruby gems rspec github lib usr local homebrew library homebrew vendor bundle ruby gems rspec lib usr local homebrew library homebrew vendor bundle ruby gems rspec mocks lib usr local homebrew library homebrew vendor bundle ruby gems rspec expectations lib usr local homebrew library homebrew vendor bundle ruby gems rspec core lib usr local homebrew library homebrew vendor bundle ruby gems rspec support lib usr local homebrew library homebrew vendor bundle ruby gems ronn lib usr local homebrew library homebrew vendor bundle ruby gems rexml lib usr local homebrew library homebrew vendor bundle ruby gems rdiscount lib usr local homebrew library homebrew vendor bundle ruby extensions universal darwin rdiscount usr local homebrew library homebrew vendor bundle ruby gems rbi lib usr local homebrew library homebrew vendor bundle ruby gems unparser lib usr local homebrew library homebrew vendor bundle ruby gems rack lib usr local homebrew library homebrew vendor bundle ruby gems pry lib usr local homebrew library homebrew vendor bundle ruby gems plist lib usr local homebrew library homebrew vendor bundle ruby gems patchelf lib usr local homebrew library homebrew vendor bundle ruby gems parlour lib usr local homebrew library homebrew vendor bundle ruby gems sorbet runtime lib usr local homebrew library homebrew vendor bundle ruby 
gems rainbow lib usr local homebrew library homebrew vendor bundle ruby gems parser lib usr local homebrew library homebrew vendor bundle ruby gems parallel tests lib usr local homebrew library homebrew vendor bundle ruby gems parallel lib usr local homebrew library homebrew vendor bundle ruby gems mustache lib usr local homebrew library homebrew vendor bundle ruby gems method source lib usr local homebrew library homebrew vendor bundle ruby gems mechanize lib usr local homebrew library homebrew vendor bundle ruby gems webrobots lib usr local homebrew library homebrew vendor bundle ruby gems webrick lib usr local homebrew library homebrew vendor bundle ruby gems rubyntlm lib usr local homebrew library homebrew vendor bundle ruby gems nokogiri darwin lib usr local homebrew library homebrew vendor bundle ruby gems racc lib usr local homebrew library homebrew vendor bundle ruby extensions universal darwin racc usr local homebrew library homebrew vendor bundle ruby gems net http persistent lib usr local homebrew library homebrew vendor bundle ruby gems net http digest auth lib usr local homebrew library homebrew vendor bundle ruby gems mime types lib usr local homebrew library homebrew vendor bundle ruby gems mime types data lib usr local homebrew library homebrew vendor bundle ruby gems json schemer lib usr local homebrew library homebrew vendor bundle ruby gems uri template lib usr local homebrew library homebrew vendor bundle ruby gems json lib usr local homebrew library homebrew vendor bundle ruby extensions universal darwin json usr local homebrew library homebrew vendor bundle ruby gems http cookie lib usr local homebrew library homebrew vendor bundle ruby gems hpricot lib usr local homebrew library homebrew vendor bundle ruby extensions universal darwin hpricot usr local homebrew library homebrew vendor bundle ruby gems hana lib usr local homebrew library homebrew vendor bundle ruby gems elftools lib usr local homebrew library homebrew vendor bundle ruby gems 
ecma re validator lib usr local homebrew library homebrew vendor bundle ruby gems regexp parser lib usr local homebrew library homebrew vendor bundle ruby gems domain name lib usr local homebrew library homebrew vendor bundle ruby gems unf lib usr local homebrew library homebrew vendor bundle ruby gems unf ext lib usr local homebrew library homebrew vendor bundle ruby extensions universal darwin unf ext usr local homebrew library homebrew vendor bundle ruby gems docile lib usr local homebrew library homebrew vendor bundle ruby gems diff lcs lib usr local homebrew library homebrew vendor bundle ruby gems did you mean lib usr local homebrew library homebrew vendor bundle ruby gems connection pool lib usr local homebrew library homebrew vendor bundle ruby gems commander lib usr local homebrew library homebrew vendor bundle ruby gems highline lib usr local homebrew library homebrew vendor bundle ruby gems coderay lib usr local homebrew library homebrew vendor bundle ruby gems byebug lib usr local homebrew library homebrew vendor bundle ruby extensions universal darwin byebug usr local homebrew library homebrew vendor bundle ruby gems bootsnap lib usr local homebrew library homebrew vendor bundle ruby extensions universal darwin bootsnap usr local homebrew library homebrew vendor bundle ruby gems msgpack lib usr local homebrew library homebrew vendor bundle ruby extensions universal darwin msgpack usr local homebrew library homebrew vendor bundle ruby gems bindata lib usr local homebrew library homebrew vendor bundle ruby gems ast lib usr local homebrew library homebrew vendor bundle ruby gems addressable lib usr local homebrew library homebrew vendor bundle ruby gems public suffix lib usr local homebrew library homebrew vendor bundle ruby gems activesupport lib usr local homebrew library homebrew vendor bundle ruby gems zeitwerk lib usr local homebrew library homebrew vendor bundle ruby gems tzinfo lib usr local homebrew library homebrew vendor bundle ruby gems 
minitest lib usr local homebrew library homebrew vendor bundle ruby gems lib usr local homebrew library homebrew vendor bundle ruby gems concurrent ruby lib concurrent ruby system library frameworks ruby framework versions usr lib ruby vendor ruby system library frameworks ruby framework versions usr lib ruby vendor ruby system library frameworks ruby framework versions usr lib ruby vendor ruby universal system library frameworks ruby framework versions usr lib ruby vendor ruby system library frameworks ruby framework versions usr lib ruby system library frameworks ruby framework versions usr lib ruby system library frameworks ruby framework versions usr lib ruby universal usr local homebrew library homebrew homebrew version macos sip enabled homebrew cask staging location usr local caskroom homebrew cask taps usr local homebrew library taps homebrew homebrew cask casks your system is ready to brew output of brew tap shell homebrew bundle homebrew cask homebrew core | 1 |
55,687 | 30,886,002,491 | IssuesEvent | 2023-08-03 21:54:33 | JuliaLang/julia | https://api.github.com/repos/JuliaLang/julia | opened | Optimized/unrolled searchsortedfirst etc. | performance | Would be nice to have optimized methods for `searchsortedfirst` for tuples, which completely unrolls a binary search.
Not something that can be done in a package without renaming the function, due to type piracy. | True | Optimized/unrolled searchsortedfirst etc. - Would be nice to have optimized methods for `searchsortedfirst` for tuples, which completely unrolls a binary search.
Not something that can be done in a package without renaming the function, due to type piracy. | non_main | optimized unrolled searchsortedfirst etc would be nice to have optimized methods for searchsortedfirst for tuples which completely unrolls a binary search not something that can be done in a package without renaming the function due to type piracy | 0 |
2,291 | 8,160,258,566 | IssuesEvent | 2018-08-24 00:15:54 | tgstation/tgstation | https://api.github.com/repos/tgstation/tgstation | closed | Runtime in getrev.dm,18: list index out of bounds | GitHub Maintainability/Hinders improvements Runtime |
[02:43:16] Runtime in getrev.dm,18: list index out of bounds
proc name: New (/datum/getrev/New)
src: /datum/getrev (/datum/getrev)
call stack:
/datum/getrev (/datum/getrev): New()
Global Variables (/datum/controller/global_vars): InitGlobalrevdata()
Global Variables (/datum/controller/global_vars): Initialize()
Global Variables (/datum/controller/global_vars): New()
Master (/datum/controller/master): New()
world: ()
runtime error:
[02:43:16]list index out of bounds
[02:43:16]proc name: New (/datum/getrev/New)
[02:43:16] source file: getrev.dm,18
[02:43:16] usr: null
[02:43:16] src: /datum/getrev (/datum/getrev)
[02:43:16] call stack:
[02:43:16]/datum/getrev (/datum/getrev): New()
[02:43:16]Global Variables (/datum/controller/global_vars): InitGlobalrevdata()
[02:43:16]Global Variables (/datum/controller/global_vars): Initialize()
[02:43:16]Global Variables (/datum/controller/global_vars): New()
[02:43:16]Master (/datum/controller/master): New()
[02:43:16]world: ()
| True | Runtime in getrev.dm,18: list index out of bounds -
[02:43:16] Runtime in getrev.dm,18: list index out of bounds
proc name: New (/datum/getrev/New)
src: /datum/getrev (/datum/getrev)
call stack:
/datum/getrev (/datum/getrev): New()
Global Variables (/datum/controller/global_vars): InitGlobalrevdata()
Global Variables (/datum/controller/global_vars): Initialize()
Global Variables (/datum/controller/global_vars): New()
Master (/datum/controller/master): New()
world: ()
runtime error:
[02:43:16]list index out of bounds
[02:43:16]proc name: New (/datum/getrev/New)
[02:43:16] source file: getrev.dm,18
[02:43:16] usr: null
[02:43:16] src: /datum/getrev (/datum/getrev)
[02:43:16] call stack:
[02:43:16]/datum/getrev (/datum/getrev): New()
[02:43:16]Global Variables (/datum/controller/global_vars): InitGlobalrevdata()
[02:43:16]Global Variables (/datum/controller/global_vars): Initialize()
[02:43:16]Global Variables (/datum/controller/global_vars): New()
[02:43:16]Master (/datum/controller/master): New()
[02:43:16]world: ()
| main | runtime in getrev dm list index out of bounds runtime in getrev dm list index out of bounds proc name new datum getrev new src datum getrev datum getrev call stack datum getrev datum getrev new global variables datum controller global vars initglobalrevdata global variables datum controller global vars initialize global variables datum controller global vars new master datum controller master new world runtime error list index out of bounds proc name new datum getrev new source file getrev dm usr null src datum getrev datum getrev call stack datum getrev datum getrev new global variables datum controller global vars initglobalrevdata global variables datum controller global vars initialize global variables datum controller global vars new master datum controller master new world | 1 |
14,018 | 10,578,649,318 | IssuesEvent | 2019-10-07 23:27:04 | oppia/oppia-android | https://api.github.com/repos/oppia/oppia-android | closed | Introduce interface for ExplorationProgressController | Priority: Essential Status: In implementation Type: Improvement Where: Infrastructure Workstream: Domain Interface | This is tracking introducing a stubbed interface for #115 without the real implementation being complete. | 1.0 | Introduce interface for ExplorationProgressController - This is tracking introducing a stubbed interface for #115 without the real implementation being complete. | non_main | introduce interface for explorationprogresscontroller this is tracking introducing a stubbed interface for without the real implementation being complete | 0 |
2,854 | 10,242,750,487 | IssuesEvent | 2019-08-20 06:16:00 | pace/bricks | https://api.github.com/repos/pace/bricks | closed | Add label for all http request related stats to filter | T::Maintainance | In order to filter technical stats from the Grafana dashboards, add a label to all request related stats.
`Request-Source: (uptime|kubernetes|nginx|livetest)`
|Name|Description|
|-|-|
|*none*|Regular requests|
|`uptime`|Uptime Robot and similar sources|
|`kubernetes`|Kubernetes health/alive checks|
|`nginx`|Backend availability checks|
|`livetest`|Backend functional checks|
The set needs to be checked, otherwise, one could DOS our Prometheus my using random labels (explosion of TSDBs) | True | Add label for all http request related stats to filter - In order to filter technical stats from the Grafana dashboards, add a label to all request related stats.
`Request-Source: (uptime|kubernetes|nginx|livetest)`
|Name|Description|
|-|-|
|*none*|Regular requests|
|`uptime`|Uptime Robot and similar sources|
|`kubernetes`|Kubernetes health/alive checks|
|`nginx`|Backend availability checks|
|`livetest`|Backend functional checks|
The set needs to be checked, otherwise, one could DOS our Prometheus my using random labels (explosion of TSDBs) | main | add label for all http request related stats to filter in order to filter technical stats from the grafana dashboards add a label to all request related stats request source uptime kubernetes nginx livetest name description none regular requests uptime uptime robot and similar sources kubernetes kubernetes health alive checks nginx backend availability checks livetest backend functional checks the set needs to be checked otherwise one could dos our prometheus my using random labels explosion of tsdbs | 1 |
966 | 2,588,724,290 | IssuesEvent | 2015-02-18 05:05:09 | retailcoder/Rubberduck | https://api.github.com/repos/retailcoder/Rubberduck | opened | Add a context menu in the Code Explorer | code-explorer user-interface | Let's add a right-click context menu for the Code Explorer tree nodes.
- [ ] **Refresh**, to re-parse code changes into the tree view.
- [ ] **Navigate**, to activate a code pane for the selected source file - if module-level, selection should be (1,1,1,1).
- [ ] **Rename...**, to rename node / project / module / member, and update all references in project.
- [ ] Add > **Class**, to add a new class module.
- [ ] Add > **Module**, to add a new standard module.
- [ ] Add > **Form**, to add a new form.
- [ ] **Inspect**, to run code inspections.
- [ ] Source Control > **Commit**, to save changes to local repository.
- [ ] Source Control > **Commit & Push**, to save changes to local repository and push to remote.
- [ ] Source Control > **Discard changes**, to reload the unchanged version, discarding all uncommitted changes.
| 1.0 | Add a context menu in the Code Explorer - Let's add a right-click context menu for the Code Explorer tree nodes.
- [ ] **Refresh**, to re-parse code changes into the tree view.
- [ ] **Navigate**, to activate a code pane for the selected source file - if module-level, selection should be (1,1,1,1).
- [ ] **Rename...**, to rename node / project / module / member, and update all references in project.
- [ ] Add > **Class**, to add a new class module.
- [ ] Add > **Module**, to add a new standard module.
- [ ] Add > **Form**, to add a new form.
- [ ] **Inspect**, to run code inspections.
- [ ] Source Control > **Commit**, to save changes to local repository.
- [ ] Source Control > **Commit & Push**, to save changes to local repository and push to remote.
- [ ] Source Control > **Discard changes**, to reload the unchanged version, discarding all uncommitted changes.
| non_main | add a context menu in the code explorer let s add a right click context menu for the code explorer tree nodes refresh to re parse code changes into the tree view navigate to activate a code pane for the selected source file if module level selection rename to rename node project module member and update all references in project should be add class to add a new class module add module to add a new standard module add form to add a new form inspect to run code inspections source control commit to save changes to local repository source control commit push to save changes to local repository and push to remote source control discard changes to reload the unchanged version discarding all uncommitted changes | 0 |
98,233 | 20,622,582,543 | IssuesEvent | 2022-03-07 18:56:20 | senthilrch/kube-fledged | https://api.github.com/repos/senthilrch/kube-fledged | closed | Add README.md to kube-fledged helm chart | enhancement good first issue SODACODE2022 SODALOW | It is a good practice to include README.md in helm chart. Add a README.md file to kube-fledged's helm chart located in:
./deploy/kubefledged-operator/helm-charts/kubefledged
The README should have following info:-
- Short description of kube-fledged
kube-fledged is a kubernetes operator for creating and managing a cache of container images directly on the worker nodes of a kubernetes cluster. It allows a user to define a list of images and onto which worker nodes those images should be cached (i.e. pulled). As a result, application pods start almost instantly, since the images need not be pulled from the registry. kube-fledged provides CRUD APIs to manage the lifecycle of the image cache, and supports several configurable parameters to customize the functioning as per one's needs.
- How to install the chart (https://github.com/senthilrch/kube-fledged#quick-install-using-helm-chart)
- Chart parameters (https://github.com/senthilrch/kube-fledged/blob/master/docs/helm-parameters.md) | 1.0 | Add README.md to kube-fledged helm chart - It is a good practice to include README.md in helm chart. Add a README.md file to kube-fledged's helm chart located in:
./deploy/kubefledged-operator/helm-charts/kubefledged
The README should have following info:-
- Short description of kube-fledged
kube-fledged is a kubernetes operator for creating and managing a cache of container images directly on the worker nodes of a kubernetes cluster. It allows a user to define a list of images and onto which worker nodes those images should be cached (i.e. pulled). As a result, application pods start almost instantly, since the images need not be pulled from the registry. kube-fledged provides CRUD APIs to manage the lifecycle of the image cache, and supports several configurable parameters to customize the functioning as per one's needs.
- How to install the chart (https://github.com/senthilrch/kube-fledged#quick-install-using-helm-chart)
- Chart parameters (https://github.com/senthilrch/kube-fledged/blob/master/docs/helm-parameters.md) | non_main | add readme md to kube fledged helm chart it is a good practice to include readme md in helm chart add a readme md file to kube fledged s helm chart located in deploy kubefledged operator helm charts kubefledged the readme should have following info short description of kube fledged kube fledged is a kubernetes operator for creating and managing a cache of container images directly on the worker nodes of a kubernetes cluster it allows a user to define a list of images and onto which worker nodes those images should be cached i e pulled as a result application pods start almost instantly since the images need not be pulled from the registry kube fledged provides crud apis to manage the lifecycle of the image cache and supports several configurable parameters to customize the functioning as per one s needs how to install the chart chart parameters | 0 |
8,587 | 11,757,480,932 | IssuesEvent | 2020-03-13 13:46:06 | NationalSecurityAgency/ghidra | https://api.github.com/repos/NationalSecurityAgency/ghidra | closed | [M68000] decompiler cannot create correct array assignments | Feature: Processor/68000 Type: Bug | The M68000 decompiler incorrectly translates the following code fragment:
```
0402cf06 30 2d ff move.w (i,A5),D0w
fe
0402cf0a 48 c0 ext.l D0
0402cf0c e3 80 asl.l #0x1,D0
0402cf0e 41 f9 04 lea (SHORT_ARRAY_0403f996).l,A0 =
03 f9 96
0402cf14 32 2d ff move.w (i,A5),D1w
fe
0402cf18 48 c1 ext.l D1
0402cf1a e3 81 asl.l #0x1,D1
0402cf1c 43 f9 04 lea (SHORT_ARRAY_04042364).l,A1 =
04 23 64
0402cf22 33 b0 08 move.w (0x0,A0,D0*0x1),(0x0,A1,D1*0x1)=>SHORT_AR =
00 18 00
```
It generates a "self" assignment, ignoring the source:
```
SHORT_ARRAY_04042364[i] = SHORT_ARRAY_04042364[i];
```
This would be correct:
```
SHORT_ARRAY_04042364[i] = SHORT_ARRAY_0403f996[i]
```
Somehow it "forgets" the A0 address and uses the A1 address twice.
**Environment (please complete the following information):**
- OS: macOS 10.15.2
- Java Version: 13.0
- Ghidra Version: 9.1.1
| 1.0 | [M68000] decompiler cannot create correct array assignments - The M68000 decompiler incorrectly translates the following code fragment:
```
0402cf06 30 2d ff move.w (i,A5),D0w
fe
0402cf0a 48 c0 ext.l D0
0402cf0c e3 80 asl.l #0x1,D0
0402cf0e 41 f9 04 lea (SHORT_ARRAY_0403f996).l,A0 =
03 f9 96
0402cf14 32 2d ff move.w (i,A5),D1w
fe
0402cf18 48 c1 ext.l D1
0402cf1a e3 81 asl.l #0x1,D1
0402cf1c 43 f9 04 lea (SHORT_ARRAY_04042364).l,A1 =
04 23 64
0402cf22 33 b0 08 move.w (0x0,A0,D0*0x1),(0x0,A1,D1*0x1)=>SHORT_AR =
00 18 00
```
It generates a "self" assignment, ignoring the source:
```
SHORT_ARRAY_04042364[i] = SHORT_ARRAY_04042364[i];
```
This would be correct:
```
SHORT_ARRAY_04042364[i] = SHORT_ARRAY_0403f996[i]
```
Somehow it "forgets" the A0 address and uses the A1 address twice.
**Environment (please complete the following information):**
- OS: macOS 10.15.2
- Java Version: 13.0
- Ghidra Version: 9.1.1
| non_main | decompiler cannot create correct array assignments the decompiler incorrectly translates the following code fragment ff move w i fe ext l asl l lea short array l ff move w i fe ext l asl l lea short array l move w short ar it generates a self assignment ignoring the source short array short array this would be correct short array short array somehow it forgets the address and uses the address twice environment please complete the following information os macos java version ghidra version | 0 |
733,424 | 25,305,603,150 | IssuesEvent | 2022-11-17 13:58:30 | Hexlet/runit | https://api.github.com/repos/Hexlet/runit | opened | Fix style | good first issue help wanted frontend Priority: High | Now the wrong styles in the modal when the user is not registered. We need to do what we do elsewhere
<img width="1440" alt="image" src="https://user-images.githubusercontent.com/77053797/202465262-b3ea3772-8d36-4d22-a11b-8932ffc4a983.png">
| 1.0 | Fix style - Now the wrong styles in the modal when the user is not registered. We need to do what we do elsewhere
<img width="1440" alt="image" src="https://user-images.githubusercontent.com/77053797/202465262-b3ea3772-8d36-4d22-a11b-8932ffc4a983.png">
| non_main | fix style now the wrong styles in the modal when the user is not registered we need to do what we do elsewhere img width alt image src | 0 |
2,456 | 8,639,880,230 | IssuesEvent | 2018-11-23 22:20:00 | F5OEO/rpitx | https://api.github.com/repos/F5OEO/rpitx | closed | Transmitting AM from RFA is confusing/badly documented | V1 related (not maintained) enhancement | I was trying to transmit a simple AM modulated signal by writing the RFA file from my custom C program.
The file format is described in the help:
```
-m {IQ(FileInput is a Stereo Wav contains I on left Channel, Q on right channel)}
{IQFLOAT(FileInput is a Raw float interlaced I,Q)}
{RF(FileInput is a (double)Frequency,Time in nanoseconds}
{RFA(FileInput is a (double)Frequency,(int)Time in nanoseconds,(float)Amplitude}
{VFO (constant frequency)}
```
There are no further definitions on what are the ranges for these values. By looking at the sample files and code of various rpitx programs I was able to deduce the following:
- RF and RFA file format is identical, ie. even the RF file has the amplitude field "(double) frequency; (int) time; **(float) amplitude**;", at least it looks that way
- RFA does NOT use the amplitude field to store the amplitude value, it instead hijacks the frequency field for this purpose. Here is a snippet from [piam.c](https://github.com/F5OEO/rpitx/blob/master/am/piam.c):
```
typedef struct {
double Frequency;
uint32_t WaitForThisSample;
} samplerf_t;
samplerf_t RfSample;
```
- The only reason the file has a (float) sized field in it is because the structure above needs to be properly aligned to the 64bit memory boundary, hence the compiler enlarges structure size to 16B instead of 12B.
`readWrapper(&SampleRf,sizeof(samplerf_t)); \* sizeof() will return 16, not 12! *\`
- Finally the amplitude values should be in the range <0 , 32767>, anything in between is emulated by OOK.
Please document how AM mode is implemented and used, I have spent two good evenings trying to understand all of this :-)
Also the struct holding frequency and delay will likely produce incompatible files on 32bit and Big Endian CPUs...
Thanks! | True | Transmitting AM from RFA is confusing/badly documented - I was trying to transmit a simple AM modulated signal by writing the RFA file from my custom C program.
The file format is described in the help:
```
-m {IQ(FileInput is a Stereo Wav contains I on left Channel, Q on right channel)}
{IQFLOAT(FileInput is a Raw float interlaced I,Q)}
{RF(FileInput is a (double)Frequency,Time in nanoseconds}
{RFA(FileInput is a (double)Frequency,(int)Time in nanoseconds,(float)Amplitude}
{VFO (constant frequency)}
```
There are no further definitions on what are the ranges for these values. By looking at the sample files and code of various rpitx programs I was able to deduce the following:
- RF and RFA file format is identical, ie. even the RF file has the amplitude field "(double) frequency; (int) time; **(float) amplitude**;", at least it looks that way
- RFA does NOT use the amplitude field to store the amplitude value, it instead hijacks the frequency field for this purpose. Here is a snippet from [piam.c](https://github.com/F5OEO/rpitx/blob/master/am/piam.c):
```
typedef struct {
double Frequency;
uint32_t WaitForThisSample;
} samplerf_t;
samplerf_t RfSample;
```
- The only reason the file has a (float) sized field in it is because the structure above needs to be properly aligned to the 64bit memory boundary, hence the compiler enlarges structure size to 16B instead of 12B.
`readWrapper(&SampleRf,sizeof(samplerf_t)); \* sizeof() will return 16, not 12! *\`
- Finally the amplitude values should be in the range <0 , 32767>, anything in between is emulated by OOK.
Please document how AM mode is implemented and used, I have spent two good evenings trying to understand all of this :-)
Also the struct holding frequency and delay will likely produce incompatible files on 32bit and Big Endian CPUs...
Thanks! | main | transmitting am from rfa is confusing badly documented i was trying to transmit a simple am modulated signal by writing the rfa file from my custom c program the file format is described in the help m iq fileinput is a stereo wav contains i on left channel q on right channel iqfloat fileinput is a raw float interlaced i q rf fileinput is a double frequency time in nanoseconds rfa fileinput is a double frequency int time in nanoseconds float amplitude vfo constant frequency there are no further definitions on what are the ranges for these values by looking at the sample files and code of various rpitx programs i was able to deduce the following rf and rfa file format is identical ie even the rf file has the amplitude field double frequency int time float amplitude at least it looks that way rfa does not use the amplitude field to store the amplitude value it instead hijacks the frequency field for this purpose here is a snippet from typedef struct double frequency t waitforthissample samplerf t samplerf t rfsample the only reason the file has a float sized field in it is because the structure above needs to be properly aligned to the memory boundary hence the compiler enlarges structure size to instead of readwrapper samplerf sizeof samplerf t sizeof will return not finally the amplitude values should be in the range anything in between is emulated by ook please document how am mode is implemented and used i have spent two good evenings trying to understand all of this also the struct holding frequency and delay will likely produce incompatible files on and big endian cpus thanks | 1 |
407,610 | 27,622,329,747 | IssuesEvent | 2023-03-10 01:55:38 | PubInv/general-alarm-device | https://api.github.com/repos/PubInv/general-alarm-device | opened | Add Import DailyStruggleButton library to the list of required in the Arduino IDE | documentation firmware | Add to a the appropriate readme.md instructions for the future developers that this project uses DailyStruggleButton library.
Instructions for finding and loading the library would be nice. | 1.0 | Add Import DailyStruggleButton library to the list of required in the Arduino IDE - Add to a the appropriate readme.md instructions for the future developers that this project uses DailyStruggleButton library.
Instructions for finding and loading the library would be nice. | non_main | add import dailystrugglebutton library to the list of required in the arduino ide add to a the appropriate readme md instructions for the future developers that this project uses dailystrugglebutton library instructions for finding and loading the library would be nice | 0 |
91,027 | 8,288,387,839 | IssuesEvent | 2018-09-19 11:51:22 | humera987/HumTestData | https://api.github.com/repos/humera987/HumTestData | opened | fx_test_proj : vault_create_valid_val | fx_test_proj | Project : fx_test_proj
Job : UAT
Env : UAT
Region : FXLabs/US_WEST_1
Result : fail
Status Code : 404
Headers : {X-Content-Type-Options=[nosniff], X-XSS-Protection=[1; mode=block], Cache-Control=[no-cache, no-store, max-age=0, must-revalidate], Pragma=[no-cache], Expires=[0], X-Frame-Options=[DENY], Content-Type=[application/json;charset=UTF-8], Transfer-Encoding=[chunked], Date=[Wed, 19 Sep 2018 11:51:11 GMT]}
Endpoint : http://13.56.210.25/vault
Request :
{
"key" : "TFKe",
"val" : "52hk74UY"
}
Response :
{
"timestamp" : "2018-09-19T11:51:11.999+0000",
"status" : 404,
"error" : "Not Found",
"message" : "No message available",
"path" : "/vault"
}
Logs :
Assertion [@Response.data.key == @Request.key] resolved-to [ == TFKe] result [Failed]Assertion [@Response.data.val == PASSWORD-MASKED] resolved-to [ == PASSWORD-MASKED] result [Failed]Assertion [@Response.errors == false] resolved-to [ == false] result [Failed]Assertion [@StatusCode == 200] resolved-to [404 == 200] result [Failed]
--- FX Bot --- | 1.0 | fx_test_proj : vault_create_valid_val - Project : fx_test_proj
Job : UAT
Env : UAT
Region : FXLabs/US_WEST_1
Result : fail
Status Code : 404
Headers : {X-Content-Type-Options=[nosniff], X-XSS-Protection=[1; mode=block], Cache-Control=[no-cache, no-store, max-age=0, must-revalidate], Pragma=[no-cache], Expires=[0], X-Frame-Options=[DENY], Content-Type=[application/json;charset=UTF-8], Transfer-Encoding=[chunked], Date=[Wed, 19 Sep 2018 11:51:11 GMT]}
Endpoint : http://13.56.210.25/vault
Request :
{
"key" : "TFKe",
"val" : "52hk74UY"
}
Response :
{
"timestamp" : "2018-09-19T11:51:11.999+0000",
"status" : 404,
"error" : "Not Found",
"message" : "No message available",
"path" : "/vault"
}
Logs :
Assertion [@Response.data.key == @Request.key] resolved-to [ == TFKe] result [Failed]Assertion [@Response.data.val == PASSWORD-MASKED] resolved-to [ == PASSWORD-MASKED] result [Failed]Assertion [@Response.errors == false] resolved-to [ == false] result [Failed]Assertion [@StatusCode == 200] resolved-to [404 == 200] result [Failed]
--- FX Bot --- | non_main | fx test proj vault create valid val project fx test proj job uat env uat region fxlabs us west result fail status code headers x content type options x xss protection cache control pragma expires x frame options content type transfer encoding date endpoint request key tfke val response timestamp status error not found message no message available path vault logs assertion resolved to result assertion resolved to result assertion resolved to result assertion resolved to result fx bot | 0 |
12,580 | 14,892,495,934 | IssuesEvent | 2021-01-21 02:59:55 | P03W/RPGStats | https://api.github.com/repos/P03W/RPGStats | closed | Support for Harvest Scythes mod | enhancement mod compatibility | Hi! I have the Harvest Scythe mod installed, and when it's used to harvest crops, it doesn't give the farming stat the experience for harvesting. If support could be added for that, that would be great. Thanks! | True | Support for Harvest Scythes mod - Hi! I have the Harvest Scythe mod installed, and when it's used to harvest crops, it doesn't give the farming stat the experience for harvesting. If support could be added for that, that would be great. Thanks! | non_main | support for harvest scythes mod hi i have the harvest scythe mod installed and when it s used to harvest crops it doesn t give the farming stat the experience for harvesting if support could be added for that that would be great thanks | 0 |