Dataset schema (per-column dtype and value stats):

| Column | Dtype | Stats |
|---|---|---|
| Unnamed: 0 | int64 | 0 to 832k |
| id | float64 | 2.49B to 32.1B |
| type | string | 1 distinct value |
| created_at | string | length 19 |
| repo | string | length 7 to 112 |
| repo_url | string | length 36 to 141 |
| action | string | 3 distinct values |
| title | string | length 1 to 744 |
| labels | string | length 4 to 574 |
| body | string | length 9 to 211k |
| index | string | 10 distinct values |
| text_combine | string | length 96 to 211k |
| label | string | 2 distinct values |
| text | string | length 96 to 188k |
| binary_label | int64 | 0 to 1 |
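In the sample rows that follow, label tracks binary_label one-for-one (process rows carry binary_label 1, non_process rows 0), and text_combine is evidently the title and body joined with " - ". A minimal pandas sketch for loading the table and checking both relationships; the file name issues.csv is a placeholder, since this extract does not name the source file:

```python
import pandas as pd

# Placeholder path: the extract does not say where the table is stored.
df = pd.read_csv("issues.csv")

# Apparent label -> binary_label correspondence in the sample rows:
# "process" -> 1, "non_process" -> 0.
mapping = {"process": 1, "non_process": 0}
assert df["label"].map(mapping).equals(df["binary_label"])

# text_combine looks like title and body joined with " - ".
recombined = df["title"] + " - " + df["body"]
print((recombined == df["text_combine"]).mean())
```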
**Row 648,422** (id 21,185,664,414)

- **type:** IssuesEvent
- **created_at:** 2022-04-08 12:29:35
- **repo:** pa11y/org
- **repo_url:** https://api.github.com/repos/pa11y/org
- **action:** closed
- **title:** Bump all the required version to node 12+
- **labels:** type: maintenance priority: medium
- **body:** [Node 8 is unsupported as of 31st Dec 2019](https://github.com/nodejs/Release). We should try to ensure that all of the pa11y apps work fine with node 10+ in order to avoid as much as possible using unsupported versions, and drop support for node 8 whenever possible.
- **index:** 1.0
- **text_combine:** Bump all the required version to node 12+ - [Node 8 is unsupported as of 31st Dec 2019](https://github.com/nodejs/Release). We should try to ensure that all of the pa11y apps work fine with node 10+ in order to avoid as much as possible using unsupported versions, and drop support for node 8 whenever possible.
- **label:** non_process
- **text:** bump all the required version to node we should try to ensure that all of the apps work fine with node in order to avoid as much as possible using unsupported versions and drop support for node whenever possible
- **binary_label:** 0
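Comparing text_combine with text in the row above suggests how the text column is derived: bracketed spans and markdown links are removed outright, the string is lowercased and split on anything that is not a letter or digit, and any token still containing a digit is dropped (which is why pa11y, 12+, and 10+ all vanish). A rough reimplementation under exactly those inferred rules — it reproduces the text field for the sample rows in this extract, but the dataset's actual cleaning code is not shown here:

```python
import re

def clean(text_combine: str) -> str:
    # Inferred step 1: drop bracketed spans and markdown links wholesale,
    # anchor text and URL included ("[Node 8 is unsupported ...](...)").
    text = re.sub(r"\[[^\]]*\](\([^)]*\))?", " ", text_combine)
    # Inferred step 2: lowercase and split on any non-alphanumeric run,
    # so "WS/REST" -> "ws", "rest" and "a.k.a." -> "a", "k", "a".
    tokens = re.split(r"[^a-z0-9]+", text.lower())
    # Inferred step 3: drop empty tokens and tokens containing a digit,
    # which removes "12", "200", "pa11y", and "python3" alike.
    return " ".join(t for t in tokens if t and not any(c.isdigit() for c in t))
```

Applied to the text_combine value above, this yields exactly the text value above ("bump all the required version to node we should try ...").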
**Row 17,045** (id 10,594,209,852)

- **type:** IssuesEvent
- **created_at:** 2019-10-09 16:16:36
- **repo:** microsoft/BotFramework-Services
- **repo_url:** https://api.github.com/repos/microsoft/BotFramework-Services
- **action:** closed
- **title:** Streaming extensions: Conversation ID should not be required for starting a conversation
- **labels:** 4.6 Azure Bot Service P1 Streaming Extensions
- **body:** When comparing WS/REST vs. Streaming Extensions (a.k.a. SE), using token with conversation ID burnt (JWT token).
  - WS/REST
    - If conversation ID is not presented, it means creating or joining a conversation (POST)
      - The result will be a JSON with the conversation ID same as the one in the token
      - If conversation is already started, this call will succeed with HTTP 200
    - If conversation ID is presented, it means joining a conversation (GET)
  - SE
    - If conversation ID is not presented, it is an error (`Incorrect conversation id in request`)

  On the WS/REST API side, whether conversation ID is presented means either creating or joining a conversation. But on SE API, conversation ID is always required. Although these 3 protocols are not exactly equal, we should reduce inconsistencies as much as we could, so we would reduce confusion from our customers' standpoint.
- **index:** 1.0
- **text_combine:** Streaming extensions: Conversation ID should not be required for starting a conversation - When comparing WS/REST vs. Streaming Extensions (a.k.a. SE), using token with conversation ID burnt (JWT token). - WS/REST - If conversation ID is not presented, it means creating or joining a conversation (POST) - The result will be a JSON with the conversation ID same as the one in the token - If conversation is already started, this call will succeed with HTTP 200 - If conversation ID is presented, it means joining a conversation (GET) - SE - If conversation ID is not presented, it is an error (`Incorrect conversation id in request`) On the WS/REST API side, whether conversation ID is presented means either creating or joining a conversation. But on SE API, conversation ID is always required. Although these 3 protocols are not exactly equal, we should reduce inconsistencies as much as we could, so we would reduce confusion from our customers' standpoint.
- **label:** non_process
- **text:** streaming extensions conversation id should not be required for starting a conversation when comparing ws rest vs streaming extensions a k a se using token with conversation id burnt jwt token ws rest if conversation id is not presented it means creating or joining a conversation post the result will be a json with the conversation id same as the one in the token if conversation is already started this call will succeed with http if conversation id is presented it means joining a conversation get se if conversation id is not presented it is an error incorrect conversation id in request on the ws rest api side whether conversation id is presented means either creating or joining a conversation but on se api conversation id is always required although these protocols are not exactly equal we should reduce inconsistencies as much as we could so we would reduce confusion from our customers standpoint
- **binary_label:** 0
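The inconsistency this row describes is easiest to see as two calls: a POST with no conversation ID that creates or joins the conversation and echoes back the ID already burnt into the token (HTTP 200 even if the conversation exists), versus SE, which rejects the same request. The sketch below is purely illustrative — the endpoint, route, and response field name are hypothetical, not the actual Bot Framework API:

```python
import requests

BASE = "https://example.invalid/v3/conversations"  # hypothetical endpoint
token = "..."  # JWT with the conversation ID burnt in
headers = {"Authorization": f"Bearer {token}"}

# WS/REST semantics from the issue: POST without a conversation ID
# creates (or joins) the conversation; the response carries the same
# conversation ID as the token, with HTTP 200 even if it already exists.
resp = requests.post(BASE, headers=headers)
conversation_id = resp.json()["conversationId"]  # hypothetical field name

# Joining an existing conversation is a GET with the ID in the path.
requests.get(f"{BASE}/{conversation_id}", headers=headers)
```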
**Row 21,121** (id 28,090,461,612)

- **type:** IssuesEvent
- **created_at:** 2023-03-30 12:45:20
- **repo:** microprofile/microprofile-wg
- **repo_url:** https://api.github.com/repos/microprofile/microprofile-wg
- **action:** closed
- **title:** [Specification Process 1]: MicroProfile Platform release
- **labels:** Specification Process program plan
- **body:** Reach a decision on status of MicroProfile platform
  * Specification with TCK and compatible implementation
  * Marketing-only document
- **index:** 1.0
- **text_combine:** [Specification Process 1]: MicroProfile Platform release - Reach a decision on status of MicroProfile platform * Specification with TCK and compatible implementation * Marketing-only document
- **label:** process
- **text:** microprofile platform release reach a decision on status of microprofile platform specification with tck and compatible implementation marketing only document
- **binary_label:** 1
**Row 8,844** (id 11,949,421,602)

- **type:** IssuesEvent
- **created_at:** 2020-04-03 13:37:53
- **repo:** peopledoc/procrastinate
- **repo_url:** https://api.github.com/repos/peopledoc/procrastinate
- **action:** reopened
- **title:** DeprecationWarning with Python 3.8
- **labels:** Type: Process
- **body:** While running the tests on Python 3.8 we get a `DeprecationWarning`:

  ```
  .../procrastinate/lib/python3.8/site-packages/aiopg/connection.py:90: DeprecationWarning: The loop argument is deprecated since Python 3.8, and scheduled for removal in Python 3.10.
  ```

  We get this warning several times.
- **index:** 1.0
- **text_combine:** DeprecationWarning with Python 3.8 - While running the tests on Python 3.8 we get a `DeprecationWarning`: .../procrastinate/lib/python3.8/site-packages/aiopg/connection.py:90: DeprecationWarning: The loop argument is deprecated since Python 3.8, and scheduled for removal in Python 3.10. We get this warning several times.
- **label:** process
- **text:** deprecationwarning with python while running the tests on python we get a deprecationwarning procrastinate lib site packages aiopg connection py deprecationwarning the loop argument is deprecated since python and scheduled for removal in python we get this warning several times
- **binary_label:** 1
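The warning in the row above comes from aiopg still passing an explicit loop argument, which CPython deprecated in 3.8 and scheduled for removal in 3.10. A common way to make such warnings actionable in a test suite (a generic Python technique, not something taken from the issue) is to escalate them to errors:

```python
import warnings

# Turn DeprecationWarning into an error so the deprecated "loop"
# argument fails the test run instead of printing several times.
warnings.filterwarnings("error", category=DeprecationWarning)

# Equivalent pytest configuration (pytest.ini):
#   [pytest]
#   filterwarnings = error::DeprecationWarning
```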
**Row 15,614** (id 19,753,059,992)

- **type:** IssuesEvent
- **created_at:** 2022-01-15 09:03:04
- **repo:** googleapis/google-cloud-ruby
- **repo_url:** https://api.github.com/repos/googleapis/google-cloud-ruby
- **action:** opened
- **title:** Your .repo-metadata.json files have a problem 🤒
- **labels:** type: process repo-metadata: lint
- **body:** You have a problem with your .repo-metadata.json files:
Result of scan 📈:
* must have required property 'library_type' in gcloud/.repo-metadata.json
* must have required property 'release_level' in gcloud/.repo-metadata.json
* must have required property 'release_level' in google-analytics-admin-v1alpha/.repo-metadata.json
* api_shortname field missing from google-analytics-admin-v1alpha/.repo-metadata.json
* must have required property 'release_level' in google-analytics-admin/.repo-metadata.json
* api_shortname field missing from google-analytics-admin/.repo-metadata.json
* must have required property 'release_level' in google-analytics-data-v1alpha/.repo-metadata.json
* api_shortname field missing from google-analytics-data-v1alpha/.repo-metadata.json
* must have required property 'release_level' in google-analytics-data-v1beta/.repo-metadata.json
* api_shortname field missing from google-analytics-data-v1beta/.repo-metadata.json
* must have required property 'release_level' in google-analytics-data/.repo-metadata.json
* api_shortname field missing from google-analytics-data/.repo-metadata.json
* must have required property 'release_level' in google-area120-tables-v1alpha1/.repo-metadata.json
* api_shortname field missing from google-area120-tables-v1alpha1/.repo-metadata.json
* must have required property 'release_level' in google-area120-tables/.repo-metadata.json
* api_shortname field missing from google-area120-tables/.repo-metadata.json
* must have required property 'release_level' in google-cloud-access_approval-v1/.repo-metadata.json
* api_shortname field missing from google-cloud-access_approval-v1/.repo-metadata.json
* must have required property 'release_level' in google-cloud-access_approval/.repo-metadata.json
* api_shortname field missing from google-cloud-access_approval/.repo-metadata.json
* must have required property 'release_level' in google-cloud-api_gateway-v1/.repo-metadata.json
* api_shortname field missing from google-cloud-api_gateway-v1/.repo-metadata.json
* must have required property 'release_level' in google-cloud-api_gateway/.repo-metadata.json
* api_shortname field missing from google-cloud-api_gateway/.repo-metadata.json
* must have required property 'release_level' in google-cloud-apigee_connect-v1/.repo-metadata.json
* api_shortname field missing from google-cloud-apigee_connect-v1/.repo-metadata.json
* must have required property 'release_level' in google-cloud-apigee_connect/.repo-metadata.json
* api_shortname field missing from google-cloud-apigee_connect/.repo-metadata.json
* must have required property 'release_level' in google-cloud-app_engine-v1/.repo-metadata.json
* api_shortname field missing from google-cloud-app_engine-v1/.repo-metadata.json
* must have required property 'release_level' in google-cloud-app_engine/.repo-metadata.json
* api_shortname field missing from google-cloud-app_engine/.repo-metadata.json
* must have required property 'release_level' in google-cloud-artifact_registry-v1/.repo-metadata.json
* api_shortname field missing from google-cloud-artifact_registry-v1/.repo-metadata.json
* must have required property 'release_level' in google-cloud-artifact_registry-v1beta2/.repo-metadata.json
* api_shortname field missing from google-cloud-artifact_registry-v1beta2/.repo-metadata.json
* must have required property 'release_level' in google-cloud-artifact_registry/.repo-metadata.json
* api_shortname field missing from google-cloud-artifact_registry/.repo-metadata.json
* must have required property 'release_level' in google-cloud-asset-v1/.repo-metadata.json
* api_shortname field missing from google-cloud-asset-v1/.repo-metadata.json
* must have required property 'library_type' in google-cloud-asset-v1beta1/.repo-metadata.json
* must have required property 'release_level' in google-cloud-asset-v1beta1/.repo-metadata.json
* must have required property 'release_level' in google-cloud-asset/.repo-metadata.json
* api_shortname field missing from google-cloud-asset/.repo-metadata.json
* must have required property 'release_level' in google-cloud-assured_workloads-v1/.repo-metadata.json
* api_shortname field missing from google-cloud-assured_workloads-v1/.repo-metadata.json
* must have required property 'release_level' in google-cloud-assured_workloads-v1beta1/.repo-metadata.json
* api_shortname field missing from google-cloud-assured_workloads-v1beta1/.repo-metadata.json
* must have required property 'release_level' in google-cloud-assured_workloads/.repo-metadata.json
* api_shortname field missing from google-cloud-assured_workloads/.repo-metadata.json
* must have required property 'release_level' in google-cloud-automl-v1/.repo-metadata.json
* api_shortname field missing from google-cloud-automl-v1/.repo-metadata.json
* must have required property 'release_level' in google-cloud-automl-v1beta1/.repo-metadata.json
* api_shortname field missing from google-cloud-automl-v1beta1/.repo-metadata.json
* must have required property 'release_level' in google-cloud-automl/.repo-metadata.json
* api_shortname field missing from google-cloud-automl/.repo-metadata.json
* must have required property 'release_level' in google-cloud-bigquery-connection-v1/.repo-metadata.json
* api_shortname field missing from google-cloud-bigquery-connection-v1/.repo-metadata.json
* must have required property 'release_level' in google-cloud-bigquery-connection/.repo-metadata.json
* api_shortname field missing from google-cloud-bigquery-connection/.repo-metadata.json
* must have required property 'release_level' in google-cloud-bigquery-data_transfer-v1/.repo-metadata.json
* api_shortname field missing from google-cloud-bigquery-data_transfer-v1/.repo-metadata.json
* must have required property 'release_level' in google-cloud-bigquery-data_transfer/.repo-metadata.json
* api_shortname field missing from google-cloud-bigquery-data_transfer/.repo-metadata.json
* must have required property 'release_level' in google-cloud-bigquery-reservation-v1/.repo-metadata.json
* api_shortname field missing from google-cloud-bigquery-reservation-v1/.repo-metadata.json
* must have required property 'release_level' in google-cloud-bigquery-reservation/.repo-metadata.json
* api_shortname field missing from google-cloud-bigquery-reservation/.repo-metadata.json
* must have required property 'release_level' in google-cloud-bigquery-storage-v1/.repo-metadata.json
* api_shortname field missing from google-cloud-bigquery-storage-v1/.repo-metadata.json
* must have required property 'release_level' in google-cloud-bigquery-storage/.repo-metadata.json
* api_shortname field missing from google-cloud-bigquery-storage/.repo-metadata.json
* must have required property 'release_level' in google-cloud-bigquery/.repo-metadata.json
* api_shortname field missing from google-cloud-bigquery/.repo-metadata.json
* must have required property 'release_level' in google-cloud-bigtable-admin-v2/.repo-metadata.json
* api_shortname field missing from google-cloud-bigtable-admin-v2/.repo-metadata.json
* must have required property 'release_level' in google-cloud-bigtable-v2/.repo-metadata.json
* api_shortname field missing from google-cloud-bigtable-v2/.repo-metadata.json
* must have required property 'release_level' in google-cloud-bigtable/.repo-metadata.json
* api_shortname field missing from google-cloud-bigtable/.repo-metadata.json
* must have required property 'release_level' in google-cloud-billing-budgets-v1/.repo-metadata.json
* api_shortname field missing from google-cloud-billing-budgets-v1/.repo-metadata.json
* must have required property 'release_level' in google-cloud-billing-budgets-v1beta1/.repo-metadata.json
* api_shortname field missing from google-cloud-billing-budgets-v1beta1/.repo-metadata.json
* must have required property 'release_level' in google-cloud-billing-budgets/.repo-metadata.json
* api_shortname field missing from google-cloud-billing-budgets/.repo-metadata.json
* must have required property 'release_level' in google-cloud-billing-v1/.repo-metadata.json
* api_shortname field missing from google-cloud-billing-v1/.repo-metadata.json
* must have required property 'release_level' in google-cloud-billing/.repo-metadata.json
* api_shortname field missing from google-cloud-billing/.repo-metadata.json
* must have required property 'release_level' in google-cloud-binary_authorization-v1/.repo-metadata.json
* api_shortname field missing from google-cloud-binary_authorization-v1/.repo-metadata.json
* must have required property 'release_level' in google-cloud-binary_authorization-v1beta1/.repo-metadata.json
* api_shortname field missing from google-cloud-binary_authorization-v1beta1/.repo-metadata.json
* must have required property 'release_level' in google-cloud-binary_authorization/.repo-metadata.json
* api_shortname field missing from google-cloud-binary_authorization/.repo-metadata.json
* must have required property 'release_level' in google-cloud-build-v1/.repo-metadata.json
* api_shortname field missing from google-cloud-build-v1/.repo-metadata.json
* must have required property 'release_level' in google-cloud-build/.repo-metadata.json
* api_shortname field missing from google-cloud-build/.repo-metadata.json
* must have required property 'release_level' in google-cloud-channel-v1/.repo-metadata.json
* api_shortname field missing from google-cloud-channel-v1/.repo-metadata.json
* must have required property 'release_level' in google-cloud-channel/.repo-metadata.json
* api_shortname field missing from google-cloud-channel/.repo-metadata.json
* must have required property 'release_level' in google-cloud-cloud_dms-v1/.repo-metadata.json
* api_shortname field missing from google-cloud-cloud_dms-v1/.repo-metadata.json
* must have required property 'release_level' in google-cloud-cloud_dms/.repo-metadata.json
* api_shortname field missing from google-cloud-cloud_dms/.repo-metadata.json
* must have required property 'release_level' in google-cloud-compute-v1/.repo-metadata.json
* api_shortname field missing from google-cloud-compute-v1/.repo-metadata.json
* must have required property 'release_level' in google-cloud-contact_center_insights-v1/.repo-metadata.json
* api_shortname field missing from google-cloud-contact_center_insights-v1/.repo-metadata.json
* must have required property 'release_level' in google-cloud-contact_center_insights/.repo-metadata.json
* api_shortname field missing from google-cloud-contact_center_insights/.repo-metadata.json
* must have required property 'release_level' in google-cloud-container-v1/.repo-metadata.json
* api_shortname field missing from google-cloud-container-v1/.repo-metadata.json
* must have required property 'release_level' in google-cloud-container-v1beta1/.repo-metadata.json
* api_shortname field missing from google-cloud-container-v1beta1/.repo-metadata.json
* must have required property 'release_level' in google-cloud-container/.repo-metadata.json
* api_shortname field missing from google-cloud-container/.repo-metadata.json
* must have required property 'release_level' in google-cloud-container_analysis-v1/.repo-metadata.json
* api_shortname field missing from google-cloud-container_analysis-v1/.repo-metadata.json
* must have required property 'release_level' in google-cloud-container_analysis/.repo-metadata.json
* api_shortname field missing from google-cloud-container_analysis/.repo-metadata.json
* must have required property 'release_level' in google-cloud-core/.repo-metadata.json
* must have required property 'release_level' in google-cloud-data_catalog-v1/.repo-metadata.json
* api_shortname field missing from google-cloud-data_catalog-v1/.repo-metadata.json
* must have required property 'release_level' in google-cloud-data_catalog/.repo-metadata.json
* api_shortname field missing from google-cloud-data_catalog/.repo-metadata.json
* must have required property 'release_level' in google-cloud-data_fusion-v1/.repo-metadata.json
* api_shortname field missing from google-cloud-data_fusion-v1/.repo-metadata.json
* must have required property 'release_level' in google-cloud-data_fusion/.repo-metadata.json
* api_shortname field missing from google-cloud-data_fusion/.repo-metadata.json
* must have required property 'release_level' in google-cloud-data_labeling-v1beta1/.repo-metadata.json
* api_shortname field missing from google-cloud-data_labeling-v1beta1/.repo-metadata.json
* must have required property 'release_level' in google-cloud-data_labeling/.repo-metadata.json
* api_shortname field missing from google-cloud-data_labeling/.repo-metadata.json
* must have required property 'release_level' in google-cloud-dataflow-v1beta3/.repo-metadata.json
* api_shortname field missing from google-cloud-dataflow-v1beta3/.repo-metadata.json
* must have required property 'release_level' in google-cloud-dataflow/.repo-metadata.json
* api_shortname field missing from google-cloud-dataflow/.repo-metadata.json
* must have required property 'release_level' in google-cloud-dataproc-v1/.repo-metadata.json
* api_shortname field missing from google-cloud-dataproc-v1/.repo-metadata.json
* must have required property 'release_level' in google-cloud-dataproc-v1beta2/.repo-metadata.json
* api_shortname field missing from google-cloud-dataproc-v1beta2/.repo-metadata.json
* must have required property 'release_level' in google-cloud-dataproc/.repo-metadata.json
* api_shortname field missing from google-cloud-dataproc/.repo-metadata.json
* must have required property 'release_level' in google-cloud-dataqna-v1alpha/.repo-metadata.json
* api_shortname field missing from google-cloud-dataqna-v1alpha/.repo-metadata.json
* must have required property 'release_level' in google-cloud-dataqna/.repo-metadata.json
* api_shortname field missing from google-cloud-dataqna/.repo-metadata.json
* must have required property 'release_level' in google-cloud-datastore-admin-v1/.repo-metadata.json
* api_shortname field missing from google-cloud-datastore-admin-v1/.repo-metadata.json
* must have required property 'release_level' in google-cloud-datastore-v1/.repo-metadata.json
* api_shortname field missing from google-cloud-datastore-v1/.repo-metadata.json
* must have required property 'release_level' in google-cloud-datastore/.repo-metadata.json
* api_shortname field missing from google-cloud-datastore/.repo-metadata.json
* must have required property 'release_level' in google-cloud-datastream-v1alpha1/.repo-metadata.json
* api_shortname field missing from google-cloud-datastream-v1alpha1/.repo-metadata.json
* must have required property 'release_level' in google-cloud-datastream/.repo-metadata.json
* api_shortname field missing from google-cloud-datastream/.repo-metadata.json
* must have required property 'release_level' in google-cloud-debugger-v2/.repo-metadata.json
* api_shortname field missing from google-cloud-debugger-v2/.repo-metadata.json
* must have required property 'release_level' in google-cloud-debugger/.repo-metadata.json
* must have required property 'release_level' in google-cloud-deploy-v1/.repo-metadata.json
* api_shortname field missing from google-cloud-deploy-v1/.repo-metadata.json
* must have required property 'release_level' in google-cloud-deploy/.repo-metadata.json
* api_shortname field missing from google-cloud-deploy/.repo-metadata.json
* must have required property 'release_level' in google-cloud-dialogflow-cx-v3/.repo-metadata.json
* api_shortname field missing from google-cloud-dialogflow-cx-v3/.repo-metadata.json
* must have required property 'release_level' in google-cloud-dialogflow-cx/.repo-metadata.json
* api_shortname field missing from google-cloud-dialogflow-cx/.repo-metadata.json
* must have required property 'release_level' in google-cloud-dialogflow-v2/.repo-metadata.json
* api_shortname field missing from google-cloud-dialogflow-v2/.repo-metadata.json
* must have required property 'release_level' in google-cloud-dialogflow/.repo-metadata.json
* api_shortname field missing from google-cloud-dialogflow/.repo-metadata.json
* must have required property 'release_level' in google-cloud-dlp-v2/.repo-metadata.json
* api_shortname field missing from google-cloud-dlp-v2/.repo-metadata.json
* must have required property 'release_level' in google-cloud-dlp/.repo-metadata.json
* api_shortname field missing from google-cloud-dlp/.repo-metadata.json
* must have required property 'release_level' in google-cloud-dns/.repo-metadata.json
* api_shortname field missing from google-cloud-dns/.repo-metadata.json
* must have required property 'release_level' in google-cloud-document_ai-v1/.repo-metadata.json
* api_shortname field missing from google-cloud-document_ai-v1/.repo-metadata.json
* must have required property 'release_level' in google-cloud-document_ai-v1beta3/.repo-metadata.json
* api_shortname field missing from google-cloud-document_ai-v1beta3/.repo-metadata.json
* must have required property 'release_level' in google-cloud-document_ai/.repo-metadata.json
* api_shortname field missing from google-cloud-document_ai/.repo-metadata.json
* must have required property 'release_level' in google-cloud-domains-v1beta1/.repo-metadata.json
* api_shortname field missing from google-cloud-domains-v1beta1/.repo-metadata.json
* must have required property 'release_level' in google-cloud-domains/.repo-metadata.json
* api_shortname field missing from google-cloud-domains/.repo-metadata.json
* must have required property 'release_level' in google-cloud-error_reporting-v1beta1/.repo-metadata.json
* api_shortname field missing from google-cloud-error_reporting-v1beta1/.repo-metadata.json
* must have required property 'release_level' in google-cloud-error_reporting/.repo-metadata.json
* must have required property 'release_level' in google-cloud-errors/.repo-metadata.json
* must have required property 'release_level' in google-cloud-essential_contacts-v1/.repo-metadata.json
* api_shortname field missing from google-cloud-essential_contacts-v1/.repo-metadata.json
* must have required property 'release_level' in google-cloud-essential_contacts/.repo-metadata.json
* api_shortname field missing from google-cloud-essential_contacts/.repo-metadata.json
* must have required property 'release_level' in google-cloud-eventarc-v1/.repo-metadata.json
* api_shortname field missing from google-cloud-eventarc-v1/.repo-metadata.json
* must have required property 'release_level' in google-cloud-eventarc/.repo-metadata.json
* api_shortname field missing from google-cloud-eventarc/.repo-metadata.json
* must have required property 'release_level' in google-cloud-filestore-v1/.repo-metadata.json
* api_shortname field missing from google-cloud-filestore-v1/.repo-metadata.json
* must have required property 'release_level' in google-cloud-filestore/.repo-metadata.json
* api_shortname field missing from google-cloud-filestore/.repo-metadata.json
* must have required property 'release_level' in google-cloud-firestore-admin-v1/.repo-metadata.json
* api_shortname field missing from google-cloud-firestore-admin-v1/.repo-metadata.json
* must have required property 'release_level' in google-cloud-firestore-v1/.repo-metadata.json
* api_shortname field missing from google-cloud-firestore-v1/.repo-metadata.json
* must have required property 'release_level' in google-cloud-firestore/.repo-metadata.json
* api_shortname field missing from google-cloud-firestore/.repo-metadata.json
* must have required property 'release_level' in google-cloud-functions-v1/.repo-metadata.json
* api_shortname field missing from google-cloud-functions-v1/.repo-metadata.json
* must have required property 'release_level' in google-cloud-functions/.repo-metadata.json
* api_shortname field missing from google-cloud-functions/.repo-metadata.json
* must have required property 'release_level' in google-cloud-gaming-v1/.repo-metadata.json
* api_shortname field missing from google-cloud-gaming-v1/.repo-metadata.json
* must have required property 'release_level' in google-cloud-gaming/.repo-metadata.json
* api_shortname field missing from google-cloud-gaming/.repo-metadata.json
* must have required property 'release_level' in google-cloud-gke_connect-gateway-v1beta1/.repo-metadata.json
* api_shortname field missing from google-cloud-gke_connect-gateway-v1beta1/.repo-metadata.json
* must have required property 'release_level' in google-cloud-gke_connect-gateway/.repo-metadata.json
* api_shortname field missing from google-cloud-gke_connect-gateway/.repo-metadata.json
* must have required property 'release_level' in google-cloud-gke_hub-v1/.repo-metadata.json
* api_shortname field missing from google-cloud-gke_hub-v1/.repo-metadata.json
* must have required property 'release_level' in google-cloud-gke_hub-v1beta1/.repo-metadata.json
* api_shortname field missing from google-cloud-gke_hub-v1beta1/.repo-metadata.json
* must have required property 'release_level' in google-cloud-gke_hub/.repo-metadata.json
* api_shortname field missing from google-cloud-gke_hub/.repo-metadata.json
* must have required property 'release_level' in google-cloud-iap-v1/.repo-metadata.json
* api_shortname field missing from google-cloud-iap-v1/.repo-metadata.json
* must have required property 'release_level' in google-cloud-iap/.repo-metadata.json
* api_shortname field missing from google-cloud-iap/.repo-metadata.json
* must have required property 'release_level' in google-cloud-ids-v1/.repo-metadata.json
* api_shortname field missing from google-cloud-ids-v1/.repo-metadata.json
* must have required property 'release_level' in google-cloud-ids/.repo-metadata.json
* api_shortname field missing from google-cloud-ids/.repo-metadata.json
* must have required property 'release_level' in google-cloud-iot-v1/.repo-metadata.json
* api_shortname field missing from google-cloud-iot-v1/.repo-metadata.json
* must have required property 'release_level' in google-cloud-iot/.repo-metadata.json
* api_shortname field missing from google-cloud-iot/.repo-metadata.json
* must have required property 'release_level' in google-cloud-kms-v1/.repo-metadata.json
* api_shortname field missing from google-cloud-kms-v1/.repo-metadata.json
* must have required property 'release_level' in google-cloud-kms/.repo-metadata.json
* api_shortname field missing from google-cloud-kms/.repo-metadata.json
* must have required property 'release_level' in google-cloud-language-v1/.repo-metadata.json
* api_shortname field missing from google-cloud-language-v1/.repo-metadata.json
* must have required property 'release_level' in google-cloud-language-v1beta2/.repo-metadata.json
* api_shortname field missing from google-cloud-language-v1beta2/.repo-metadata.json
* must have required property 'release_level' in google-cloud-language/.repo-metadata.json
* api_shortname field missing from google-cloud-language/.repo-metadata.json
* must have required property 'release_level' in google-cloud-life_sciences-v2beta/.repo-metadata.json
* api_shortname field missing from google-cloud-life_sciences-v2beta/.repo-metadata.json
* must have required property 'release_level' in google-cloud-life_sciences/.repo-metadata.json
* api_shortname field missing from google-cloud-life_sciences/.repo-metadata.json
* must have required property 'release_level' in google-cloud-location/.repo-metadata.json
* api_shortname field missing from google-cloud-location/.repo-metadata.json
* must have required property 'release_level' in google-cloud-logging-v2/.repo-metadata.json
* api_shortname field missing from google-cloud-logging-v2/.repo-metadata.json
* must have required property 'release_level' in google-cloud-logging/.repo-metadata.json
* api_shortname field missing from google-cloud-logging/.repo-metadata.json
* must have required property 'release_level' in google-cloud-managed_identities-v1/.repo-metadata.json
* api_shortname field missing from google-cloud-managed_identities-v1/.repo-metadata.json
* must have required property 'release_level' in google-cloud-managed_identities/.repo-metadata.json
* api_shortname field missing from google-cloud-managed_identities/.repo-metadata.json
* must have required property 'release_level' in google-cloud-media_translation-v1beta1/.repo-metadata.json
* api_shortname field missing from google-cloud-media_translation-v1beta1/.repo-metadata.json
* must have required property 'release_level' in google-cloud-media_translation/.repo-metadata.json
* api_shortname field missing from google-cloud-media_translation/.repo-metadata.json
* must have required property 'release_level' in google-cloud-memcache-v1/.repo-metadata.json
* api_shortname field missing from google-cloud-memcache-v1/.repo-metadata.json
* must have required property 'release_level' in google-cloud-memcache-v1beta2/.repo-metadata.json
* api_shortname field missing from google-cloud-memcache-v1beta2/.repo-metadata.json
* must have required property 'release_level' in google-cloud-memcache/.repo-metadata.json
* api_shortname field missing from google-cloud-memcache/.repo-metadata.json
* must have required property 'release_level' in google-cloud-metastore-v1/.repo-metadata.json
* api_shortname field missing from google-cloud-metastore-v1/.repo-metadata.json
* must have required property 'release_level' in google-cloud-metastore-v1beta/.repo-metadata.json
* api_shortname field missing from google-cloud-metastore-v1beta/.repo-metadata.json
* must have required property 'release_level' in google-cloud-metastore/.repo-metadata.json
* api_shortname field missing from google-cloud-metastore/.repo-metadata.json
* must have required property 'release_level' in google-cloud-monitoring-dashboard-v1/.repo-metadata.json
* api_shortname field missing from google-cloud-monitoring-dashboard-v1/.repo-metadata.json
* must have required property 'release_level' in google-cloud-monitoring-metrics_scope-v1/.repo-metadata.json
* api_shortname field missing from google-cloud-monitoring-metrics_scope-v1/.repo-metadata.json
* must have required property 'release_level' in google-cloud-monitoring-v3/.repo-metadata.json
* api_shortname field missing from google-cloud-monitoring-v3/.repo-metadata.json
* must have required property 'release_level' in google-cloud-monitoring/.repo-metadata.json
* api_shortname field missing from google-cloud-monitoring/.repo-metadata.json
* must have required property 'release_level' in google-cloud-network_connectivity-v1/.repo-metadata.json
* api_shortname field missing from google-cloud-network_connectivity-v1/.repo-metadata.json
* must have required property 'release_level' in google-cloud-network_connectivity-v1alpha1/.repo-metadata.json
* api_shortname field missing from google-cloud-network_connectivity-v1alpha1/.repo-metadata.json
* must have required property 'release_level' in google-cloud-network_connectivity/.repo-metadata.json
* api_shortname field missing from google-cloud-network_connectivity/.repo-metadata.json
* must have required property 'release_level' in google-cloud-network_management-v1/.repo-metadata.json
* api_shortname field missing from google-cloud-network_management-v1/.repo-metadata.json
* must have required property 'release_level' in google-cloud-network_management/.repo-metadata.json
* api_shortname field missing from google-cloud-network_management/.repo-metadata.json
* must have required property 'release_level' in google-cloud-network_security-v1beta1/.repo-metadata.json
* api_shortname field missing from google-cloud-network_security-v1beta1/.repo-metadata.json
* must have required property 'release_level' in google-cloud-network_security/.repo-metadata.json
* api_shortname field missing from google-cloud-network_security/.repo-metadata.json
* must have required property 'release_level' in google-cloud-notebooks-v1beta1/.repo-metadata.json
* api_shortname field missing from google-cloud-notebooks-v1beta1/.repo-metadata.json
* must have required property 'release_level' in google-cloud-notebooks/.repo-metadata.json
* api_shortname field missing from google-cloud-notebooks/.repo-metadata.json
* must have required property 'release_level' in google-cloud-orchestration-airflow-service-v1/.repo-metadata.json
* api_shortname field missing from google-cloud-orchestration-airflow-service-v1/.repo-metadata.json
* must have required property 'release_level' in google-cloud-orchestration-airflow-service/.repo-metadata.json
* api_shortname field missing from google-cloud-orchestration-airflow-service/.repo-metadata.json
* must have required property 'release_level' in google-cloud-org_policy-v2/.repo-metadata.json
* api_shortname field missing from google-cloud-org_policy-v2/.repo-metadata.json
* must have required property 'release_level' in google-cloud-org_policy/.repo-metadata.json
* api_shortname field missing from google-cloud-org_policy/.repo-metadata.json
* must have required property 'release_level' in google-cloud-os_config-v1/.repo-metadata.json
* api_shortname field missing from google-cloud-os_config-v1/.repo-metadata.json
* must have required property 'release_level' in google-cloud-os_config-v1alpha/.repo-metadata.json
* api_shortname field missing from google-cloud-os_config-v1alpha/.repo-metadata.json
* must have required property 'release_level' in google-cloud-os_config/.repo-metadata.json
* api_shortname field missing from google-cloud-os_config/.repo-metadata.json
* must have required property 'release_level' in google-cloud-os_login-v1/.repo-metadata.json
* api_shortname field missing from google-cloud-os_login-v1/.repo-metadata.json
* must have required property 'release_level' in google-cloud-os_login-v1beta/.repo-metadata.json
* api_shortname field missing from google-cloud-os_login-v1beta/.repo-metadata.json
* must have required property 'release_level' in google-cloud-os_login/.repo-metadata.json
* api_shortname field missing from google-cloud-os_login/.repo-metadata.json
* must have required property 'release_level' in google-cloud-phishing_protection-v1beta1/.repo-metadata.json
* api_shortname field missing from google-cloud-phishing_protection-v1beta1/.repo-metadata.json
* must have required property 'release_level' in google-cloud-phishing_protection/.repo-metadata.json
* api_shortname field missing from google-cloud-phishing_protection/.repo-metadata.json
* must have required property 'release_level' in google-cloud-policy_troubleshooter-v1/.repo-metadata.json
* api_shortname field missing from google-cloud-policy_troubleshooter-v1/.repo-metadata.json
* must have required property 'release_level' in google-cloud-policy_troubleshooter/.repo-metadata.json
* api_shortname field missing from google-cloud-policy_troubleshooter/.repo-metadata.json
* must have required property 'release_level' in google-cloud-private_catalog-v1beta1/.repo-metadata.json
* api_shortname field missing from google-cloud-private_catalog-v1beta1/.repo-metadata.json
* must have required property 'release_level' in google-cloud-private_catalog/.repo-metadata.json
* api_shortname field missing from google-cloud-private_catalog/.repo-metadata.json
* must have required property 'release_level' in google-cloud-profiler-v2/.repo-metadata.json
* api_shortname field missing from google-cloud-profiler-v2/.repo-metadata.json
* must have required property 'release_level' in google-cloud-profiler/.repo-metadata.json
* api_shortname field missing from google-cloud-profiler/.repo-metadata.json
* must have required property 'release_level' in google-cloud-pubsub-v1/.repo-metadata.json
* api_shortname field missing from google-cloud-pubsub-v1/.repo-metadata.json
* must have required property 'release_level' in google-cloud-pubsub/.repo-metadata.json
* api_shortname field missing from google-cloud-pubsub/.repo-metadata.json
* must have required property 'release_level' in google-cloud-recaptcha_enterprise-v1/.repo-metadata.json
* api_shortname field missing from google-cloud-recaptcha_enterprise-v1/.repo-metadata.json
* must have required property 'release_level' in google-cloud-recaptcha_enterprise-v1beta1/.repo-metadata.json
* api_shortname field missing from google-cloud-recaptcha_enterprise-v1beta1/.repo-metadata.json
* must have required property 'release_level' in google-cloud-recaptcha_enterprise/.repo-metadata.json
* api_shortname field missing from google-cloud-recaptcha_enterprise/.repo-metadata.json
* must have required property 'release_level' in google-cloud-recommendation_engine-v1beta1/.repo-metadata.json
* api_shortname field missing from google-cloud-recommendation_engine-v1beta1/.repo-metadata.json
* must have required property 'release_level' in google-cloud-recommendation_engine/.repo-metadata.json
* api_shortname field missing from google-cloud-recommendation_engine/.repo-metadata.json
* must have required property 'release_level' in google-cloud-recommender-v1/.repo-metadata.json
* api_shortname field missing from google-cloud-recommender-v1/.repo-metadata.json
* must have required property 'release_level' in google-cloud-recommender/.repo-metadata.json
* api_shortname field missing from google-cloud-recommender/.repo-metadata.json
* must have required property 'release_level' in google-cloud-redis-v1/.repo-metadata.json
* api_shortname field missing from google-cloud-redis-v1/.repo-metadata.json
* must have required property 'release_level' in google-cloud-redis-v1beta1/.repo-metadata.json
* api_shortname field missing from google-cloud-redis-v1beta1/.repo-metadata.json
* must have required property 'release_level' in google-cloud-redis/.repo-metadata.json
* api_shortname field missing from google-cloud-redis/.repo-metadata.json
* must have required property 'release_level' in google-cloud-resource_manager-v3/.repo-metadata.json
* api_shortname field missing from google-cloud-resource_manager-v3/.repo-metadata.json
* must have required property 'release_level' in google-cloud-resource_manager/.repo-metadata.json
* api_shortname field missing from google-cloud-resource_manager/.repo-metadata.json
* must have required property 'release_level' in google-cloud-resource_settings-v1/.repo-metadata.json
* api_shortname field missing from google-cloud-resource_settings-v1/.repo-metadata.json
* must have required property 'release_level' in google-cloud-resource_settings/.repo-metadata.json
* api_shortname field missing from google-cloud-resource_settings/.repo-metadata.json
* must have required property 'release_level' in google-cloud-retail-v2/.repo-metadata.json
* api_shortname field missing from google-cloud-retail-v2/.repo-metadata.json
* must have required property 'release_level' in google-cloud-retail/.repo-metadata.json
* api_shortname field missing from google-cloud-retail/.repo-metadata.json
* must have required property 'release_level' in google-cloud-scheduler-v1/.repo-metadata.json
* api_shortname field missing from google-cloud-scheduler-v1/.repo-metadata.json
* must have required property 'release_level' in google-cloud-scheduler-v1beta1/.repo-metadata.json
* api_shortname field missing from google-cloud-scheduler-v1beta1/.repo-metadata.json
* must have required property 'release_level' in google-cloud-scheduler/.repo-metadata.json
* api_shortname field missing from google-cloud-scheduler/.repo-metadata.json
* must have required property 'release_level' in google-cloud-secret_manager-v1/.repo-metadata.json
* api_shortname field missing from google-cloud-secret_manager-v1/.repo-metadata.json
* must have required property 'release_level' in google-cloud-secret_manager-v1beta1/.repo-metadata.json
* api_shortname field missing from google-cloud-secret_manager-v1beta1/.repo-metadata.json
* must have required property 'release_level' in google-cloud-secret_manager/.repo-metadata.json
* api_shortname field missing from google-cloud-secret_manager/.repo-metadata.json
* must have required property 'release_level' in google-cloud-security-private_ca-v1/.repo-metadata.json
* api_shortname field missing from google-cloud-security-private_ca-v1/.repo-metadata.json
* must have required property 'release_level' in google-cloud-security-private_ca-v1beta1/.repo-metadata.json
* api_shortname field missing from google-cloud-security-private_ca-v1beta1/.repo-metadata.json
* must have required property 'release_level' in google-cloud-security-private_ca/.repo-metadata.json
* api_shortname field missing from google-cloud-security-private_ca/.repo-metadata.json
* must have required property 'release_level' in google-cloud-security_center-v1/.repo-metadata.json
* api_shortname field missing from google-cloud-security_center-v1/.repo-metadata.json
* must have required property 'release_level' in google-cloud-security_center-v1p1beta1/.repo-metadata.json
* api_shortname field missing from google-cloud-security_center-v1p1beta1/.repo-metadata.json
* must have required property 'release_level' in google-cloud-security_center/.repo-metadata.json
* api_shortname field missing from google-cloud-security_center/.repo-metadata.json
* must have required property 'release_level' in google-cloud-service_control-v1/.repo-metadata.json
* api_shortname field missing from google-cloud-service_control-v1/.repo-metadata.json
* must have required property 'release_level' in google-cloud-service_control/.repo-metadata.json
* api_shortname field missing from google-cloud-service_control/.repo-metadata.json
* must have required property 'release_level' in google-cloud-service_directory-v1/.repo-metadata.json
* api_shortname field missing from google-cloud-service_directory-v1/.repo-metadata.json
* must have required property 'release_level' in google-cloud-service_directory-v1beta1/.repo-metadata.json
* api_shortname field missing from google-cloud-service_directory-v1beta1/.repo-metadata.json
* must have required property 'release_level' in google-cloud-service_directory/.repo-metadata.json
* api_shortname field missing from google-cloud-service_directory/.repo-metadata.json
* must have required property 'release_level' in google-cloud-service_management-v1/.repo-metadata.json
* api_shortname field missing from google-cloud-service_management-v1/.repo-metadata.json
* must have required property 'release_level' in google-cloud-service_management/.repo-metadata.json
* api_shortname field missing from google-cloud-service_management/.repo-metadata.json
* must have required property 'release_level' in google-cloud-service_usage-v1/.repo-metadata.json
* api_shortname field missing from google-cloud-service_usage-v1/.repo-metadata.json
* must have required property 'release_level' in google-cloud-service_usage/.repo-metadata.json
* api_shortname field missing from google-cloud-service_usage/.repo-metadata.json
* must have required property 'release_level' in google-cloud-shell-v1/.repo-metadata.json
* api_shortname field missing from google-cloud-shell-v1/.repo-metadata.json
* must have required property 'release_level' in google-cloud-shell/.repo-metadata.json
* api_shortname field missing from google-cloud-shell/.repo-metadata.json
* must have required property 'release_level' in google-cloud-spanner-admin-database-v1/.repo-metadata.json
* api_shortname field missing from google-cloud-spanner-admin-database-v1/.repo-metadata.json
* must have required property 'release_level' in google-cloud-spanner-admin-instance-v1/.repo-metadata.json
* api_shortname field missing from google-cloud-spanner-admin-instance-v1/.repo-metadata.json
* must have required property 'release_level' in google-cloud-spanner-v1/.repo-metadata.json
* api_shortname field missing from google-cloud-spanner-v1/.repo-metadata.json
* must have required property 'release_level' in google-cloud-spanner/.repo-metadata.json
* api_shortname field missing from google-cloud-spanner/.repo-metadata.json
* must have required property 'release_level' in google-cloud-speech-v1/.repo-metadata.json
* api_shortname field missing from google-cloud-speech-v1/.repo-metadata.json
* must have required property 'release_level' in google-cloud-speech-v1p1beta1/.repo-metadata.json
* api_shortname field missing from google-cloud-speech-v1p1beta1/.repo-metadata.json
* must have required property 'release_level' in google-cloud-speech/.repo-metadata.json
* api_shortname field missing from google-cloud-speech/.repo-metadata.json
* must have required property 'release_level' in google-cloud-storage/.repo-metadata.json
* api_shortname field missing from google-cloud-storage/.repo-metadata.json
* must have required property 'release_level' in google-cloud-storage_transfer-v1/.repo-metadata.json
* api_shortname field missing from google-cloud-storage_transfer-v1/.repo-metadata.json
* must have required property 'release_level' in google-cloud-storage_transfer/.repo-metadata.json
* api_shortname field missing from google-cloud-storage_transfer/.repo-metadata.json
* must have required property 'release_level' in google-cloud-talent-v4/.repo-metadata.json
* api_shortname field missing from google-cloud-talent-v4/.repo-metadata.json
* must have required property 'release_level' in google-cloud-talent-v4beta1/.repo-metadata.json
* api_shortname field missing from google-cloud-talent-v4beta1/.repo-metadata.json
* must have required property 'release_level' in google-cloud-talent/.repo-metadata.json
* api_shortname field missing from google-cloud-talent/.repo-metadata.json
* must have required property 'release_level' in google-cloud-tasks-v2/.repo-metadata.json
* api_shortname field missing from google-cloud-tasks-v2/.repo-metadata.json
* must have required property 'release_level' in google-cloud-tasks-v2beta2/.repo-metadata.json
* api_shortname field missing from google-cloud-tasks-v2beta2/.repo-metadata.json
* must have required property 'release_level' in google-cloud-tasks-v2beta3/.repo-metadata.json
* api_shortname field missing from google-cloud-tasks-v2beta3/.repo-metadata.json
* must have required property 'release_level' in google-cloud-tasks/.repo-metadata.json
* api_shortname field missing from google-cloud-tasks/.repo-metadata.json
* must have required property 'release_level' in google-cloud-text_to_speech-v1/.repo-metadata.json
* api_shortname field missing from google-cloud-text_to_speech-v1/.repo-metadata.json
* must have required property 'release_level' in google-cloud-text_to_speech-v1beta1/.repo-metadata.json
* api_shortname field missing from google-cloud-text_to_speech-v1beta1/.repo-metadata.json
* must have required property 'release_level' in google-cloud-text_to_speech/.repo-metadata.json
* api_shortname field missing from google-cloud-text_to_speech/.repo-metadata.json
* must have required property 'release_level' in google-cloud-tpu-v1/.repo-metadata.json
* api_shortname field missing from google-cloud-tpu-v1/.repo-metadata.json
* must have required property 'release_level' in google-cloud-tpu/.repo-metadata.json
* api_shortname field missing from google-cloud-tpu/.repo-metadata.json
* must have required property 'release_level' in google-cloud-trace-v1/.repo-metadata.json
* api_shortname field missing from google-cloud-trace-v1/.repo-metadata.json
* must have required property 'release_level' in google-cloud-trace-v2/.repo-metadata.json
* api_shortname field missing from google-cloud-trace-v2/.repo-metadata.json
* must have required property 'release_level' in google-cloud-trace/.repo-metadata.json
* must have required property 'release_level' in google-cloud-translate-v2/.repo-metadata.json
* api_shortname field missing from google-cloud-translate-v2/.repo-metadata.json
* must have required property 'release_level' in google-cloud-translate-v3/.repo-metadata.json
* api_shortname field missing from google-cloud-translate-v3/.repo-metadata.json
* must have required property 'release_level' in google-cloud-translate/.repo-metadata.json
* api_shortname field missing from google-cloud-translate/.repo-metadata.json
* must have required property 'release_level' in google-cloud-video-transcoder-v1/.repo-metadata.json
* api_shortname field missing from google-cloud-video-transcoder-v1/.repo-metadata.json
* must have required property 'release_level' in google-cloud-video-transcoder-v1beta1/.repo-metadata.json
* api_shortname field missing from google-cloud-video-transcoder-v1beta1/.repo-metadata.json
* must have required property 'release_level' in google-cloud-video-transcoder/.repo-metadata.json
* api_shortname field missing from google-cloud-video-transcoder/.repo-metadata.json
* must have required property 'release_level' in google-cloud-video_intelligence-v1/.repo-metadata.json
* api_shortname field missing from google-cloud-video_intelligence-v1/.repo-metadata.json
* must have required property 'release_level' in google-cloud-video_intelligence-v1beta2/.repo-metadata.json
* api_shortname field missing from google-cloud-video_intelligence-v1beta2/.repo-metadata.json
* must have required property 'release_level' in google-cloud-video_intelligence-v1p1beta1/.repo-metadata.json
* api_shortname field missing from google-cloud-video_intelligence-v1p1beta1/.repo-metadata.json
* must have required property 'release_level' in google-cloud-video_intelligence-v1p2beta1/.repo-metadata.json
* api_shortname field missing from google-cloud-video_intelligence-v1p2beta1/.repo-metadata.json
* must have required property 'release_level' in google-cloud-video_intelligence-v1p3beta1/.repo-metadata.json
* api_shortname field missing from google-cloud-video_intelligence-v1p3beta1/.repo-metadata.json
* must have required property 'release_level' in google-cloud-video_intelligence/.repo-metadata.json
* api_shortname field missing from google-cloud-video_intelligence/.repo-metadata.json
* must have required property 'release_level' in google-cloud-vision-v1/.repo-metadata.json
* api_shortname field missing from google-cloud-vision-v1/.repo-metadata.json
* must have required property 'release_level' in google-cloud-vision-v1p3beta1/.repo-metadata.json
* api_shortname field missing from google-cloud-vision-v1p3beta1/.repo-metadata.json
* must have required property 'release_level' in google-cloud-vision-v1p4beta1/.repo-metadata.json
* api_shortname field missing from google-cloud-vision-v1p4beta1/.repo-metadata.json
* must have required property 'release_level' in google-cloud-vision/.repo-metadata.json
* api_shortname field missing from google-cloud-vision/.repo-metadata.json
* must have required property 'release_level' in google-cloud-vm_migration-v1/.repo-metadata.json
* api_shortname field missing from google-cloud-vm_migration-v1/.repo-metadata.json
* must have required property 'release_level' in google-cloud-vm_migration/.repo-metadata.json
* api_shortname field missing from google-cloud-vm_migration/.repo-metadata.json
* must have required property 'release_level' in google-cloud-vpc_access-v1/.repo-metadata.json
* api_shortname field missing from google-cloud-vpc_access-v1/.repo-metadata.json
* must have required property 'release_level' in google-cloud-vpc_access/.repo-metadata.json
* api_shortname field missing from google-cloud-vpc_access/.repo-metadata.json
* must have required property 'release_level' in google-cloud-web_risk-v1/.repo-metadata.json
* api_shortname field missing from google-cloud-web_risk-v1/.repo-metadata.json
* must have required property 'release_level' in google-cloud-web_risk-v1beta1/.repo-metadata.json
* api_shortname field missing from google-cloud-web_risk-v1beta1/.repo-metadata.json
* must have required property 'release_level' in google-cloud-web_risk/.repo-metadata.json
* api_shortname field missing from google-cloud-web_risk/.repo-metadata.json
* must have required property 'release_level' in google-cloud-web_security_scanner-v1/.repo-metadata.json
* api_shortname field missing from google-cloud-web_security_scanner-v1/.repo-metadata.json
* must have required property 'release_level' in google-cloud-web_security_scanner-v1beta/.repo-metadata.json
* api_shortname field missing from google-cloud-web_security_scanner-v1beta/.repo-metadata.json
* must have required property 'release_level' in google-cloud-web_security_scanner/.repo-metadata.json
* api_shortname field missing from google-cloud-web_security_scanner/.repo-metadata.json
* must have required property 'release_level' in google-cloud-webrisk/.repo-metadata.json
* api_shortname field missing from google-cloud-webrisk/.repo-metadata.json
* must have required property 'release_level' in google-cloud-workflows-executions-v1/.repo-metadata.json
* api_shortname field missing from google-cloud-workflows-executions-v1/.repo-metadata.json
* must have required property 'release_level' in google-cloud-workflows-executions-v1beta/.repo-metadata.json
* api_shortname field missing from google-cloud-workflows-executions-v1beta/.repo-metadata.json
* must have required property 'release_level' in google-cloud-workflows-v1/.repo-metadata.json
* api_shortname field missing from google-cloud-workflows-v1/.repo-metadata.json
* must have required property 'release_level' in google-cloud-workflows-v1beta/.repo-metadata.json
* api_shortname field missing from google-cloud-workflows-v1beta/.repo-metadata.json
* must have required property 'release_level' in google-cloud-workflows/.repo-metadata.json
* api_shortname field missing from google-cloud-workflows/.repo-metadata.json
* must have required property 'library_type' in google-cloud/.repo-metadata.json
* must have required property 'release_level' in google-cloud/.repo-metadata.json
* must have required property 'release_level' in google-iam-credentials-v1/.repo-metadata.json
* api_shortname field missing from google-iam-credentials-v1/.repo-metadata.json
* must have required property 'release_level' in google-iam-credentials/.repo-metadata.json
* api_shortname field missing from google-iam-credentials/.repo-metadata.json
* must have required property 'release_level' in google-iam-v1beta/.repo-metadata.json
* api_shortname field missing from google-iam-v1beta/.repo-metadata.json
* must have required property 'release_level' in google-identity-access_context_manager-v1/.repo-metadata.json
* api_shortname field missing from google-identity-access_context_manager-v1/.repo-metadata.json
* must have required property 'release_level' in google-identity-access_context_manager/.repo-metadata.json
* api_shortname field missing from google-identity-access_context_manager/.repo-metadata.json
* must have required property 'library_type' in grafeas-client/.repo-metadata.json
* must have required property 'release_level' in grafeas-client/.repo-metadata.json
* must have required property 'release_level' in grafeas-v1/.repo-metadata.json
* api_shortname field missing from grafeas-v1/.repo-metadata.json
* must have required property 'release_level' in grafeas/.repo-metadata.json
* api_shortname field missing from grafeas/.repo-metadata.json
* must have required property 'library_type' in stackdriver-core/.repo-metadata.json
* must have required property 'release_level' in stackdriver-core/.repo-metadata.json
* must have required property 'library_type' in stackdriver/.repo-metadata.json
* must have required property 'release_level' in stackdriver/.repo-metadata.json
☝️ Once you address these problems, you can close this issue.
### Need help?
* [Schema definition](https://github.com/googleapis/repo-automation-bots/blob/main/packages/repo-metadata-lint/src/repo-metadata-schema.json): lists valid options for each field; a minimal example file is sketched after this list.
* [API index](https://github.com/googleapis/googleapis/blob/master/api-index-v1.json): for gRPC libraries, **api_shortname** should match the subdomain of an API's **hostName**.
* Reach out to **go/github-automation** if you have any questions.
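For reference, a minimal `.repo-metadata.json` that would pass the three checks flagged above (`release_level`, `library_type`, and `api_shortname`) might look like the sketch below. All field values are hypothetical placeholders for an imaginary `google-cloud-example` library; consult the schema definition linked above for the authoritative list of valid values for `release_level` and `library_type`.

```json
{
  "name": "google-cloud-example",
  "release_level": "stable",
  "library_type": "GAPIC_AUTO",
  "api_shortname": "example"
}
```

Here `release_level` declares the library's support tier and `library_type` declares how the library is produced. For gRPC libraries, `api_shortname` identifies the underlying API and should match the subdomain of its hostName, so a hostName of `example.googleapis.com` would imply `"api_shortname": "example"`.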
|
1.0
|
process
|
repo metadata json api shortname field missing from google cloud dialogflow cx repo metadata json must have required property release level in google cloud dialogflow cx repo metadata json api shortname field missing from google cloud dialogflow cx repo metadata json must have required property release level in google cloud dialogflow repo metadata json api shortname field missing from google cloud dialogflow repo metadata json must have required property release level in google cloud dialogflow repo metadata json api shortname field missing from google cloud dialogflow repo metadata json must have required property release level in google cloud dlp repo metadata json api shortname field missing from google cloud dlp repo metadata json must have required property release level in google cloud dlp repo metadata json api shortname field missing from google cloud dlp repo metadata json must have required property release level in google cloud dns repo metadata json api shortname field missing from google cloud dns repo metadata json must have required property release level in google cloud document ai repo metadata json api shortname field missing from google cloud document ai repo metadata json must have required property release level in google cloud document ai repo metadata json api shortname field missing from google cloud document ai repo metadata json must have required property release level in google cloud document ai repo metadata json api shortname field missing from google cloud document ai repo metadata json must have required property release level in google cloud domains repo metadata json api shortname field missing from google cloud domains repo metadata json must have required property release level in google cloud domains repo metadata json api shortname field missing from google cloud domains repo metadata json must have required property release level in google cloud error reporting repo metadata json api shortname field missing from google cloud error reporting repo metadata json must have required property release level in google cloud error reporting repo metadata json must have required property release level in google cloud errors repo metadata json must have required property release level in google cloud essential contacts repo metadata json api shortname field missing from google cloud essential contacts repo metadata json must have required property release level in google cloud essential contacts repo metadata json api shortname field missing from google cloud essential contacts repo metadata json must have required property release level in google cloud eventarc repo metadata json api shortname field missing from google cloud eventarc repo metadata json must have required property release level in google cloud eventarc repo metadata json api shortname field missing from google cloud eventarc repo metadata json must have required property release level in google cloud filestore repo metadata json api shortname field missing from google cloud filestore repo metadata json must have required property release level in google cloud filestore repo metadata json api shortname field missing from google cloud filestore repo metadata json must have required property release level in google cloud firestore admin repo metadata json api shortname field missing from google cloud firestore admin repo metadata json must have required property release level in google cloud firestore repo metadata json api shortname field missing from google cloud firestore repo metadata json 
must have required property release level in google cloud firestore repo metadata json api shortname field missing from google cloud firestore repo metadata json must have required property release level in google cloud functions repo metadata json api shortname field missing from google cloud functions repo metadata json must have required property release level in google cloud functions repo metadata json api shortname field missing from google cloud functions repo metadata json must have required property release level in google cloud gaming repo metadata json api shortname field missing from google cloud gaming repo metadata json must have required property release level in google cloud gaming repo metadata json api shortname field missing from google cloud gaming repo metadata json must have required property release level in google cloud gke connect gateway repo metadata json api shortname field missing from google cloud gke connect gateway repo metadata json must have required property release level in google cloud gke connect gateway repo metadata json api shortname field missing from google cloud gke connect gateway repo metadata json must have required property release level in google cloud gke hub repo metadata json api shortname field missing from google cloud gke hub repo metadata json must have required property release level in google cloud gke hub repo metadata json api shortname field missing from google cloud gke hub repo metadata json must have required property release level in google cloud gke hub repo metadata json api shortname field missing from google cloud gke hub repo metadata json must have required property release level in google cloud iap repo metadata json api shortname field missing from google cloud iap repo metadata json must have required property release level in google cloud iap repo metadata json api shortname field missing from google cloud iap repo metadata json must have required property release level in google cloud ids repo metadata json api shortname field missing from google cloud ids repo metadata json must have required property release level in google cloud ids repo metadata json api shortname field missing from google cloud ids repo metadata json must have required property release level in google cloud iot repo metadata json api shortname field missing from google cloud iot repo metadata json must have required property release level in google cloud iot repo metadata json api shortname field missing from google cloud iot repo metadata json must have required property release level in google cloud kms repo metadata json api shortname field missing from google cloud kms repo metadata json must have required property release level in google cloud kms repo metadata json api shortname field missing from google cloud kms repo metadata json must have required property release level in google cloud language repo metadata json api shortname field missing from google cloud language repo metadata json must have required property release level in google cloud language repo metadata json api shortname field missing from google cloud language repo metadata json must have required property release level in google cloud language repo metadata json api shortname field missing from google cloud language repo metadata json must have required property release level in google cloud life sciences repo metadata json api shortname field missing from google cloud life sciences repo metadata json must have required property release level in google cloud life 
sciences repo metadata json api shortname field missing from google cloud life sciences repo metadata json must have required property release level in google cloud location repo metadata json api shortname field missing from google cloud location repo metadata json must have required property release level in google cloud logging repo metadata json api shortname field missing from google cloud logging repo metadata json must have required property release level in google cloud logging repo metadata json api shortname field missing from google cloud logging repo metadata json must have required property release level in google cloud managed identities repo metadata json api shortname field missing from google cloud managed identities repo metadata json must have required property release level in google cloud managed identities repo metadata json api shortname field missing from google cloud managed identities repo metadata json must have required property release level in google cloud media translation repo metadata json api shortname field missing from google cloud media translation repo metadata json must have required property release level in google cloud media translation repo metadata json api shortname field missing from google cloud media translation repo metadata json must have required property release level in google cloud memcache repo metadata json api shortname field missing from google cloud memcache repo metadata json must have required property release level in google cloud memcache repo metadata json api shortname field missing from google cloud memcache repo metadata json must have required property release level in google cloud memcache repo metadata json api shortname field missing from google cloud memcache repo metadata json must have required property release level in google cloud metastore repo metadata json api shortname field missing from google cloud metastore repo metadata json must have required property release level in google cloud metastore repo metadata json api shortname field missing from google cloud metastore repo metadata json must have required property release level in google cloud metastore repo metadata json api shortname field missing from google cloud metastore repo metadata json must have required property release level in google cloud monitoring dashboard repo metadata json api shortname field missing from google cloud monitoring dashboard repo metadata json must have required property release level in google cloud monitoring metrics scope repo metadata json api shortname field missing from google cloud monitoring metrics scope repo metadata json must have required property release level in google cloud monitoring repo metadata json api shortname field missing from google cloud monitoring repo metadata json must have required property release level in google cloud monitoring repo metadata json api shortname field missing from google cloud monitoring repo metadata json must have required property release level in google cloud network connectivity repo metadata json api shortname field missing from google cloud network connectivity repo metadata json must have required property release level in google cloud network connectivity repo metadata json api shortname field missing from google cloud network connectivity repo metadata json must have required property release level in google cloud network connectivity repo metadata json api shortname field missing from google cloud network connectivity repo metadata json must have required property 
release level in google cloud network management repo metadata json api shortname field missing from google cloud network management repo metadata json must have required property release level in google cloud network management repo metadata json api shortname field missing from google cloud network management repo metadata json must have required property release level in google cloud network security repo metadata json api shortname field missing from google cloud network security repo metadata json must have required property release level in google cloud network security repo metadata json api shortname field missing from google cloud network security repo metadata json must have required property release level in google cloud notebooks repo metadata json api shortname field missing from google cloud notebooks repo metadata json must have required property release level in google cloud notebooks repo metadata json api shortname field missing from google cloud notebooks repo metadata json must have required property release level in google cloud orchestration airflow service repo metadata json api shortname field missing from google cloud orchestration airflow service repo metadata json must have required property release level in google cloud orchestration airflow service repo metadata json api shortname field missing from google cloud orchestration airflow service repo metadata json must have required property release level in google cloud org policy repo metadata json api shortname field missing from google cloud org policy repo metadata json must have required property release level in google cloud org policy repo metadata json api shortname field missing from google cloud org policy repo metadata json must have required property release level in google cloud os config repo metadata json api shortname field missing from google cloud os config repo metadata json must have required property release level in google cloud os config repo metadata json api shortname field missing from google cloud os config repo metadata json must have required property release level in google cloud os config repo metadata json api shortname field missing from google cloud os config repo metadata json must have required property release level in google cloud os login repo metadata json api shortname field missing from google cloud os login repo metadata json must have required property release level in google cloud os login repo metadata json api shortname field missing from google cloud os login repo metadata json must have required property release level in google cloud os login repo metadata json api shortname field missing from google cloud os login repo metadata json must have required property release level in google cloud phishing protection repo metadata json api shortname field missing from google cloud phishing protection repo metadata json must have required property release level in google cloud phishing protection repo metadata json api shortname field missing from google cloud phishing protection repo metadata json must have required property release level in google cloud policy troubleshooter repo metadata json api shortname field missing from google cloud policy troubleshooter repo metadata json must have required property release level in google cloud policy troubleshooter repo metadata json api shortname field missing from google cloud policy troubleshooter repo metadata json must have required property release level in google cloud private catalog repo metadata json api shortname field 
missing from google cloud private catalog repo metadata json must have required property release level in google cloud private catalog repo metadata json api shortname field missing from google cloud private catalog repo metadata json must have required property release level in google cloud profiler repo metadata json api shortname field missing from google cloud profiler repo metadata json must have required property release level in google cloud profiler repo metadata json api shortname field missing from google cloud profiler repo metadata json must have required property release level in google cloud pubsub repo metadata json api shortname field missing from google cloud pubsub repo metadata json must have required property release level in google cloud pubsub repo metadata json api shortname field missing from google cloud pubsub repo metadata json must have required property release level in google cloud recaptcha enterprise repo metadata json api shortname field missing from google cloud recaptcha enterprise repo metadata json must have required property release level in google cloud recaptcha enterprise repo metadata json api shortname field missing from google cloud recaptcha enterprise repo metadata json must have required property release level in google cloud recaptcha enterprise repo metadata json api shortname field missing from google cloud recaptcha enterprise repo metadata json must have required property release level in google cloud recommendation engine repo metadata json api shortname field missing from google cloud recommendation engine repo metadata json must have required property release level in google cloud recommendation engine repo metadata json api shortname field missing from google cloud recommendation engine repo metadata json must have required property release level in google cloud recommender repo metadata json api shortname field missing from google cloud recommender repo metadata json must have required property release level in google cloud recommender repo metadata json api shortname field missing from google cloud recommender repo metadata json must have required property release level in google cloud redis repo metadata json api shortname field missing from google cloud redis repo metadata json must have required property release level in google cloud redis repo metadata json api shortname field missing from google cloud redis repo metadata json must have required property release level in google cloud redis repo metadata json api shortname field missing from google cloud redis repo metadata json must have required property release level in google cloud resource manager repo metadata json api shortname field missing from google cloud resource manager repo metadata json must have required property release level in google cloud resource manager repo metadata json api shortname field missing from google cloud resource manager repo metadata json must have required property release level in google cloud resource settings repo metadata json api shortname field missing from google cloud resource settings repo metadata json must have required property release level in google cloud resource settings repo metadata json api shortname field missing from google cloud resource settings repo metadata json must have required property release level in google cloud retail repo metadata json api shortname field missing from google cloud retail repo metadata json must have required property release level in google cloud retail repo metadata json api shortname field 
missing from google cloud retail repo metadata json must have required property release level in google cloud scheduler repo metadata json api shortname field missing from google cloud scheduler repo metadata json must have required property release level in google cloud scheduler repo metadata json api shortname field missing from google cloud scheduler repo metadata json must have required property release level in google cloud scheduler repo metadata json api shortname field missing from google cloud scheduler repo metadata json must have required property release level in google cloud secret manager repo metadata json api shortname field missing from google cloud secret manager repo metadata json must have required property release level in google cloud secret manager repo metadata json api shortname field missing from google cloud secret manager repo metadata json must have required property release level in google cloud secret manager repo metadata json api shortname field missing from google cloud secret manager repo metadata json must have required property release level in google cloud security private ca repo metadata json api shortname field missing from google cloud security private ca repo metadata json must have required property release level in google cloud security private ca repo metadata json api shortname field missing from google cloud security private ca repo metadata json must have required property release level in google cloud security private ca repo metadata json api shortname field missing from google cloud security private ca repo metadata json must have required property release level in google cloud security center repo metadata json api shortname field missing from google cloud security center repo metadata json must have required property release level in google cloud security center repo metadata json api shortname field missing from google cloud security center repo metadata json must have required property release level in google cloud security center repo metadata json api shortname field missing from google cloud security center repo metadata json must have required property release level in google cloud service control repo metadata json api shortname field missing from google cloud service control repo metadata json must have required property release level in google cloud service control repo metadata json api shortname field missing from google cloud service control repo metadata json must have required property release level in google cloud service directory repo metadata json api shortname field missing from google cloud service directory repo metadata json must have required property release level in google cloud service directory repo metadata json api shortname field missing from google cloud service directory repo metadata json must have required property release level in google cloud service directory repo metadata json api shortname field missing from google cloud service directory repo metadata json must have required property release level in google cloud service management repo metadata json api shortname field missing from google cloud service management repo metadata json must have required property release level in google cloud service management repo metadata json api shortname field missing from google cloud service management repo metadata json must have required property release level in google cloud service usage repo metadata json api shortname field missing from google cloud service usage repo metadata json must have required 
property release level in google cloud service usage repo metadata json api shortname field missing from google cloud service usage repo metadata json must have required property release level in google cloud shell repo metadata json api shortname field missing from google cloud shell repo metadata json must have required property release level in google cloud shell repo metadata json api shortname field missing from google cloud shell repo metadata json must have required property release level in google cloud spanner admin database repo metadata json api shortname field missing from google cloud spanner admin database repo metadata json must have required property release level in google cloud spanner admin instance repo metadata json api shortname field missing from google cloud spanner admin instance repo metadata json must have required property release level in google cloud spanner repo metadata json api shortname field missing from google cloud spanner repo metadata json must have required property release level in google cloud spanner repo metadata json api shortname field missing from google cloud spanner repo metadata json must have required property release level in google cloud speech repo metadata json api shortname field missing from google cloud speech repo metadata json must have required property release level in google cloud speech repo metadata json api shortname field missing from google cloud speech repo metadata json must have required property release level in google cloud speech repo metadata json api shortname field missing from google cloud speech repo metadata json must have required property release level in google cloud storage repo metadata json api shortname field missing from google cloud storage repo metadata json must have required property release level in google cloud storage transfer repo metadata json api shortname field missing from google cloud storage transfer repo metadata json must have required property release level in google cloud storage transfer repo metadata json api shortname field missing from google cloud storage transfer repo metadata json must have required property release level in google cloud talent repo metadata json api shortname field missing from google cloud talent repo metadata json must have required property release level in google cloud talent repo metadata json api shortname field missing from google cloud talent repo metadata json must have required property release level in google cloud talent repo metadata json api shortname field missing from google cloud talent repo metadata json must have required property release level in google cloud tasks repo metadata json api shortname field missing from google cloud tasks repo metadata json must have required property release level in google cloud tasks repo metadata json api shortname field missing from google cloud tasks repo metadata json must have required property release level in google cloud tasks repo metadata json api shortname field missing from google cloud tasks repo metadata json must have required property release level in google cloud tasks repo metadata json api shortname field missing from google cloud tasks repo metadata json must have required property release level in google cloud text to speech repo metadata json api shortname field missing from google cloud text to speech repo metadata json must have required property release level in google cloud text to speech repo metadata json api shortname field missing from google cloud text to speech repo metadata 
json must have required property release level in google cloud text to speech repo metadata json api shortname field missing from google cloud text to speech repo metadata json must have required property release level in google cloud tpu repo metadata json api shortname field missing from google cloud tpu repo metadata json must have required property release level in google cloud tpu repo metadata json api shortname field missing from google cloud tpu repo metadata json must have required property release level in google cloud trace repo metadata json api shortname field missing from google cloud trace repo metadata json must have required property release level in google cloud trace repo metadata json api shortname field missing from google cloud trace repo metadata json must have required property release level in google cloud trace repo metadata json must have required property release level in google cloud translate repo metadata json api shortname field missing from google cloud translate repo metadata json must have required property release level in google cloud translate repo metadata json api shortname field missing from google cloud translate repo metadata json must have required property release level in google cloud translate repo metadata json api shortname field missing from google cloud translate repo metadata json must have required property release level in google cloud video transcoder repo metadata json api shortname field missing from google cloud video transcoder repo metadata json must have required property release level in google cloud video transcoder repo metadata json api shortname field missing from google cloud video transcoder repo metadata json must have required property release level in google cloud video transcoder repo metadata json api shortname field missing from google cloud video transcoder repo metadata json must have required property release level in google cloud video intelligence repo metadata json api shortname field missing from google cloud video intelligence repo metadata json must have required property release level in google cloud video intelligence repo metadata json api shortname field missing from google cloud video intelligence repo metadata json must have required property release level in google cloud video intelligence repo metadata json api shortname field missing from google cloud video intelligence repo metadata json must have required property release level in google cloud video intelligence repo metadata json api shortname field missing from google cloud video intelligence repo metadata json must have required property release level in google cloud video intelligence repo metadata json api shortname field missing from google cloud video intelligence repo metadata json must have required property release level in google cloud video intelligence repo metadata json api shortname field missing from google cloud video intelligence repo metadata json must have required property release level in google cloud vision repo metadata json api shortname field missing from google cloud vision repo metadata json must have required property release level in google cloud vision repo metadata json api shortname field missing from google cloud vision repo metadata json must have required property release level in google cloud vision repo metadata json api shortname field missing from google cloud vision repo metadata json must have required property release level in google cloud vision repo metadata json api shortname field missing from google 
cloud vision repo metadata json must have required property release level in google cloud vm migration repo metadata json api shortname field missing from google cloud vm migration repo metadata json must have required property release level in google cloud vm migration repo metadata json api shortname field missing from google cloud vm migration repo metadata json must have required property release level in google cloud vpc access repo metadata json api shortname field missing from google cloud vpc access repo metadata json must have required property release level in google cloud vpc access repo metadata json api shortname field missing from google cloud vpc access repo metadata json must have required property release level in google cloud web risk repo metadata json api shortname field missing from google cloud web risk repo metadata json must have required property release level in google cloud web risk repo metadata json api shortname field missing from google cloud web risk repo metadata json must have required property release level in google cloud web risk repo metadata json api shortname field missing from google cloud web risk repo metadata json must have required property release level in google cloud web security scanner repo metadata json api shortname field missing from google cloud web security scanner repo metadata json must have required property release level in google cloud web security scanner repo metadata json api shortname field missing from google cloud web security scanner repo metadata json must have required property release level in google cloud web security scanner repo metadata json api shortname field missing from google cloud web security scanner repo metadata json must have required property release level in google cloud webrisk repo metadata json api shortname field missing from google cloud webrisk repo metadata json must have required property release level in google cloud workflows executions repo metadata json api shortname field missing from google cloud workflows executions repo metadata json must have required property release level in google cloud workflows executions repo metadata json api shortname field missing from google cloud workflows executions repo metadata json must have required property release level in google cloud workflows repo metadata json api shortname field missing from google cloud workflows repo metadata json must have required property release level in google cloud workflows repo metadata json api shortname field missing from google cloud workflows repo metadata json must have required property release level in google cloud workflows repo metadata json api shortname field missing from google cloud workflows repo metadata json must have required property library type in google cloud repo metadata json must have required property release level in google cloud repo metadata json must have required property release level in google iam credentials repo metadata json api shortname field missing from google iam credentials repo metadata json must have required property release level in google iam credentials repo metadata json api shortname field missing from google iam credentials repo metadata json must have required property release level in google iam repo metadata json api shortname field missing from google iam repo metadata json must have required property release level in google identity access context manager repo metadata json api shortname field missing from google identity access context manager repo metadata json must 
have required property release level in google identity access context manager repo metadata json api shortname field missing from google identity access context manager repo metadata json must have required property library type in grafeas client repo metadata json must have required property release level in grafeas client repo metadata json must have required property release level in grafeas repo metadata json api shortname field missing from grafeas repo metadata json must have required property release level in grafeas repo metadata json api shortname field missing from grafeas repo metadata json must have required property library type in stackdriver core repo metadata json must have required property release level in stackdriver core repo metadata json must have required property library type in stackdriver repo metadata json must have required property release level in stackdriver repo metadata json ☝️ once you address these problems you can close this issue need help lists valid options for each field for grpc libraries api shortname should match the subdomain of an api s hostname reach out to go github automation if you have any questions
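For reference, a minimal sketch of a `.repo-metadata.json` whose fields would satisfy the checks repeated above; the field names are taken from the error messages themselves, while the values (and valid-option strings) are illustrative assumptions, not confirmed against the linter's schema.
```python
import json

# Illustrative metadata satisfying the three repeated checks; the values are
# assumptions, only the field names come from the errors above.
metadata = {
    "api_shortname": "asset",      # should match the subdomain of the API's hostname
    "release_level": "stable",     # the missing required "release level" property
    "library_type": "GAPIC_AUTO",  # the missing required "library type" property
}
print(json.dumps(metadata, indent=2))
```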
| 1
|
17,801
| 23,728,272,985
|
IssuesEvent
|
2022-08-30 21:59:55
|
google/ground-android
|
https://api.github.com/repos/google/ground-android
|
closed
|
[UXR] Research open questions during pilot
|
type: process priority: p2
|
- [ ] Is context of view/add/edit observation clear (i.e. which layer and which point is selected), or do we need breadcrumbs, or some other affordance in the header, etc.?
- [ ] Feature Sheet: 1) Do we need to [un]lock the map when sheet is expanded/collapsed? 2) On zoom or pan, collapse the feature sheet. 3) When a different map feature is tapped, render the feature sheet with the details for that feature
|
1.0
|
[UXR] Research open questions during pilot - - [ ] Is context of view/add/edit observation clear (i.e. which layer and which point is selected), or do we need breadcrumbs, or some other affordance in the header, etc.?
- [ ] Feature Sheet: 1) Do we need to [un]lock the map when sheet is expanded/collapsed? 2) On zoom or pan, collapse the feature sheet. 3) When a different map feature is tapped, render the feature sheet with the details for that feature
|
process
|
research open questions during pilot is context of view add edit observation clear i e which layer and which point is selected or do we need breadcrumbs or some other affordance in the header etc feature sheet do we need to lock the map when sheet is expanded collapsed on zoom or pan collapse the feature sheet when a different map feature is tapped render the feature sheet with the details for that feature
| 1
|
8,740
| 11,868,427,815
|
IssuesEvent
|
2020-03-26 09:10:31
|
qgis/QGIS-Documentation
|
https://api.github.com/repos/qgis/QGIS-Documentation
|
closed
|
[FEATURE][processing] Add new algorithm "Detect Dataset Changes"
|
3.12 Automatic new feature Processing Alg
|
Original commit: https://github.com/qgis/QGIS/commit/8c8d48bb8fe82ef8654dd65233e3265f4c3e524e by nyalldawson
This algorithm compares two vector layers, and determines which features
are unchanged, added or deleted between the two. It is designed for comparing
two different versions of the same dataset.
When comparing features, the original and revised feature geometries will be
compared against each other. Depending on the Geometry Comparison Behavior setting,
the comparison will either be made using an exact comparison (where geometries must
be an exact match for each other, including the order and count of vertices) or a
topological comparison only (where geometries are considered equal if all of
their component edges overlap; e.g., lines with the same vertex locations but
opposite direction will be considered equal by this method). If the topological
comparison is selected then any z or m values present in the geometries will not
be compared.
By default, the algorithm compares all attributes from the original and revised
features. If the Attributes to Consider for Match parameter is changed, then only
the selected attributes will be compared (e.g. allowing users to ignore a timestamp
or ID field which is expected to change between the revisions).
If any features in the original or revised layers do not have an associated geometry,
then care must be taken to ensure that these features have a unique set of
attributes selected for comparison. If this condition is not met, warnings will be
raised and the resultant outputs may be misleading.
The algorithm outputs three layers, one containing all features which are considered
to be unchanged between the revisions, one containing features deleted from the
original layer which are not present in the revised layer, and one containing features
added to the revised layer which are not present in the original layer.
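A minimal PyQGIS sketch of driving this algorithm from the Processing framework follows; the algorithm id `native:detectvectorchanges` and the parameter names (`ORIGINAL`, `REVISED`, `COMPARE_ATTRIBUTES`, `MATCH_TYPE`, `UNCHANGED`, `ADDED`, `DELETED`) are assumptions inferred from the description above, so verify them with `processing.algorithmHelp()` in your QGIS build.
```python
import processing  # run inside the QGIS Python console

# Sketch only: the algorithm id and parameter names are assumed, not confirmed.
result = processing.run(
    "native:detectvectorchanges",
    {
        "ORIGINAL": "original.gpkg|layername=parcels",
        "REVISED": "revised.gpkg|layername=parcels",
        # Compare only these attributes, ignoring e.g. a timestamp field
        "COMPARE_ATTRIBUTES": ["parcel_id", "owner"],
        "MATCH_TYPE": 1,  # assumed: 0 = exact geometry match, 1 = topological
        "UNCHANGED": "memory:unchanged",
        "ADDED": "memory:added",
        "DELETED": "memory:deleted",
    },
)
print(result["ADDED"].featureCount(), "features added in the revision")
```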
|
1.0
|
[FEATURE][processing] Add new algorithm "Detect Dataset Changes" - Original commit: https://github.com/qgis/QGIS/commit/8c8d48bb8fe82ef8654dd65233e3265f4c3e524e by nyalldawson
This algorithm compares two vector layers, and determines which features
are unchanged, added or deleted between the two. It is designed for comparing
two different versions of the same dataset.
When comparing features, the original and revised feature geometries will be
compared against each other. Depending on the Geometry Comparison Behavior setting,
the comparison will either be made using an exact comparison (where geometries must
be an exact match for each other, including the order and count of vertices) or a
topological comparison only (where geometries are considered equal if all of
their component edges overlap; e.g., lines with the same vertex locations but
opposite direction will be considered equal by this method). If the topological
comparison is selected then any z or m values present in the geometries will not
be compared.
By default, the algorithm compares all attributes from the original and revised
features. If the Attributes to Consider for Match parameter is changed, then only
the selected attributes will be compared (e.g. allowing users to ignore a timestamp
or ID field which is expected to change between the revisions).
If any features in the original or revised layers do not have an associated geometry,
then care must be taken to ensure that these features have a unique set of
attributes selected for comparison. If this condition is not met, warnings will be
raised and the resultant outputs may be misleading.
The algorithm outputs three layers, one containing all features which are considered
to be unchanged between the revisions, one containing features deleted from the
original layer which are not present in the revised layer, and one containing features
added to the revised layer which are not present in the original layer.
|
process
|
add new algorithm detect dataset changes original commit by nyalldawson this algorithm compares two vector layers and determines which features are unchanged added or deleted between the two it is designed for comparing two different versions of the same dataset when comparing features the original and revised feature geometries will be compared against each other depending on the geometry comparison behavior setting the comparison will either be made using an exact comparison where geometries must be an exact match for each other including the order and count of vertices or a topological comparison only where geometries are considered equal if all of their component edges overlap e g lines with the same vertex locations but opposite direction will be considered equal by this method if the topological comparison is selected then any z or m values present in the geometries will not be compared by default the algorithm compares all attributes from the original and revised features if the attributes to consider for match parameter is changed then only the selected attributes will be compared e g allowing users to ignore a timestamp or id field which is expected to change between the revisions if any features in the original or revised layers do not have an associated geometry then care must be taken to ensure that these features have a unique set of attributes selected for comparison if this condition is not met warnings will be raised and the resultant outputs may be misleading the algorithm outputs three layers one containing all features which are considered to be unchanged between the revisions one containing features deleted from the original layer which are not present in the revised layer and one containing features added to the revised layer which are not present in the original layer
| 1
|
143,519
| 5,518,204,870
|
IssuesEvent
|
2017-03-18 06:21:31
|
Kahraymer/Runner-Game
|
https://api.github.com/repos/Kahraymer/Runner-Game
|
closed
|
Linking Levels Together/World Map Screen
|
low priority
|
Learn how to link levels together; we may need a world map for the player to navigate between levels
|
1.0
|
Linking Levels Together/World Map Screen - Learn how to link levels together; we may need a world map for the player to navigate between levels
|
non_process
|
linking levels together world map screen learn how to link levels together we may need a world map for the player to navigate between levels
| 0
|
2,118
| 4,955,641,986
|
IssuesEvent
|
2016-12-01 21:01:18
|
sysown/proxysql
|
https://api.github.com/repos/sysown/proxysql
|
opened
|
Handling of "set autocommit" optional
|
ADMIN CONNECTION POOL PROTOCOL QUERY PROCESSOR ROUTING
|
ProxySQL internally handles `set autocommit` queries, and it is also able to understand if `rollback` or `commit` queries can be filtered (for example, `rollback` and `commit` can be filtered if there is not a running transaction).
This functionality is very important for drivers that execute `set autocommit=0` when they connect, without even specifying which schema they want to use (namely, Python drivers), when there are multiple clusters connected.
Although this is very useful in advanced (sharded) setups, it complicates the setup of safe read/write split in simple setups with just one master.
For this reason, the advanced handling of `set autocommit`, `commit` and `rollback` should be optional.
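A minimal sketch of the filtering decision described above, with the proposed opt-out made explicit; the `handle_autocommit` flag is hypothetical and does not correspond to an existing ProxySQL variable.
```python
# Sketch of the described behavior; `handle_autocommit` is a hypothetical
# opt-out flag, not an existing ProxySQL admin variable.
def can_filter(query: str, in_transaction: bool, handle_autocommit: bool) -> bool:
    """Return True if the proxy may absorb the query instead of forwarding it."""
    if not handle_autocommit:
        return False  # advanced handling disabled: forward everything as-is
    q = query.strip().lower()
    if q.startswith("set autocommit"):
        return True  # track the value internally and reply OK to the client
    if q in ("commit", "rollback") and not in_transaction:
        return True  # no running transaction, so the statement is a no-op
    return False

assert can_filter("ROLLBACK", in_transaction=False, handle_autocommit=True)
assert not can_filter("ROLLBACK", in_transaction=True, handle_autocommit=True)
```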
|
1.0
|
Handling of "set autocommit" optional - ProxySQL internally handles `set autocommit` queries, and it is also able to understand if `rollback` or `commit` queries can be filtered (for example, `rollback` and `commit` can be filtered if there is not a running transaction).
This functionality is very important for drivers that execute `set autocommit=0` when they connect, without even specifying which schema they want to use (namely, Python drivers), when there are multiple clusters connected.
Although this is very useful in advanced (sharded) setups, it complicates the setup of safe read/write split in simple setups with just one master.
For this reason, the advanced handling of `set autocommit`, `commit` and `rollback` should be optional.
|
process
|
handling of set autocommit optional proxysql internally handles set autocommit queries and it is also able to understand if rollback or commit queries can be filtered for example rollback and commit can be filtered if there is not a running transaction this functionality is very important for drivers that execute set autocommit when they connect without even specifying which schema they want to use namely python drivers when there are multiple clusters connected although this is very useful in advanced sharded setups it complicates the setup of safe read write split in simple setups with just one master for this reason the advanced handling of set autocommit commit and rollback should be optional
| 1
|
131,535
| 18,298,626,393
|
IssuesEvent
|
2021-10-05 23:22:47
|
apcountryman/prototype-kicad-project-ci-cd
|
https://api.github.com/repos/apcountryman/prototype-kicad-project-ci-cd
|
closed
|
Design interactive BOM(s) generation CI job
|
priority-normal status-awaiting_approval type-feature_design
|
**Feature Design**
Add interactive HTML bill of materials (BOM) generation CI script (`ci/generate-interactive-boms`), which will create a directory (`bom`) within the project repository if it does not already exist. The directory should be the target for the [InteractiveHtmlBom generate_interactive_bom script's](https://github.com/openscopeproject/InteractiveHtmlBom/wiki/Usage#standalone-script) output.
**Feature Use Case**
This CI job will run automatically as a part of pull requests, releases, etc., and will produce a bill of materials for all printed circuit boards contained within a project repository.
**Detailed Design**
- [ ] Add [InteractiveHtmlBom](https://github.com/openscopeproject/InteractiveHtmlBom) as a submodule to this project:
```bash
git submodule add ../InteractiveHtmlBom utilities/InteractiveHtmlBom
```
- [ ] Add `ci/generate-interactive-boms` to project:
- [ ] Supported options:
- [ ] `--help`: See other project scripts
- [ ] `--version`: See other project scripts
- [ ] `function error()`: See other project scripts
- [ ] `function abort()`: See other project scripts
- [ ] `function display_help_text()`: See other project scripts
- [ ] `function display_version()`: See other project scripts
- [ ] `function main()`: See other project scripts
- [ ] `function generate_interactive_boms()`:
```bash
local layouts; mapfile -t layouts < <( git -C "$repository" ls-files '*.kicad_pcb' | xargs -r -d '\n' -I '{}' find "$repository/{}" ); readonly layouts
for layout in "${layouts[@]}"; do
if ! "$repository/utilities/InteractiveHtmlBom/InteractiveHtmlBom/generate_interactive_bom.py" "$layout"; then
abort
fi
done
```
- [ ] Add interactive BOM(s) generation (and archiving) CI job to `.github/workflows/ci.yml`:
```yaml
generate-interactive-boms:
name: Generate interactive BOM(s)
runs-on: ubuntu-20.04
steps:
- uses: actions/checkout@v2
with:
submodules: recursive
- name: Generate interactive BOM(s)
shell: bash
run: ./ci/generate-interactive-boms
- name: Archive generated interactive BOM(s)
uses: actions/upload-artifact@v2
with:
name: interactive-boms
path: interactive-boms
retention-days: 7
```
|
1.0
|
Design interactive BOM(s) generation CI job - **Feature Design**
Add interactive HTML bill of materials (BOM) generation CI script (`ci/generate-interactive-boms`), which will create a directory (`bom`) within the project repository if it does not already exist. The directory should be the target for the [InteractiveHtmlBom generate_interactive_bom script's](https://github.com/openscopeproject/InteractiveHtmlBom/wiki/Usage#standalone-script) output.
**Feature Use Case**
This CI job will run automatically as a part of pull requests, releases, etc., and will produce a bill of materials for all printed circuit boards contained within a project repository.
**Detailed Design**
- [ ] Add [InteractiveHtmlBom](https://github.com/openscopeproject/InteractiveHtmlBom) as a submodule to this project:
```bash
git submodule add ../InteractiveHtmlBom utilities/InteractiveHtmlBom
```
- [ ] Add `ci/generate-interactive-boms` to project:
- [ ] Supported options:
- [ ] `--help`: See other project scripts
- [ ] `--version`: See other project scripts
- [ ] `function error()`: See other project scripts
- [ ] `function abort()`: See other project scripts
- [ ] `function display_help_text()`: See other project scripts
- [ ] `function display_version()`: See other project scripts
- [ ] `function main()`: See other project scripts
- [ ] `function generate_interactive_boms()`:
```bash
local layouts; mapfile -t layouts < <( git -C "$repository" ls-files '*.kicad_pcb' | xargs -r -d '\n' -I '{}' find "$repository/{}" ); readonly layouts
for layout in "${layouts[@]}"; do
if ! "$repository/utilities/InteractiveHtmlBom/InteractiveHtmlBom/generate_interactive_bom.py" "$layout"; then
abort
fi
done
```
- [ ] Add interactive BOM(s) generation (and archiving) CI job to `.github/workflows/ci.yml`:
```yaml
generate-interactive-boms:
name: Generate interactive BOM(s)
runs-on: ubuntu-20.04
steps:
- uses: actions/checkout@v2
with:
submodules: recursive
- name: Generate interactive BOM(s)
shell: bash
run: ./ci/generate-interactive-boms
- name: Archive generated interactive BOM(s)
uses: actions/upload-artifact@v2
with:
name: interactive-boms
path: interactive-boms
retention-days: 7
```
|
non_process
|
design interactive bom s generation ci job feature design add interactive html bill of materials bom generation ci script ci generate interactive boms which will create a directory bom within the project repository if it does not already exist the directory should be the target for the output feature use case this ci job will run automatically as a part of pull requests releases etc and will produce a bill of materials for all printed circuit boards contained within a project repository detailed design add as a submodule to this project bash git submodule add interactivehtmlbom utilities interactivehtmlbom add ci generate interactive boms to project supported options help see other project scripts version see other project scripts function error see other project scripts function abort see other project scripts function display help text see other project scripts function display version see other project scripts function main see other project scripts function generate interactive boms bash local layouts mapfile t layouts git c repository ls files kicad pcb xargs r d n i find repository readonly layouts for layout in layouts do if repository utilities interactivehtmlbom interactivehtmlbom generate interactive bom py layout then abort fi done add interactive bom s generation and archiving ci job to github workflows ci yml yaml generate interactive boms name generate interactive bom s runs on ubuntu steps uses actions checkout with submodules recursive name generate interactive bom s shell bash run ci generate interactive boms name archive generated interactive bom s uses actions upload artifact with name interactive boms path interactive boms retention days
| 0
|
124,134
| 16,584,226,314
|
IssuesEvent
|
2021-05-31 16:01:43
|
gitpod-io/gitpod
|
https://api.github.com/repos/gitpod-io/gitpod
|
opened
|
Allow adding and switching teams
|
needs visual design 💄 roadmap item: teams & projects
|
Users shall be able to create teams and switch the app context between their account and their teams at the top left of the dashboard. A team consists of a name that must be unique within the db (including user names) and may contain only URL fragment characters. We also need a blacklist of names that we don't allow because they conflict with pages in the dashboard (e.g. settings, workspaces, admin, etc.).
A team is reachable by its URL, which is `https://gitpod.io/<teamname>`
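A minimal validation sketch of those name constraints; the blacklist contents, the slug pattern, and the function shape are illustrative assumptions, not the actual implementation:
```python
import re

# Assumed blacklist of dashboard routes a team name must not shadow.
RESERVED_NAMES = {"settings", "workspaces", "admin"}

# Assumed conservative reading of "URL fragment characters": a lowercase slug.
TEAM_NAME_RE = re.compile(r"^[a-z0-9](?:[a-z0-9-]{0,38}[a-z0-9])?$")

def validate_team_name(name: str, existing_names: set) -> None:
    """Raise ValueError if the proposed team name is unusable."""
    if name in RESERVED_NAMES:
        raise ValueError(f"'{name}' conflicts with a dashboard page")
    if not TEAM_NAME_RE.match(name):
        raise ValueError(f"'{name}' is not a valid URL fragment")
    if name in existing_names:  # uniqueness across teams *and* user names
        raise ValueError(f"'{name}' is already taken")
```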
|
1.0
|
Allow adding and switching teams - Users shall be able to create teams and switch the app context between their account and their teams at the top left of the dashboard. A team consists of a name that must be unique within the db (including user names) and may contain only URL fragment characters. We also need a blacklist of names that we don't allow because they conflict with pages in the dashboard (e.g. settings, workspaces, admin, etc.).
A team is reachable by its URL, which is `https://gitpod.io/<teamname>`
|
non_process
|
allow adding and switching teams users shall be able to create teams and switch the app context between their account and teams on the top left of the dashboard a team consists of a name that needs to be unique within the db including user names and consists of url fragment characters we also need a blacklist of names that we don t allow because they conflict with pages in the dashboard e g settings workspaces admin etc a team is reachable by its url which is
| 0
|
823,396
| 31,018,502,968
|
IssuesEvent
|
2023-08-10 01:57:43
|
wso2/api-manager
|
https://api.github.com/repos/wso2/api-manager
|
closed
|
API Configurations have no Message Mediation tag
|
Type/Question Priority/Normal Affected/APIM-4.1.0
|
### Description
I need the Message Mediation feature, but I don't see the Message Mediation option in the API Configurations. How can I enable it?

### Affected Component
APIM
### Version
4.1.0
### Environment Details (with versions)
windows 11
jdk 1.8
### Related Issues
_No response_
### Suggested Labels
_No response_
|
1.0
|
API Configurations have no Message Mediation tag - ### Description
I need the Message Mediation feature, but I don't see the Message Mediation option in the API Configurations. How can I enable it?

### Affected Component
APIM
### Version
4.1.0
### Environment Details (with versions)
windows 11
jdk 1.8
### Related Issues
_No response_
### Suggested Labels
_No response_
|
non_process
|
api configurations have no message mediation tag description i need the message mediation feature but i don t see the message mediation option in the api configurations how can i enable it affected component apim version environment details with versions windows jdk related issues no response suggested labels no response
| 0
|
9,565
| 8,670,808,405
|
IssuesEvent
|
2018-11-29 17:23:18
|
dequelabs/Deque-University-for-Android
|
https://api.github.com/repos/dequelabs/Deque-University-for-Android
|
closed
|
Fetching the wrong screenshot.
|
Attest Incomplete Service
|
The issue is that, as part of the automated journey to uncover accessibility violations, we call the attest api for each page in the journey. After the api call we access /api/results to get the json response used to create the html report associated with the page. At the same time we access /api/screenshot.html to map to the html report created from the json. So for the journey of four pages we get the same screenshot, which is the one for the first page of the journey. I hope you understood the issue; happy to call and explain.
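For clarity, the per-page pairing we expect, as a sketch (the host, file naming, and helper are assumptions; the two endpoints are the ones named above):
```python
import json
import requests

BASE = "http://localhost:8080"  # assumed attest service address

def capture_page(page_name: str) -> None:
    """Fetch the results *and* the screenshot for the page currently under test."""
    results = requests.get(f"{BASE}/api/results").json()
    screenshot = requests.get(f"{BASE}/api/screenshot.html").content
    with open(f"{page_name}-results.json", "w") as fh:
        json.dump(results, fh)
    with open(f"{page_name}-screenshot.html", "wb") as fh:
        fh.write(screenshot)

# Called once per page of the journey, before navigating onward, so each
# report stays paired with the screenshot taken at the same moment.
```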
|
1.0
|
Fetching the wrong screenshot. - The issue is that, as part of the automated journey to uncover accessibility violations, we call the attest api for each page in the journey. After the api call we access /api/results to get the json response used to create the html report associated with the page. At the same time we access /api/screenshot.html to map to the html report created from the json. So for the journey of four pages we get the same screenshot, which is the one for the first page of the journey. I hope you understood the issue; happy to call and explain.
|
non_process
|
fetching the wrong screenshot the issue is that as a part of the automated journey to uncover accessibility violations we call the attest api for each pages in the journey after the api call we are accessing api results to get the json response to create html report associated with the page at the same time we are accessing api screenshot html to map to the html report created from the json so for the journey of four pages we get the same screenshot which is for the first page of the journey i hope you understood the issue happy to call and explain
| 0
|
7,700
| 10,788,905,957
|
IssuesEvent
|
2019-11-05 10:43:39
|
home-assistant/home-assistant
|
https://api.github.com/repos/home-assistant/home-assistant
|
closed
|
Bounding box covers text annotation on some images
|
integration: image_processing
|
**Home Assistant release with the issue:**
0.101.0.dev0
**Integration:**
image_processing
**Description of problem:**
[I recently refactored](https://github.com/home-assistant/home-assistant/pull/26712) the image_processing.draw_box function but introduced an error. The box itself will sometimes cover the text annotation on some images; this is due to:
```python3
if text:
draw.text((left + line_width, abs(top - line_width)), text, fill=color)
```
Previously we had `top - 15` but now have `top - line_width`, where `line_width` is hard coded as 5 pixels. I think the issue is compounded by the fact that the box coordinates lose precision due to both down-sampling by the image_processing integrations and rounding of the box coordinates (I've been rounding to 3 places but this is not enough). The example below, using `top - 15` and rounding increased to 5 places, appears ok, but if the box were at the top of the image then the text would be outside the image.

Propose changing to `top - text_off` where `text_off = 15`. Any other suggestions? I'm a bit wary of hard-coded pixel sizes, as the visual effect will then depend on the size of the image. There is also the issue noted above of the text disappearing if the box is at the top of the image.
A related issue I did not find a solution for, is that the text size is quite small. As far as I can tell from PIL docs the solution would be to [specify a font](https://pillow.readthedocs.io/en/3.1.x/reference/ImageDraw.html?highlight=text#PIL.ImageDraw.PIL.ImageDraw.Draw.text).
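A minimal sketch of the proposed placement, assuming a `text_off` offset and a fallback for boxes that touch the top edge (the signature and font handling are illustrative, not the actual helper):
```python
from PIL import ImageDraw, ImageFont

def draw_box_with_label(draw, box, text, color="red", line_width=5, text_off=15):
    """Draw a bounding box and place its label so the box never covers it."""
    left, top, right, bottom = box
    draw.rectangle((left, top, right, bottom), outline=color, width=line_width)
    if text:
        # Label above the box, unless that would fall outside the image;
        # then drop it just inside the top edge instead.
        y = top - text_off if top - text_off >= 0 else top + line_width
        # A named TTF via ImageFont.truetype() would allow a larger size.
        draw.text((left + line_width, y), text, fill=color,
                  font=ImageFont.load_default())
```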
Tag @snowzach
|
1.0
|
Bounding box covers text annotation on some images - **Home Assistant release with the issue:**
0.101.0.dev0
**Integration:**
image_processing
**Description of problem:**
[I recently refactored](https://github.com/home-assistant/home-assistant/pull/26712) the image_processing.draw_box function but introduced an error. The box itself will sometimes cover the text annotation on some images; this is due to:
```python3
if text:
draw.text((left + line_width, abs(top - line_width)), text, fill=color)
```
Previously we had `top - 15` but now have `top - line_width`, where `line_width` is hard coded as 5 pixels. I think the issue is compounded by the fact that the box coordinates lose precision due to both down-sampling by the image_processing integrations and rounding of the box coordinates (I've been rounding to 3 places but this is not enough). The example below, using `top - 15` and rounding increased to 5 places, appears ok, but if the box were at the top of the image then the text would be outside the image.

Propose changing to `top - text_off` where `text_off = 15`. Any other suggestions? I'm a bit wary of hard-coded pixel sizes, as the visual effect will then depend on the size of the image. There is also the issue noted above of the text disappearing if the box is at the top of the image.
A related issue I did not find a solution for, is that the text size is quite small. As far as I can tell from PIL docs the solution would be to [specify a font](https://pillow.readthedocs.io/en/3.1.x/reference/ImageDraw.html?highlight=text#PIL.ImageDraw.PIL.ImageDraw.Draw.text).
Tag @snowzach
|
process
|
bounding box covers text annotation on some images home assistant release with the issue integration image processing description of problem the image processing draw box function but introduced an error the box itself will sometimes cover the text annotation on some images this is due to if text draw text left line width abs top line width text fill color previously we had top but now have top line width where the line width is hard coded as pixels i think the issue is compounded by the fact that the box coordinates loose precision due to both down sampling by the image processing integrations and rounding of the box coordinates i ve been rounding to places but this is not enough example below using top and increased rounding to places appears ok but if the box were at the top of the image then the text would be outside of the image propose changing to top text off where text off any other suggestions i m a bit wary of hard coded pixel sizes as the visual effect will then depend on the size of the image also the above issue about text disappearing if box is at top of image a related issue i did not find a solution for is that the text size is quite small as far as i can tell from pil docs the solution would be to tag snowzach
| 1
|
9,210
| 12,239,964,833
|
IssuesEvent
|
2020-05-04 22:54:26
|
bridgetownrb/bridgetown
|
https://api.github.com/repos/bridgetownrb/bridgetown
|
opened
|
Refactor long class lengths
|
process
|
Currently Rubocop is configured to ignore class lengths > 240 lines for the following files:
- document.rb
- site.rb
- commands/serve.rb
- configuration.rb
It should come as no surprise that I hate every time I have to wade into one of those files. :)
Definitely need to refactor those into multiple concerns and/or supporting POROs, starting with Site.
|
1.0
|
Refactor long class lengths - Currently Rubocop is configured to ignore class lengths > 240 lines for the following files:
- document.rb
- site.rb
- commands/serve.rb
- configuration.rb
It should come as no surprise that I hate every time I have to wade into one of those files. :)
Definitely need to refactor those into multiple concerns and/or supporting POROs, starting with Site.
|
process
|
refactor long class lengths currently rubocop is configured to ignore class lengths lines for the following files document rb site rb commands serve rb configuration rb it should come as no surprise that i hate every time i have to wade into one of those files definitely need to refactor those into multiple concerns and or supporting poros starting with site
| 1
|
155,308
| 19,784,625,355
|
IssuesEvent
|
2022-01-18 04:15:11
|
Seagate/cortx-rgw
|
https://api.github.com/repos/Seagate/cortx-rgw
|
opened
|
CVE-2022-0122 (Medium) detected in node-forge-0.10.0.tgz
|
security vulnerability
|
## CVE-2022-0122 - Medium Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>node-forge-0.10.0.tgz</b></p></summary>
<p>JavaScript implementations of network transports, cryptography, ciphers, PKI, message digests, and various utilities.</p>
<p>Library home page: <a href="https://registry.npmjs.org/node-forge/-/node-forge-0.10.0.tgz">https://registry.npmjs.org/node-forge/-/node-forge-0.10.0.tgz</a></p>
<p>Path to dependency file: /src/pybind/mgr/dashboard/frontend/package.json</p>
<p>Path to vulnerable library: /src/pybind/mgr/dashboard/frontend/node_modules/node-forge/package.json</p>
<p>
Dependency Hierarchy:
- build-angular-0.1102.14.tgz (Root Library)
- webpack-dev-server-3.11.2.tgz
- selfsigned-1.10.11.tgz
- :x: **node-forge-0.10.0.tgz** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/Seagate/cortx-rgw/commit/aa78617d024ccd26801e43c6980f939cf8bded5f">aa78617d024ccd26801e43c6980f939cf8bded5f</a></p>
<p>Found in base branch: <b>master</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
forge is vulnerable to URL Redirection to Untrusted Site
<p>Publish Date: 2022-01-06
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2022-0122>CVE-2022-0122</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>6.1</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: Required
- Scope: Changed
- Impact Metrics:
- Confidentiality Impact: Low
- Integrity Impact: Low
- Availability Impact: None
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://github.com/advisories/GHSA-gf8q-jrpm-jvxq">https://github.com/advisories/GHSA-gf8q-jrpm-jvxq</a></p>
<p>Release Date: 2022-01-06</p>
<p>Fix Resolution: node-forge - 1.0.0</p>
</p>
</details>
<p></p>
<!-- <REMEDIATE>{"isOpenPROnVulnerability":true,"isPackageBased":true,"isDefaultBranch":true,"packages":[{"packageType":"javascript/Node.js","packageName":"node-forge","packageVersion":"0.10.0","packageFilePaths":["/src/pybind/mgr/dashboard/frontend/package.json"],"isTransitiveDependency":true,"dependencyTree":"@angular-devkit/build-angular:0.1102.14;webpack-dev-server:3.11.2;selfsigned:1.10.11;node-forge:0.10.0","isMinimumFixVersionAvailable":true,"minimumFixVersion":"node-forge - 1.0.0","isBinary":false}],"baseBranches":["master"],"vulnerabilityIdentifier":"CVE-2022-0122","vulnerabilityDetails":"forge is vulnerable to URL Redirection to Untrusted Site","vulnerabilityUrl":"https://vuln.whitesourcesoftware.com/vulnerability/CVE-2022-0122","cvss3Severity":"medium","cvss3Score":"6.1","cvss3Metrics":{"A":"None","AC":"Low","PR":"None","S":"Changed","C":"Low","UI":"Required","AV":"Network","I":"Low"},"extraData":{}}</REMEDIATE> -->
|
True
|
CVE-2022-0122 (Medium) detected in node-forge-0.10.0.tgz - ## CVE-2022-0122 - Medium Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>node-forge-0.10.0.tgz</b></p></summary>
<p>JavaScript implementations of network transports, cryptography, ciphers, PKI, message digests, and various utilities.</p>
<p>Library home page: <a href="https://registry.npmjs.org/node-forge/-/node-forge-0.10.0.tgz">https://registry.npmjs.org/node-forge/-/node-forge-0.10.0.tgz</a></p>
<p>Path to dependency file: /src/pybind/mgr/dashboard/frontend/package.json</p>
<p>Path to vulnerable library: /src/pybind/mgr/dashboard/frontend/node_modules/node-forge/package.json</p>
<p>
Dependency Hierarchy:
- build-angular-0.1102.14.tgz (Root Library)
- webpack-dev-server-3.11.2.tgz
- selfsigned-1.10.11.tgz
- :x: **node-forge-0.10.0.tgz** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/Seagate/cortx-rgw/commit/aa78617d024ccd26801e43c6980f939cf8bded5f">aa78617d024ccd26801e43c6980f939cf8bded5f</a></p>
<p>Found in base branch: <b>master</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
forge is vulnerable to URL Redirection to Untrusted Site
<p>Publish Date: 2022-01-06
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2022-0122>CVE-2022-0122</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>6.1</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: Required
- Scope: Changed
- Impact Metrics:
- Confidentiality Impact: Low
- Integrity Impact: Low
- Availability Impact: None
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://github.com/advisories/GHSA-gf8q-jrpm-jvxq">https://github.com/advisories/GHSA-gf8q-jrpm-jvxq</a></p>
<p>Release Date: 2022-01-06</p>
<p>Fix Resolution: node-forge - 1.0.0</p>
</p>
</details>
<p></p>
<!-- <REMEDIATE>{"isOpenPROnVulnerability":true,"isPackageBased":true,"isDefaultBranch":true,"packages":[{"packageType":"javascript/Node.js","packageName":"node-forge","packageVersion":"0.10.0","packageFilePaths":["/src/pybind/mgr/dashboard/frontend/package.json"],"isTransitiveDependency":true,"dependencyTree":"@angular-devkit/build-angular:0.1102.14;webpack-dev-server:3.11.2;selfsigned:1.10.11;node-forge:0.10.0","isMinimumFixVersionAvailable":true,"minimumFixVersion":"node-forge - 1.0.0","isBinary":false}],"baseBranches":["master"],"vulnerabilityIdentifier":"CVE-2022-0122","vulnerabilityDetails":"forge is vulnerable to URL Redirection to Untrusted Site","vulnerabilityUrl":"https://vuln.whitesourcesoftware.com/vulnerability/CVE-2022-0122","cvss3Severity":"medium","cvss3Score":"6.1","cvss3Metrics":{"A":"None","AC":"Low","PR":"None","S":"Changed","C":"Low","UI":"Required","AV":"Network","I":"Low"},"extraData":{}}</REMEDIATE> -->
|
non_process
|
cve medium detected in node forge tgz cve medium severity vulnerability vulnerable library node forge tgz javascript implementations of network transports cryptography ciphers pki message digests and various utilities library home page a href path to dependency file src pybind mgr dashboard frontend package json path to vulnerable library src pybind mgr dashboard frontend node modules node forge package json dependency hierarchy build angular tgz root library webpack dev server tgz selfsigned tgz x node forge tgz vulnerable library found in head commit a href found in base branch master vulnerability details forge is vulnerable to url redirection to untrusted site publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction required scope changed impact metrics confidentiality impact low integrity impact low availability impact none for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution node forge isopenpronvulnerability true ispackagebased true isdefaultbranch true packages istransitivedependency true dependencytree angular devkit build angular webpack dev server selfsigned node forge isminimumfixversionavailable true minimumfixversion node forge isbinary false basebranches vulnerabilityidentifier cve vulnerabilitydetails forge is vulnerable to url redirection to untrusted site vulnerabilityurl
| 0
|
43,952
| 9,526,234,453
|
IssuesEvent
|
2019-04-28 18:23:59
|
zowe/zlux
|
https://api.github.com/repos/zowe/zlux
|
closed
|
Support for deleting files in Code Editor
|
Code Editor help wanted
|
There's no way to delete a file in the editor currently.
|
1.0
|
Support for deleting files in Code Editor - There's no way to delete a file in the editor currently.
|
non_process
|
support for deleting files in code editor there s no way to delete a file in the editor currently
| 0
|
11,223
| 14,003,965,814
|
IssuesEvent
|
2020-10-28 16:30:41
|
pacificclimate/climate-explorer-data-prep
|
https://api.github.com/repos/pacificclimate/climate-explorer-data-prep
|
opened
|
use all-Canada data for p2a
|
process new data update existing data
|
Plan2Adapt has two derived datasets that are not yet using all-Canada data:
* Historical PRSN is BC only (made from ANUSPLIN tasmax, tasmin, and pr)
* Projected FFD is BC only (made from CLIMDEX frost free days)
They will need to be re-created from all-Canada data to match the rest of plan2adapt's data.
|
1.0
|
use all-Canada data for p2a - Plan2Adapt has two derived datasets that are not yet using all-Canada data:
* Historical PRSN is BC only (made from ANUSPLIN tasmax, tasmin, and pr)
* Projected FFD is BC only (made from CLIMDEX frost free days)
They will need to be re-created from all-Canada data to match the rest of plan2adapt's data.
|
process
|
use all canada data for has two derived datasets that are not yet using all canada data historical prsn is bc only made from anusplin tasmax tasmin and pr projected ffd is bc only made from climdex frost free days they will need to be re created from all canada data to match the rest of s data
| 1
|
344,085
| 24,797,392,878
|
IssuesEvent
|
2022-10-24 18:30:30
|
bridgetownrb/bridgetown
|
https://api.github.com/repos/bridgetownrb/bridgetown
|
opened
|
docs: switch from `master` to `main` when relevant in the documentation
|
documentation
|
<!-- Thanks for taking the time to open an issue and help us make Bridgetown better! -->
## Motivation
<!-- Why should we update our docs? -->
As specified multiple times in the CHANGELOG, GitHub and GitLab now use `main` as the name of the default branch. However, I see numerous places where the `master` branch is still referenced (for instance in the deployment documentation: https://www.bridgetownrb.com/docs/deployment)
## Suggestion
Replace with `main` whenever possible. I can make a PR for it, since it doesn't seem to be a huge change.
<!-- Thanks for taking the time to open an issue and help us make Bridgetown better! -->
|
1.0
|
docs: switch from `master` to `main` when relevant in the documentation - <!-- Thanks for taking the time to open an issue and help us make Bridgetown better! -->
## Motivation
<!-- Why should we update our docs? -->
As specified multiple times in the CHANGELOG, GitHub and GitLab now use `main` as the name of the default branch. However, I see numerous places where the `master` branch is still referenced (for instance in the deployment documentation: https://www.bridgetownrb.com/docs/deployment)
## Suggestion
Replace with `main` whenever possible. I can make a PR for it, since it doesn't seem to be a huge change.
<!-- Thanks for taking the time to open an issue and help us make Bridgetown better! -->
|
non_process
|
docs switch from master to main when relevant in the documentation motivation as it is specified multiple time in the changelog github and gitlab are using now main as the name of the default branch however i see numerous times that the master branch is still quoted as instance in deployment documentation suggestion replace with main whenever possible i can make a pr about it since it doesn t seem to be huge as a change
| 0
|
15,438
| 19,652,978,584
|
IssuesEvent
|
2022-01-10 09:31:21
|
metabase/metabase
|
https://api.github.com/repos/metabase/metabase
|
reopened
|
Fingerprinting queries can be very heavy on some databases
|
Type:Bug Priority:P1 .Performance Querying/Processor Administration/Metadata & Sync
|
Hi community, we recently deployed Metabase in our org infrastructure, but we noticed that when we onboarded bigquery tables on Metabase, it fired some really expensive queries, which ultimately resulted in a rise in our bigquery cost.
Below is the sample query:
```
-- Metabase
SELECT `source`.`substring697666` AS `substring697666`, substring(`backuptxns`.`hashAadhaar`, 1, 1234) AS `substring697694`, `backuptxns`.`request_date_time` AS `request_date_time`, substring(`backuptxns`.`uidi_ref_no`, 1, 1234) AS `substring697695` FROM `AEPS`.`backuptxns`) `source` LIMIT 10000
```
So it queried the table and limited the result to 10000 rows, but the table contains 20 million records, so bigquery scans the whole table before applying the limit. Overall, the process scanned around 500 GB of data.
We are using the following third-party driver:
https://github.com/MarkMacArdle/metabase-bigquery/releases
We are using the below Metabase version:
v1.41.4
Suggest a way to optimize the queries triggered by Metabase.
|
1.0
|
Fingerprinting queries can be very heavy on some databases - Hi community, we recently deployed Metabase in our org infrastructure, but we noticed that when we onboarded bigquery tables on Metabase, it fired some really expensive queries, which ultimately resulted in a rise in our bigquery cost.
Below is the sample query:
```
-- Metabase
SELECT `source`.`substring697666` AS `substring697666`, substring(`backuptxns`.`hashAadhaar`, 1, 1234) AS `substring697694`, `backuptxns`.`request_date_time` AS `request_date_time`, substring(`backuptxns`.`uidi_ref_no`, 1, 1234) AS `substring697695` FROM `AEPS`.`backuptxns`) `source` LIMIT 10000
```
So it queried the table and limited the result to 10000 rows, but the table contains 20 million records, so bigquery scans the whole table before applying the limit. Overall, the process scanned around 500 GB of data.
We are using the following third-party driver:
https://github.com/MarkMacArdle/metabase-bigquery/releases
We are using the below Metabase version:
v1.41.4
Suggest a way to optimize the queries triggered by Metabase.
|
process
|
fingerprinting queries can be very heavy on some databases hi community we had recently deployed metabase in our org infrastructure but we noticed that when we onboarded bigquery tables on metabase it had fired some really expensive queries and that ultimately resulted in rising in our bigquery cost below is the sample query metabase select source as substring backuptxns hashaadhaar as backuptxns request date time as request date time substring backuptxns uidi ref no as from aeps backuptxns source limit so it had queried the table and limited it by but the above table contains million records so basically bigquery scans all the tables then it limits it so the overall process had queried around gb of data we are using the following third party driver we are using the below metabase version suggest a way to optimize the queries triggered by metabase
| 1
|
4,437
| 7,311,819,791
|
IssuesEvent
|
2018-02-28 18:59:27
|
kvakulo/Switcheroo
|
https://api.github.com/repos/kvakulo/Switcheroo
|
closed
|
Alt+Tab integration doesn't work for administrator owned processes
|
enhancement in process
|
1. Turn on Alt+Tab integration in Switcheroo
2. Open e.g. `cmd` with `Run as administrator`
3. Press `Alt+Tab` while having the cmd window in focus
4. The usual Windows Alt+Tab functionality appears, and not Switcheroo
Is the solution to run Switcheroo as administrator? Or is there another way?
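This is very likely UIPI at work: a medium-integrity process cannot hook or redirect input aimed at an elevated window, so running Switcheroo elevated (or shipping a uiAccess-signed binary) is the usual answer. A diagnostic sketch using documented Win32 calls; treating access-denied as "elevated" is an assumption:
```python
import ctypes
from ctypes import wintypes

user32 = ctypes.WinDLL("user32", use_last_error=True)
kernel32 = ctypes.WinDLL("kernel32", use_last_error=True)
advapi32 = ctypes.WinDLL("advapi32", use_last_error=True)

PROCESS_QUERY_LIMITED_INFORMATION = 0x1000
TOKEN_QUERY = 0x0008
TOKEN_ELEVATION = 20  # TokenElevation from TOKEN_INFORMATION_CLASS

def foreground_window_is_elevated() -> bool:
    """Best-effort check whether the focused window belongs to an elevated process."""
    pid = wintypes.DWORD()
    user32.GetWindowThreadProcessId(user32.GetForegroundWindow(), ctypes.byref(pid))
    hproc = kernel32.OpenProcess(PROCESS_QUERY_LIMITED_INFORMATION, False, pid.value)
    if not hproc:
        return True  # assumption: access denied usually means higher integrity
    try:
        htoken = wintypes.HANDLE()
        if not advapi32.OpenProcessToken(hproc, TOKEN_QUERY, ctypes.byref(htoken)):
            return True  # assumption, as above
        try:
            elevation = wintypes.DWORD()
            needed = wintypes.DWORD()
            advapi32.GetTokenInformation(htoken, TOKEN_ELEVATION,
                                         ctypes.byref(elevation),
                                         ctypes.sizeof(elevation),
                                         ctypes.byref(needed))
            return bool(elevation.value)
        finally:
            kernel32.CloseHandle(htoken)
    finally:
        kernel32.CloseHandle(hproc)
```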
|
1.0
|
Alt+Tab integration doesn't work for administrator owned processes - 1. Turn on Alt+Tab integration in Switcheroo
2. Open e.g. `cmd` with `Run as administrator`
3. Press `Alt+Tab` while having the cmd window in focus
4. The usual Windows Alt+Tab functionality appears, and not Switcheroo
Is the solution to run Switcheroo as administrator? Or is there another way?
|
process
|
alt tab integration doesn t work for administrator owned processes turn on alt tab integration in switcheroo open e g cmd with run as administrator press alt tab while having the cmd window in focus the usual windows alt tab functionality appears and not switcheroo is the solution to run switcheroo as administrator or is there another way
| 1
|
60,688
| 14,589,316,051
|
IssuesEvent
|
2020-12-19 01:23:07
|
LevyForchh/superglue
|
https://api.github.com/repos/LevyForchh/superglue
|
opened
|
CVE-2020-35491 (Medium) detected in jackson-databind-2.9.6.jar, jackson-databind-2.8.11.3.jar
|
security vulnerability
|
## CVE-2020-35491 - Medium Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Libraries - <b>jackson-databind-2.9.6.jar</b>, <b>jackson-databind-2.8.11.3.jar</b></p></summary>
<p>
<details><summary><b>jackson-databind-2.9.6.jar</b></p></summary>
<p>General data-binding functionality for Jackson: works on core streaming API</p>
<p>Library home page: <a href="http://github.com/FasterXML/jackson">http://github.com/FasterXML/jackson</a></p>
<p>Path to vulnerable library: /root/.gradle/caches/modules-2/files-2.1/com.fasterxml.jackson.core/jackson-databind/2.9.6/cfa4f316351a91bfd95cb0644c6a2c95f52db1fc/jackson-databind-2.9.6.jar,/root/.gradle/caches/modules-2/files-2.1/com.fasterxml.jackson.core/jackson-databind/2.9.6/cfa4f316351a91bfd95cb0644c6a2c95f52db1fc/jackson-databind-2.9.6.jar</p>
<p>
Dependency Hierarchy:
- logstash-logback-encoder-4.11.jar (Root Library)
- :x: **jackson-databind-2.9.6.jar** (Vulnerable Library)
</details>
<details><summary><b>jackson-databind-2.8.11.3.jar</b></p></summary>
<p>General data-binding functionality for Jackson: works on core streaming API</p>
<p>Library home page: <a href="http://github.com/FasterXML/jackson">http://github.com/FasterXML/jackson</a></p>
<p>Path to vulnerable library: /root/.gradle/caches/modules-2/files-2.1/com.fasterxml.jackson.core/jackson-databind/2.8.11.3/844df5aba5a1a56e00905b165b12bb34116ee858/jackson-databind-2.8.11.3.jar,/root/.gradle/caches/modules-2/files-2.1/com.fasterxml.jackson.core/jackson-databind/2.8.11.3/844df5aba5a1a56e00905b165b12bb34116ee858/jackson-databind-2.8.11.3.jar</p>
<p>
Dependency Hierarchy:
- logstash-logback-encoder-4.11.jar (Root Library)
- :x: **jackson-databind-2.8.11.3.jar** (Vulnerable Library)
</details>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
FasterXML jackson-databind 2.x before 2.9.10.8 mishandles the interaction between serialization gadgets and typing, related to org.apache.commons.dbcp2.datasources.SharedPoolDataSource.
<p>Publish Date: 2020-12-17
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-35491>CVE-2020-35491</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>5.6</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: High
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: Low
- Integrity Impact: Low
- Availability Impact: Low
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://github.com/FasterXML/jackson-databind/issues/2986">https://github.com/FasterXML/jackson-databind/issues/2986</a></p>
<p>Release Date: 2020-12-17</p>
<p>Fix Resolution: com.fasterxml.jackson.core:jackson-databind:2.9.10.8</p>
</p>
</details>
<p></p>
<!-- <REMEDIATE>{"isOpenPROnVulnerability":false,"isPackageBased":true,"isDefaultBranch":true,"packages":[{"packageType":"Java","groupId":"com.fasterxml.jackson.core","packageName":"jackson-databind","packageVersion":"2.9.6","isTransitiveDependency":true,"dependencyTree":"net.logstash.logback:logstash-logback-encoder:4.11;com.fasterxml.jackson.core:jackson-databind:2.9.6","isMinimumFixVersionAvailable":true,"minimumFixVersion":"com.fasterxml.jackson.core:jackson-databind:2.9.10.8"},{"packageType":"Java","groupId":"com.fasterxml.jackson.core","packageName":"jackson-databind","packageVersion":"2.8.11.3","isTransitiveDependency":true,"dependencyTree":"net.logstash.logback:logstash-logback-encoder:4.11;com.fasterxml.jackson.core:jackson-databind:2.8.11.3","isMinimumFixVersionAvailable":true,"minimumFixVersion":"com.fasterxml.jackson.core:jackson-databind:2.9.10.8"}],"vulnerabilityIdentifier":"CVE-2020-35491","vulnerabilityDetails":"FasterXML jackson-databind 2.x before 2.9.10.8 mishandles the interaction between serialization gadgets and typing, related to org.apache.commons.dbcp2.datasources.SharedPoolDataSource.","vulnerabilityUrl":"https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-35491","cvss3Severity":"medium","cvss3Score":"5.6","cvss3Metrics":{"A":"Low","AC":"High","PR":"None","S":"Unchanged","C":"Low","UI":"None","AV":"Network","I":"Low"},"extraData":{}}</REMEDIATE> -->
|
True
|
CVE-2020-35491 (Medium) detected in jackson-databind-2.9.6.jar, jackson-databind-2.8.11.3.jar - ## CVE-2020-35491 - Medium Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Libraries - <b>jackson-databind-2.9.6.jar</b>, <b>jackson-databind-2.8.11.3.jar</b></p></summary>
<p>
<details><summary><b>jackson-databind-2.9.6.jar</b></p></summary>
<p>General data-binding functionality for Jackson: works on core streaming API</p>
<p>Library home page: <a href="http://github.com/FasterXML/jackson">http://github.com/FasterXML/jackson</a></p>
<p>Path to vulnerable library: /root/.gradle/caches/modules-2/files-2.1/com.fasterxml.jackson.core/jackson-databind/2.9.6/cfa4f316351a91bfd95cb0644c6a2c95f52db1fc/jackson-databind-2.9.6.jar,/root/.gradle/caches/modules-2/files-2.1/com.fasterxml.jackson.core/jackson-databind/2.9.6/cfa4f316351a91bfd95cb0644c6a2c95f52db1fc/jackson-databind-2.9.6.jar</p>
<p>
Dependency Hierarchy:
- logstash-logback-encoder-4.11.jar (Root Library)
- :x: **jackson-databind-2.9.6.jar** (Vulnerable Library)
</details>
<details><summary><b>jackson-databind-2.8.11.3.jar</b></p></summary>
<p>General data-binding functionality for Jackson: works on core streaming API</p>
<p>Library home page: <a href="http://github.com/FasterXML/jackson">http://github.com/FasterXML/jackson</a></p>
<p>Path to vulnerable library: /root/.gradle/caches/modules-2/files-2.1/com.fasterxml.jackson.core/jackson-databind/2.8.11.3/844df5aba5a1a56e00905b165b12bb34116ee858/jackson-databind-2.8.11.3.jar,/root/.gradle/caches/modules-2/files-2.1/com.fasterxml.jackson.core/jackson-databind/2.8.11.3/844df5aba5a1a56e00905b165b12bb34116ee858/jackson-databind-2.8.11.3.jar</p>
<p>
Dependency Hierarchy:
- logstash-logback-encoder-4.11.jar (Root Library)
- :x: **jackson-databind-2.8.11.3.jar** (Vulnerable Library)
</details>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
FasterXML jackson-databind 2.x before 2.9.10.8 mishandles the interaction between serialization gadgets and typing, related to org.apache.commons.dbcp2.datasources.SharedPoolDataSource.
<p>Publish Date: 2020-12-17
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-35491>CVE-2020-35491</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>5.6</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: High
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: Low
- Integrity Impact: Low
- Availability Impact: Low
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://github.com/FasterXML/jackson-databind/issues/2986">https://github.com/FasterXML/jackson-databind/issues/2986</a></p>
<p>Release Date: 2020-12-17</p>
<p>Fix Resolution: com.fasterxml.jackson.core:jackson-databind:2.9.10.8</p>
</p>
</details>
<p></p>
<!-- <REMEDIATE>{"isOpenPROnVulnerability":false,"isPackageBased":true,"isDefaultBranch":true,"packages":[{"packageType":"Java","groupId":"com.fasterxml.jackson.core","packageName":"jackson-databind","packageVersion":"2.9.6","isTransitiveDependency":true,"dependencyTree":"net.logstash.logback:logstash-logback-encoder:4.11;com.fasterxml.jackson.core:jackson-databind:2.9.6","isMinimumFixVersionAvailable":true,"minimumFixVersion":"com.fasterxml.jackson.core:jackson-databind:2.9.10.8"},{"packageType":"Java","groupId":"com.fasterxml.jackson.core","packageName":"jackson-databind","packageVersion":"2.8.11.3","isTransitiveDependency":true,"dependencyTree":"net.logstash.logback:logstash-logback-encoder:4.11;com.fasterxml.jackson.core:jackson-databind:2.8.11.3","isMinimumFixVersionAvailable":true,"minimumFixVersion":"com.fasterxml.jackson.core:jackson-databind:2.9.10.8"}],"vulnerabilityIdentifier":"CVE-2020-35491","vulnerabilityDetails":"FasterXML jackson-databind 2.x before 2.9.10.8 mishandles the interaction between serialization gadgets and typing, related to org.apache.commons.dbcp2.datasources.SharedPoolDataSource.","vulnerabilityUrl":"https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-35491","cvss3Severity":"medium","cvss3Score":"5.6","cvss3Metrics":{"A":"Low","AC":"High","PR":"None","S":"Unchanged","C":"Low","UI":"None","AV":"Network","I":"Low"},"extraData":{}}</REMEDIATE> -->
|
non_process
|
cve medium detected in jackson databind jar jackson databind jar cve medium severity vulnerability vulnerable libraries jackson databind jar jackson databind jar jackson databind jar general data binding functionality for jackson works on core streaming api library home page a href path to vulnerable library root gradle caches modules files com fasterxml jackson core jackson databind jackson databind jar root gradle caches modules files com fasterxml jackson core jackson databind jackson databind jar dependency hierarchy logstash logback encoder jar root library x jackson databind jar vulnerable library jackson databind jar general data binding functionality for jackson works on core streaming api library home page a href path to vulnerable library root gradle caches modules files com fasterxml jackson core jackson databind jackson databind jar root gradle caches modules files com fasterxml jackson core jackson databind jackson databind jar dependency hierarchy logstash logback encoder jar root library x jackson databind jar vulnerable library vulnerability details fasterxml jackson databind x before mishandles the interaction between serialization gadgets and typing related to org apache commons datasources sharedpooldatasource publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity high privileges required none user interaction none scope unchanged impact metrics confidentiality impact low integrity impact low availability impact low for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution com fasterxml jackson core jackson databind isopenpronvulnerability false ispackagebased true isdefaultbranch true packages vulnerabilityidentifier cve vulnerabilitydetails fasterxml jackson databind x before mishandles the interaction between serialization gadgets and typing related to org apache commons datasources sharedpooldatasource vulnerabilityurl
| 0
|
16,439
| 21,317,069,211
|
IssuesEvent
|
2022-04-16 13:16:16
|
dita-ot/dita-ot
|
https://api.github.com/repos/dita-ot/dita-ot
|
closed
|
Errors caused by link to topic with xref in shortdesc
|
priority/medium preprocess preprocess2 stale
|
## Expected Behavior
When I have the following condition:
1. Topic contains `<xref>` to another topic `xref-in-shortdesc.dita`
1. The referenced topic has a short description, *and* that short description also contains an `<xref>` to `finaltarget.dita`
1. The xref inside that short description is missing link text or `<desc>`
The `topicpull` step ends up grabbing the short description from the middle file `xref-in-shortdesc.dita`, putting that in a variable, and then processing the variable. The result is that the `<xref>` to `finaltarget.dita` is processed again but in the context of that variable, which causes the processor to look for it in our XSLT directory, and ends with errors. Happens with both `preprocess` and `preprocess2`, here is the error log from `preprocess`:
```
C:\DCS\test\apiref>\dita-ot\dita-ot-3.1.2\bin\dita -i in.ditamap -f html5
[topicpull] I/O error reported by XML parser processing file:/C:/DITA-OT/dita-ot-3.1.2/xsl/preprocess/finaltarget.dita: C:\DITA-OT\dita-ot-3.1.2\xsl\preprocess\finaltarget.dita (The system cannot find the file specified.)
[topicpull] file:/C:/DCS/test/apiref/xref-in-shortdesc.dita:5:68: [DOTX031E][ERROR]: The file finaltarget.dita is not available to resolve link information.
```
This worked at some point in the past (at least, the `xref` inside shortdesc was processed to get text) but could have been broken as far back as 1.8 -- the text was pulled and no errors were thrown.
## Actual Behavior
Errors appear as shown above. `preprocess2` has the same issue:
```
C:\DCS\test\apiref>\dita-ot\dita-ot-3.1.2\bin\dita -i in.ditamap -f pdf
[topicpull] I/O error reported by XML parser processing file:/C:/DITA-OT/dita-ot-3.1.2/xsl/preprocess/dd3f451dd1448c00ddc872728d9573466227ad65.dita: C:\DITA-OT\dita-ot-3.1.2\xsl\preprocess\dd3f451dd1448c00ddc872728d9573466227ad65.dita (The system cannot find the file specified.)
[topicpull] file:/C:/DCS/test/apiref/xref-in-shortdesc.dita:5:68: [DOTX031E][ERROR]: The file dd3f451dd1448c00ddc872728d9573466227ad65.dita is not available to resolve link information.
```
## Possible Solution
Once everything is in the variable context, I'm not sure how easy it is to get away from this. Most likely, the only way around it is to update the `copy-shortdesc` mode so that when it finds an xref, it actually goes to process the file, but I'm not sure yet.
## Steps to Reproduce
Samples attached:
[xrefshortdesc.zip](https://github.com/dita-ot/dita-ot/files/2412523/xrefshortdesc.zip)
## Copy of the error message, log file or stack trace
Shown above
## Environment
* DITA-OT version: 3.1.2
* Operating system and version: _Windows_
* How did you run DITA-OT? _dita command_
* Transformation type: _html5_ and _pdf_
|
2.0
|
Errors caused by link to topic with xref in shortdesc - ## Expected Behavior
When I have the following condition:
1. Topic contains `<xref>` to another topic `xref-in-shortdesc.dita`
1. The referenced topic has a short description, *and* that short description also contains an `<xref>` to `finaltarget.dita`
1. The xref inside that short description is missing link text or `<desc>`
The `topicpull` step ends up grabbing the short description from the middle file `xref-in-shortdesc.dita`, putting that in a variable, and then processing the variable. The result is that the `<xref>` to `finaltarget.dita` is processed again but in the context of that variable, which causes the processor to look for it in our XSLT directory, and ends with errors. Happens with both `preprocess` and `preprocess2`, here is the error log from `preprocess`:
```
C:\DCS\test\apiref>\dita-ot\dita-ot-3.1.2\bin\dita -i in.ditamap -f html5
[topicpull] I/O error reported by XML parser processing file:/C:/DITA-OT/dita-ot-3.1.2/xsl/preprocess/finaltarget.dita: C:\DITA-OT\dita-ot-3.1.2\xsl\preprocess\finaltarget.dita (The system cannot find the file specified.)
[topicpull] file:/C:/DCS/test/apiref/xref-in-shortdesc.dita:5:68: [DOTX031E][ERROR]: The file finaltarget.dita is not available to resolve link information.
```
This worked at some point in the past (at least, the `xref` inside shortdesc was processed to get text) but could have been broken as far back as 1.8 -- the text was pulled and no errors were thrown.
## Actual Behavior
Errors appear as shown above. `preprocess2` has the same issue:
```
C:\DCS\test\apiref>\dita-ot\dita-ot-3.1.2\bin\dita -i in.ditamap -f pdf
[topicpull] I/O error reported by XML parser processing file:/C:/DITA-OT/dita-ot-3.1.2/xsl/preprocess/dd3f451dd1448c00ddc872728d9573466227ad65.dita: C:\DITA-OT\dita-ot-3.1.2\xsl\preprocess\dd3f451dd1448c00ddc872728d9573466227ad65.dita (The system cannot find the file specified.)
[topicpull] file:/C:/DCS/test/apiref/xref-in-shortdesc.dita:5:68: [DOTX031E][ERROR]: The file dd3f451dd1448c00ddc872728d9573466227ad65.dita is not available to resolve link information.
```
## Possible Solution
Once everything is in the variable context, I'm not sure how easy it is to get away from this. Most likely, the only way around it is to update the `copy-shortdesc` mode so that when it finds an xref, it actually goes to process the file, but I'm not sure yet.
## Steps to Reproduce
Samples attached:
[xrefshortdesc.zip](https://github.com/dita-ot/dita-ot/files/2412523/xrefshortdesc.zip)
## Copy of the error message, log file or stack trace
Shown above
## Environment
* DITA-OT version: 3.1.2
* Operating system and version: _Windows_
* How did you run DITA-OT? _dita command_
* Transformation type: _html5_ and _pdf_
|
process
|
errors caused by link to topic with xref in shortdesc expected behavior when i have the following condition topic contains to another topic xref in shortdesc dita the referenced topic has a short description and that short description also contains an to finaltarget dita the xref inside that short description is missing link text or the topicpull step ends up grabbing the short description from the middle file xref in shortdesc dita putting that in a variable and then processing the variable the result is that the to finaltarget dita is processed again but in the context of that variable which causes the processor to look for it in our xslt directory and ends with errors happens with both preprocess and here is the error log from preprocess c dcs test apiref dita ot dita ot bin dita i in ditamap f i o error reported by xml parser processing file c dita ot dita ot xsl preprocess finaltarget dita c dita ot dita ot xsl preprocess finaltarget dita the system cannot find the file specified file c dcs test apiref xref in shortdesc dita the file finaltarget dita is not available to resolve link information this worked at some point in the past at least the xref inside shortdesc was processed to get text but could have been broken as far back as the text was pulled and no errors were thrown actual behavior errors appear as shown above has the same issue c dcs test apiref dita ot dita ot bin dita i in ditamap f pdf i o error reported by xml parser processing file c dita ot dita ot xsl preprocess dita c dita ot dita ot xsl preprocess dita thesystem cannot find the file specified file c dcs test apiref xref in shortdesc dita the file dita is not available toresolve link information possible solution once everything is in the variable context i m not sure how easy it is to get away from this most likely the only way around it is to update the copy shortdesc mode so that when it finds an xref it actually goes to process the file but i m not sure yet steps to reproduce samples attached copy of the error message log file or stack trace shown above environment dita ot version operating system and version windows how did you run dita ot dita command transformation type and pdf
| 1
|
112,697
| 24,314,968,585
|
IssuesEvent
|
2022-09-30 04:55:05
|
appsmithorg/appsmith
|
https://api.github.com/repos/appsmithorg/appsmith
|
closed
|
[Feature] HubSpot CMS Performance API Saas Integration
|
Enhancement BE Coders Pod SAAS Plugins Integrations Pod
|
## HubSpot CMS Performance API Saas Integration
### Description:
With the CMS Performance API, you can monitor your HubSpot-hosted pages for performance issues, including: uptime, status, 1xx- 5xx errors, cache hits, total requests, and median response times.
------------------------
### Documentation:
https://developers.hubspot.com/docs/api/cms/performance
------------------------
### TO-DO
------------------------
### API Actions Progress
| Action | EngSpec Doc | SaaS Manager App | SaaS App Run | Test App Run | QA |
| --- | --- | --- | --- | --- | --- |
| View Website's Performance | Yes | Yes | Yes | No | No |
| View Website's Uptime | Yes | Yes | Yes | No | No |
------------------------
### Notes
|
1.0
|
[Feature] HubSpot CMS Performance API Saas Integration - ## HubSpot CMS Performance API Saas Integration
### Description:
With the CMS Performance API, you can monitor your HubSpot-hosted pages for performance issues, including: uptime, status, 1xx- 5xx errors, cache hits, total requests, and median response times.
------------------------
### Documentation:
https://developers.hubspot.com/docs/api/cms/performance
------------------------
### TO-DO
------------------------
### API Actions Progress
| Action | EngSpec Doc | SaaS Manager App | SaaS App Run | Test App Run | QA |
| --- | --- | --- | --- | --- | --- |
| View Website's Performance | Yes | Yes | Yes | No | No |
| View Website's Uptime | Yes | Yes | Yes | No | No |
------------------------
### Notes
|
non_process
|
hubspot cms performance api saas integration hubspot cms performance api saas integration description with the cms performance api you can monitor your hubspot hosted pages for performance issues including uptime status errors cache hits total requests and median response times documentation to do api actions progress action engspec doc saas manager app saas app run test app run qa view website s performance yes yes yes no no view website s uptime yes yes yes no no notes
| 0
|
116,645
| 9,875,308,177
|
IssuesEvent
|
2019-06-23 10:35:32
|
WoWManiaUK/Blackwing-Lair
|
https://api.github.com/repos/WoWManiaUK/Blackwing-Lair
|
closed
|
[NPC][Quest] Hold the line - ID 14347 - Gilneas
|
Fixed Confirmed Fixed in Dev Priority-High Regression Starting Zone Test in progress
|
**Links:**
Quest: https://www.wowhead.com/quest=14347/hold-the-line
NPC: https://www.wowhead.com/npc=34511/forsaken-invader
**What is happening:**
I only found 7/10 Forsaken Invaders. There are no further mobs in this area. It's just empty.
Even after 90 minutes they did not respawn. I hope I don't have to wait for a server restart for them to respawn.


**What should happen:**
Compared to official server, there are a couple of those forsaken invaders right at the questgiver.

|
1.0
|
[NPC][Quest] Hold the line - ID 14347 - Gilneas - **Links:**
Quest: https://www.wowhead.com/quest=14347/hold-the-line
NPC: https://www.wowhead.com/npc=34511/forsaken-invader
**What is happening:**
I only found 7/10 Forsaken Invaders. There are no further mobs in this area. It's just empty.
Even after 90 minutes they did not respawn. I hope I don't have to wait for a server restart for them to respawn.


**What should happen:**
Compared to official server, there are a couple of those forsaken invaders right at the questgiver.

|
non_process
|
hold the line id gilneas links quest npc what is happening i only found forsaken invaders there are no further mobs in this area its just empty even after minutes they did not respawn i hope i don´t have to wait for a server restart for them to respawn what should happen compared to official server there are a couple of those forsaken invaders right at the questgiver
| 0
|
12,094
| 14,740,088,461
|
IssuesEvent
|
2021-01-07 08:29:47
|
kdjstudios/SABillingGitlab
|
https://api.github.com/repos/kdjstudios/SABillingGitlab
|
closed
|
Winnipeg - SA Billing - Late Fee Account List
|
anc-process anp-important ant-bug has attachment
|
In GitLab by @kdjstudios on Oct 3, 2018, 11:10
[Winnipeg.xlsx](/uploads/f715f2d76e02aaf1c4c017c8f661cb3e/Winnipeg.xlsx)
HD: http://www.servicedesk.answernet.com/profiles/ticket/2018-10-03-33332
|
1.0
|
Winnipeg - SA Billing - Late Fee Account List - In GitLab by @kdjstudios on Oct 3, 2018, 11:10
[Winnipeg.xlsx](/uploads/f715f2d76e02aaf1c4c017c8f661cb3e/Winnipeg.xlsx)
HD: http://www.servicedesk.answernet.com/profiles/ticket/2018-10-03-33332
|
process
|
winnipeg sa billing late fee account list in gitlab by kdjstudios on oct uploads winnipeg xlsx hd
| 1
|
466,298
| 13,399,506,584
|
IssuesEvent
|
2020-09-03 14:34:41
|
projectcalico/typha
|
https://api.github.com/repos/projectcalico/typha
|
reopened
|
Typha cannot be recovered when it loses the connection to kube-apiserver
|
kind/bug priority/P1
|
## Expected Behavior
When typha loses the connection to the kube-apiserver, I expect typha to try to recover automatically by retrying. Alternatively, it should set its liveness probe to false or exit the process so that k8s can manage the typha Pod lifecycle.
I noticed the code had been written to retry to connect the kube-apiserver, but it didn't work.
## Current Behavior
When typha loses the connection to the kube-apiserver, typha does not retry (and does not set the liveness probe to false or exit the process).
I think that, because of this typha behavior, calico-node could not access typha, and therefore calico-node could not become ready.
## Possible Solution
For a workaround, I deleted the typha Pod manually (i.e. recreated the Pod). After that, typha worked correctly and calico-node could access typha.
I cannot work out why the typha process does not retry to connect to the kube-apiserver. However, I think that when list/watch fails, typha should set liveness=false, or exit its process to be restarted by k8s.
## Context
This problem occurred when I tried to update k8s from v1.15 to v1.16 without any Pod evacuations. After the update was completed, the typha and calico-node Pods were restarted by the new kubelets.
Some calico-node Pods outputted the following logs repeatedly:
```
2019-12-23 06:52:03.624 [ERROR][18432] main.go 96: Failed to connect to Typha. Retrying... error=dial tcp 10.68.51.44:5473: connect: connection refused
2019-12-23 06:52:03.624 [INFO][18432] sync_client.go 169: Starting Typha client
2019-12-23 06:52:03.624 [INFO][18432] sync_client.go 174: requiringTLS=false
```
typha Pod outputted the following logs only once:
[typha.log](https://gist.github.com/kfyharukz/f231e451721538808248e42612a5a1c0)
## My Environment
It occurred when updating k8s v1.15 to v1.16.
The container image of typha and calico-node was built by the [Dockerfile](https://github.com/cybozu/neco-containers/blob/master/calico/Dockerfile) and calico was deployed with the [manifest](https://github.com/cybozu-go/neco-apps/tree/master/network-policy/base/calico).
## Note
I think reproduction is difficult; however, I saw this situation twice in identically shaped infrastructures. Please provide any additional information if anyone sees the same problem.
|
1.0
|
Typha cannot be recovered when it loses the connection to kube-apiserver - ## Expected Behavior
When typha loses the connection to the kube-apiserver, I expect typha to try to recover automatically by retrying. Alternatively, it should set its liveness probe to false or exit the process so that k8s can manage the typha Pod lifecycle.
I noticed the code had been written to retry to connect the kube-apiserver, but it didn't work.
## Current Behavior
When typha loses the connection to the kube-apiserver, typha does not retry (and does not set the liveness probe to false or exit the process).
I think that, because of this typha behavior, calico-node could not access typha, and therefore calico-node could not become ready.
## Possible Solution
For a workaround, I deleted the typha Pod manually (i.e. recreated the Pod). After that, typha worked correctly and calico-node could access typha.
I cannot work out why the typha process does not retry to connect to the kube-apiserver. However, I think that when list/watch fails, typha should set liveness=false, or exit its process to be restarted by k8s.
## Context
This problem occurred when I tried to update k8s from v1.15 to v1.16 without any Pod evacuations. After the update was completed, the typha and calico-node Pods were restarted by the new kubelets.
Some calico-node Pods outputted the following logs repeatedly:
```
2019-12-23 06:52:03.624 [ERROR][18432] main.go 96: Failed to connect to Typha. Retrying... error=dial tcp 10.68.51.44:5473: connect: connection refused
2019-12-23 06:52:03.624 [INFO][18432] sync_client.go 169: Starting Typha client
2019-12-23 06:52:03.624 [INFO][18432] sync_client.go 174: requiringTLS=false
```
typha Pod outputted the following logs only once:
[typha.log](https://gist.github.com/kfyharukz/f231e451721538808248e42612a5a1c0)
## My Environment
It occurred when updating k8s v1.15 to v1.16.
The container image of typha and calico-node was built by the [Dockerfile](https://github.com/cybozu/neco-containers/blob/master/calico/Dockerfile) and calico was deployed with the [manifest](https://github.com/cybozu-go/neco-apps/tree/master/network-policy/base/calico).
## Note
I think reproduction is difficult; however, I saw this situation twice on identically shaped infrastructure. Please provide any additional information if anyone sees the same problem.
|
non_process
|
typha cannot be recovered when it loses the connection to kube apiserver expected behavior when typha lost the connection to the kube apiserver i expect that typha tries to recover automatically by retrying or it should set liveness probe to false or exit the process to enable for to operate typha pod lifecycle i noticed the code had been written to retry to connect the kube apiserver but it didn t work current behavior when typha lost the connection to the kube apiserver typha does not retry and does not set liveness probe to false or exit the process i think because of the above typha behavior calico node could not access to typha then calico node could not become ready possible solution for a workaround i deleted typha pod manually i e recreated the pod after that typha worked correctly and calico node could access typha i cannot make clear why typha process does not retry to connect the kube apiserver however i think when list watch fails typha should set liveness false or exit its process to be restarted by context this problem occurred when i try to update from to without any pod evacuations after the update was completed typha and calico node pods were restarted by new kubelets some calico node pods outputted the following logs repeatedly main go failed to connect to typha retrying error dial tcp connect connection refused sync client go starting typha client sync client go requiringtls false typha pod outputted the following logs only once my environment it occurred when updating to the container image of typha and calico node was built by the and calico was deployed with the note i think reproduction is difficult however i saw this situation twice in the same shaped infrastructures please provide any additional information if anyone sees the same problem
| 0
|
62,622
| 7,613,135,382
|
IssuesEvent
|
2018-05-01 20:06:12
|
pandas-dev/pandas
|
https://api.github.com/repos/pandas-dev/pandas
|
closed
|
BUG: concat unwantedly sorts DataFrame column names if they differ
|
API Design Difficulty Intermediate Effort Low Prio-high Reshaping
|
When concat'ing DataFrames, the column names get alphanumerically sorted if there are any differences between them. If they're identical across DataFrames, they don't get sorted.
This sort is undocumented and unwanted. Certainly the default behavior should be no-sort.
``` python
df4a = DataFrame(columns=['C','B','D','A'], data=np.random.randn(3,4))
df4b = DataFrame(columns=['C','B','D','A'], data=np.random.randn(3,4))
df5 = DataFrame(columns=['C','B','E','D','A'], data=np.random.randn(3,5))
print "Cols unsorted:", concat([df4a,df4b])
# Cols unsorted: C B D A
print "Cols sorted", concat([df4a,df5])
# Cols sorted A B C D E
```
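For reference, later pandas versions expose this as a `sort` keyword on `concat` (added in 0.23; the default became `sort=False` in 1.0), so the no-sort behavior can be requested explicitly. A small sketch:
```python
import numpy as np
import pandas as pd

df4a = pd.DataFrame(columns=['C', 'B', 'D', 'A'], data=np.random.randn(3, 4))
df5 = pd.DataFrame(columns=['C', 'B', 'E', 'D', 'A'], data=np.random.randn(3, 5))

# sort=False keeps columns in order of first appearance instead of sorting
print(pd.concat([df4a, df5], sort=False).columns.tolist())
# ['C', 'B', 'D', 'A', 'E']
```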
|
1.0
|
BUG: concat unwantedly sorts DataFrame column names if they differ - When concat'ing DataFrames, the column names get alphanumerically sorted if there are any differences between them. If they're identical across DataFrames, they don't get sorted.
This sort is undocumented and unwanted. Certainly the default behavior should be no-sort.
``` python
df4a = DataFrame(columns=['C','B','D','A'], data=np.random.randn(3,4))
df4b = DataFrame(columns=['C','B','D','A'], data=np.random.randn(3,4))
df5 = DataFrame(columns=['C','B','E','D','A'], data=np.random.randn(3,5))
print "Cols unsorted:", concat([df4a,df4b])
# Cols unsorted: C B D A
print "Cols sorted", concat([df4a,df5])
# Cols sorted A B C D E
```
|
non_process
|
bug concat unwantedly sorts dataframe column names if they differ when concat ing dataframes the column names get alphanumerically sorted if there are any differences between them if they re identical across dataframes they don t get sorted this sort is undocumented and unwanted certainly the default behavior should be no sort python dataframe columns data np random randn dataframe columns data np random randn dataframe columns data np random randn print cols unsorted concat cols unsorted c b d a print cols sorted concat cols sorted a b c d e
| 0
|
99,490
| 8,701,620,949
|
IssuesEvent
|
2018-12-05 12:07:43
|
Orlandohub/haha-studio
|
https://api.github.com/repos/Orlandohub/haha-studio
|
closed
|
Run E2E tests with Docker
|
E2E tests
|
1 - install Docker (https://store.docker.com/editions/community/docker-ce-desktop-windows)
2 - pull latest from `master`
3 - run `docker build -t cypress-orlando:latest .` command (notice there's a dot at the end which **MUST** be included. Also this will take a long time to process)
4 - run `docker run cypress-orlando:latest`
|
1.0
|
Run E2E tests with Docker - 1 - install Docker (https://store.docker.com/editions/community/docker-ce-desktop-windows)
2 - pull latest from `master`
3 - run `docker build -t cypress-orlando:latest .` command (notice there's a dot at the end which **MUST** be included. Also this will take a long time to process)
4 - run `docker run cypress-orlando:latest`
|
non_process
|
run tests with docker install docker pull latest from master run docker build t cypress orlando latest command notice there s a dot at the end which must be included also this will take a long time to process run docker run cypress orlando latest
| 0
|
22,395
| 31,142,287,899
|
IssuesEvent
|
2023-08-16 01:44:25
|
cypress-io/cypress
|
https://api.github.com/repos/cypress-io/cypress
|
closed
|
Flaky test: Timed out retrying after 4000ms: expected 0 to be one of [ 200, 304 ]
|
OS: linux process: flaky test topic: flake ❄️ stage: flake stale
|
### Link to dashboard or CircleCI failure
https://dashboard.cypress.io/projects/ypt4pf/runs/37673/test-results/dd3f8853-752f-4b8a-adde-177542278f02
### Link to failing test in GitHub
https://github.com/cypress-io/cypress/blob/develop/packages/driver/cypress/e2e/commands/xhr.cy.js#L2408
### Analysis
<img width="420" alt="Screen Shot 2022-08-10 at 9 32 32 AM" src="https://user-images.githubusercontent.com/26726429/183964307-c58d1b66-0528-44bc-8c18-ab05b77ee1aa.png">
### Cypress Version
10.4.0
### Other
Search for this issue number in the codebase to find the test(s) skipped until this issue is fixed
|
1.0
|
Flaky test: Timed out retrying after 4000ms: expected 0 to be one of [ 200, 304 ] - ### Link to dashboard or CircleCI failure
https://dashboard.cypress.io/projects/ypt4pf/runs/37673/test-results/dd3f8853-752f-4b8a-adde-177542278f02
### Link to failing test in GitHub
https://github.com/cypress-io/cypress/blob/develop/packages/driver/cypress/e2e/commands/xhr.cy.js#L2408
### Analysis
<img width="420" alt="Screen Shot 2022-08-10 at 9 32 32 AM" src="https://user-images.githubusercontent.com/26726429/183964307-c58d1b66-0528-44bc-8c18-ab05b77ee1aa.png">
### Cypress Version
10.4.0
### Other
Search for this issue number in the codebase to find the test(s) skipped until this issue is fixed
|
process
|
flaky test timed out retrying after expected to be one of link to dashboard or circleci failure link to failing test in github analysis img width alt screen shot at am src cypress version other search for this issue number in the codebase to find the test s skipped until this issue is fixed
| 1
|
594
| 3,069,019,463
|
IssuesEvent
|
2015-08-18 18:25:21
|
PHPOffice/PHPWord
|
https://api.github.com/repos/PHPOffice/PHPWord
|
reopened
|
Template too slow after 100 record.
|
Consulting Request Responded Template Processor
|
Hi,
Thanks for your product.
I have a problem with the template cloneRow.
I have approximately 150 records, and the template takes about 500 seconds to create the document. Is that normal?
|
1.0
|
Template too slow after 100 record. - Hi,
Thanks for your product.
I have a problem with the template cloneRow.
I have approximately 150 records, and the template takes about 500 seconds to create the document. Is that normal?
|
process
|
template too slow after record hi thanks for your product i have problem with template clonerow i have approx record and template create document approx second is it normal
| 1
|
9,964
| 13,003,324,328
|
IssuesEvent
|
2020-07-24 06:26:22
|
qgis/QGIS
|
https://api.github.com/repos/qgis/QGIS
|
closed
|
Refactor Fields Expression Dialogue does not recognise fields from the selected table
|
Bug Processing
|
**Describe the bug**
when developing an expression in the "Expression Dialog" of the "Refactor Fields" tool, the preview reports "Expression is invalid" when entering a field that exists in the table the tool is operating on, despite the field existing and the expression being valid
**How to Reproduce**
1. Use the refactor fields tool from the processing toolbox on any feature layer
2. go into the "Expression Dialog" for any field to be mapped
3. even with the default expression of the current field that is mapped the expression is shown as invalid
4. When clicking on "(more info)" the user is presented with "Eval Error: Column '<chosen_field>' not found" where the <chosen_field> is the desired field to be mapped.
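As a point of comparison, here is a minimal PyQGIS sketch (illustrative only; run in the QGIS Python console, and `"chosen_field"` is a placeholder) showing that an expression only resolves column names when its evaluation context carries the layer's scopes, which is the kind of gap that produces the "Eval Error: Column not found" message:
```python
from qgis.core import (QgsProject, QgsExpression, QgsExpressionContext,
                       QgsExpressionContextUtils)

# Hypothetical: take the first layer of the currently open project
layer = list(QgsProject.instance().mapLayers().values())[0]

expr = QgsExpression('"chosen_field"')  # placeholder field reference

# Column names resolve only if the context includes the layer's scopes
ctx = QgsExpressionContext(QgsExpressionContextUtils.globalProjectLayerScopes(layer))

for feature in layer.getFeatures():
    ctx.setFeature(feature)
    value = expr.evaluate(ctx)
    if expr.hasEvalError():
        print(expr.evalErrorString())  # e.g. Column 'chosen_field' not found
    break
```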
**QGIS and OS versions**
QGIS version: 3.14.1-Pi
QGIS code revision: de08d6b71d
Compiled against Qt: 5.11.2
Running against Qt: 5.11.2
Compiled against GDAL/OGR: 3.0.4
Running against GDAL/OGR: 3.0.4
Compiled against GEOS: 3.8.1-CAPI-1.13.3
Running against GEOS: 3.8.1-CAPI-1.13.3
Compiled against SQLite: 3.29.0
Running against SQLite: 3.29.0
PostgreSQL Client Version: 11.5
SpatiaLite Version: 4.3.0
QWT Version: 6.1.3
QScintilla2 Version: 2.10.8
Compiled against PROJ: 6.3.2
Running against PROJ: Rel. 6.3.2, May 1st, 2020
OS Version: Windows 10 (10.0)
Active python plugins
crayfish;
FreehandRasterGeoreferencer;
go2streetview;
icsm_ntv2_transformer;
kmltools;
LAStools;
latlontools;
mapswipetool_plugin;
mmqgis;
NNJoin;
openlayers_plugin;
OSMDownloader;
qconsolidate3;
qNote;
SDEllipse;
searchlayers;
SRTM-Downloader;
StreetView;
TerrainShading;
timemanager;
valuetool;
db_manager;
MetaSearch;
processing
|
1.0
|
Refactor Fields Expression Dialogue does not recognise fields from the selected table - **Describe the bug**
when developing an expression in the "Expression Dialog" of the "Refactor Fields" tool, the preview reports "Expression is invalid" when entering a field that exists in the table the tool is operating on, despite the field existing and the expression being valid
**How to Reproduce**
1. Use the refactor fields tool from the processing toolbox on any feature layer
2. go into the "Expression Dialog" for any field to be mapped
3. even with the default expression of the current field that is mapped the expression is shown as invalid
4. When clicking on "(more info)" the user is presented with "Eval Error: Column '<chosen_field>' not found" where the <chosen_field> is the desired field to be mapped.
**QGIS and OS versions**
QGIS version: 3.14.1-Pi
QGIS code revision: de08d6b71d
Compiled against Qt: 5.11.2
Running against Qt: 5.11.2
Compiled against GDAL/OGR: 3.0.4
Running against GDAL/OGR: 3.0.4
Compiled against GEOS: 3.8.1-CAPI-1.13.3
Running against GEOS: 3.8.1-CAPI-1.13.3
Compiled against SQLite: 3.29.0
Running against SQLite: 3.29.0
PostgreSQL Client Version: 11.5
SpatiaLite Version: 4.3.0
QWT Version: 6.1.3
QScintilla2 Version: 2.10.8
Compiled against PROJ: 6.3.2
Running against PROJ: Rel. 6.3.2, May 1st, 2020
OS Version: Windows 10 (10.0)
Active python plugins
crayfish;
FreehandRasterGeoreferencer;
go2streetview;
icsm_ntv2_transformer;
kmltools;
LAStools;
latlontools;
mapswipetool_plugin;
mmqgis;
NNJoin;
openlayers_plugin;
OSMDownloader;
qconsolidate3;
qNote;
SDEllipse;
searchlayers;
SRTM-Downloader;
StreetView;
TerrainShading;
timemanager;
valuetool;
db_manager;
MetaSearch;
processing
|
process
|
refactor fields expression dialogue does not recognise fields from the selected table describe the bug when developing an expression in the expression dialog of the refactor fields tool the preview reports that the expression is invalid when entering a field that exists in the table that the refactor field is operating on this is despite the field existing and the expression being valid how to reproduce use the refactor fields tool from the processing toolbox on any feature layer go into the expression dialog for any field to be mapped even with the default expression of the current field that is mapped the expression is shown as invalid when clicking on more info the user is presented with eval error column not found where the is the desired field to be mapped qgis and os versions qgis version pi qgis code revision compiled against qt running against qt compiled against gdal ogr running against gdal ogr compiled against geos capi running against geos capi compiled against sqlite running against sqlite postgresql client version spatialite version qwt version version compiled against proj running against proj rel may os version windows active python plugins crayfish freehandrastergeoreferencer icsm transformer kmltools lastools latlontools mapswipetool plugin mmqgis nnjoin openlayers plugin osmdownloader qnote sdellipse searchlayers srtm downloader streetview terrainshading timemanager valuetool db manager metasearch processing
| 1
|
355,699
| 10,583,612,693
|
IssuesEvent
|
2019-10-08 14:01:40
|
angular/angular-cli
|
https://api.github.com/repos/angular/angular-cli
|
closed
|
Empty bundles when using AngularCompilerPlugin
|
priority: 2 (required) severity3: broken
|
### Versions
```
angular: 5.1.0
node: 6.10.0
npm: 5.4.6
Windows 10
```
### Repro steps
* Take working AOT-compile-able project, change `AotPlugin` to `AngularCompilerPlugin`
* Run your build
* Observe that bundles are only 8kb
### Observed behavior
The builds succeed, but the output files contain no code from the project. Example app.min.js file:
```
webpackJsonp([1],[function(n,c){}],[0]);
```
### Desired behavior
I would like to see my code in bundles
### Mention any other details that might be useful (optional)
|
1.0
|
Empty bundles when using AngularCompilerPlugin - ### Versions
```
angular: 5.1.0
node: 6.10.0
npm: 5.4.6
Windows 10
```
### Repro steps
* Take working AOT-compile-able project, change `AotPlugin` to `AngularCompilerPlugin`
* Run your build
* Observe that bundles are only 8kb
### Observed behavior
The builds succeed, but the output files contain no code from the project. Example app.min.js file:
```
webpackJsonp([1],[function(n,c){}],[0]);
```
### Desired behavior
I would like to see my code in bundles
### Mention any other details that might be useful (optional)
|
non_process
|
empty bundles when using angularcompilerplugin versions angular node npm windows repro steps take working aot compile able project change aotplugin to angularcompilerplugin run your build observe that bundles are only observed behavior the builds succeed but the output files contain no code from the project example app min js file webpackjsonp desired behavior i would like to see my code in bundles mention any other details that might be useful optional
| 0
|
16,861
| 22,142,650,921
|
IssuesEvent
|
2022-06-03 08:34:57
|
prisma/prisma
|
https://api.github.com/repos/prisma/prisma
|
opened
|
Error: Error in migration engine. Reason: [migration-engine/core/src/commands/diff.rs:127:22] internal error: entered unreachable code: no provider, no shadow database url for migrations target
|
bug/2-confirmed kind/bug process/candidate tech/engines/migration engine topic: error reporting team/schema
|
<!-- If required, please update the title to be clear and descriptive -->
We are entering unreachable code, so there is a logic error somewhere.
Command: `prisma migrate diff --from-migrations prisma/migrations/ --to-migrations prisma/_migrations --script`
Version: `3.14.0`
Binary Version: `2b0c12756921c891fec4f68d9444e18c7d5d4a6a`
Report: https://prisma-errors.netlify.app/report/14009
OS: `x64 linux 5.15.0-33-generic`
JS Stacktrace:
```
Error: Error in migration engine.
Reason: [migration-engine/core/src/commands/diff.rs:127:22] internal error: entered unreachable code: no provider, no shadow database url for migrations target
Please create an issue with your `schema.prisma` at
https://github.com/prisma/prisma/issues/new
at handlePanic (/tmp/a/node_modules/.pnpm/prisma@3.14.0/node_modules/prisma/build/index.js:112999:25)
at ChildProcess.<anonymous> (/tmp/a/node_modules/.pnpm/prisma@3.14.0/node_modules/prisma/build/index.js:113008:15)
at ChildProcess.emit (events.js:400:28)
at Process.ChildProcess._handle.onexit (internal/child_process.js:282:12)
```
Rust Stacktrace:
```
Starting migration engine RPC server
[migration-engine/core/src/commands/diff.rs:127:22] internal error: entered unreachable code: no provider, no shadow database url for migrations target
```
|
1.0
|
Error: Error in migration engine. Reason: [migration-engine/core/src/commands/diff.rs:127:22] internal error: entered unreachable code: no provider, no shadow database url for migrations target - <!-- If required, please update the title to be clear and descriptive -->
We are entering unreachable code, so there is a logic error somewhere.
Command: `prisma migrate diff --from-migrations prisma/migrations/ --to-migrations prisma/_migrations --script`
Version: `3.14.0`
Binary Version: `2b0c12756921c891fec4f68d9444e18c7d5d4a6a`
Report: https://prisma-errors.netlify.app/report/14009
OS: `x64 linux 5.15.0-33-generic`
JS Stacktrace:
```
Error: Error in migration engine.
Reason: [migration-engine/core/src/commands/diff.rs:127:22] internal error: entered unreachable code: no provider, no shadow database url for migrations target
Please create an issue with your `schema.prisma` at
https://github.com/prisma/prisma/issues/new
at handlePanic (/tmp/a/node_modules/.pnpm/prisma@3.14.0/node_modules/prisma/build/index.js:112999:25)
at ChildProcess.<anonymous> (/tmp/a/node_modules/.pnpm/prisma@3.14.0/node_modules/prisma/build/index.js:113008:15)
at ChildProcess.emit (events.js:400:28)
at Process.ChildProcess._handle.onexit (internal/child_process.js:282:12)
```
Rust Stacktrace:
```
Starting migration engine RPC server
[migration-engine/core/src/commands/diff.rs:127:22] internal error: entered unreachable code: no provider, no shadow database url for migrations target
```
|
process
|
error error in migration engine reason internal error entered unreachable code no provider no shadow database url for migrations target we are entering unreachable code so there is a logic error somewhere command prisma migrate diff from migrations prisma migrations to migrations prisma migrations script version binary version report os linux generic js stacktrace error error in migration engine reason internal error entered unreachable code no provider no shadow database url for migrations target please create an issue with your schema prisma at at handlepanic tmp a node modules pnpm prisma node modules prisma build index js at childprocess tmp a node modules pnpm prisma node modules prisma build index js at childprocess emit events js at process childprocess handle onexit internal child process js rust stacktrace starting migration engine rpc server internal error entered unreachable code no provider no shadow database url for migrations target
| 1
|
12,237
| 14,743,709,527
|
IssuesEvent
|
2021-01-07 14:18:28
|
kdjstudios/SABillingGitlab
|
https://api.github.com/repos/kdjstudios/SABillingGitlab
|
closed
|
Multiple Non Finalised Billing Cycle |Parent: 1574
|
anc-process anp-1 ant-bug has attachment
|
In GitLab by @smasih on Sep 19, 2019, 06:19
**Submitted by:**
**Helpdesk:**
**Server:** Internal
**Client/Site:** SA Hosted
**Account:** NA
**Issue:**
Fix the code that is causing multiple non-finalized billing cycles

|
1.0
|
Multiple Non Finalised Billing Cycle |Parent: 1574 - In GitLab by @smasih on Sep 19, 2019, 06:19
**Submitted by:**
**Helpdesk:**
**Server:** Internal
**Client/Site:** SA Hosted
**Account:** NA
**Issue:**
Fix the code that is causing multiple non-finalized billing cycles

|
process
|
multiple non finalised billing cycle parent in gitlab by smasih on sep submitted by helpdesk server internal client site sa hosted account na issue fix the code that is causing multiple no finalized billing cycles uploads image png
| 1
|
7,009
| 10,151,164,379
|
IssuesEvent
|
2019-08-05 19:35:38
|
googleapis/google-cloud-python
|
https://api.github.com/repos/googleapis/google-cloud-python
|
closed
|
Synthesis failed for logging
|
api: logging autosynth failure type: process
|
Hello! Autosynth couldn't regenerate logging. :broken_heart:
Here's the output from running `synth.py`:
```
Cloning into 'working_repo'...
Switched to branch 'autosynth-logging'
Running synthtool
['/tmpfs/src/git/autosynth/env/bin/python3', '-m', 'synthtool', 'synth.py', '--']
synthtool > Executing /tmpfs/src/git/autosynth/working_repo/logging/synth.py.
synthtool > Ensuring dependencies.
synthtool > Pulling artman image.
latest: Pulling from googleapis/artman
Digest: sha256:6929f343c400122d85818195b18613330a12a014bffc1e08499550d40571479d
Status: Image is up to date for googleapis/artman:latest
synthtool > Cloning googleapis.
synthtool > Running generator for google/logging/artman_logging.yaml.
synthtool > Generated code into /home/kbuilder/.cache/synthtool/googleapis/artman-genfiles/python/logging-v2.
synthtool > Copy: /home/kbuilder/.cache/synthtool/googleapis/google/logging/v2/logging.proto to /home/kbuilder/.cache/synthtool/googleapis/artman-genfiles/python/logging-v2/google/cloud/logging_v2/proto/logging.proto
synthtool > Copy: /home/kbuilder/.cache/synthtool/googleapis/google/logging/v2/logging_config.proto to /home/kbuilder/.cache/synthtool/googleapis/artman-genfiles/python/logging-v2/google/cloud/logging_v2/proto/logging_config.proto
synthtool > Copy: /home/kbuilder/.cache/synthtool/googleapis/google/logging/v2/log_entry.proto to /home/kbuilder/.cache/synthtool/googleapis/artman-genfiles/python/logging-v2/google/cloud/logging_v2/proto/log_entry.proto
synthtool > Copy: /home/kbuilder/.cache/synthtool/googleapis/google/logging/v2/logging_metrics.proto to /home/kbuilder/.cache/synthtool/googleapis/artman-genfiles/python/logging-v2/google/cloud/logging_v2/proto/logging_metrics.proto
synthtool > Placed proto files into /home/kbuilder/.cache/synthtool/googleapis/artman-genfiles/python/logging-v2/google/cloud/logging_v2/proto.
synthtool > Replaced '(^.*$\\n)*' in google/cloud/logging_v2/proto/logging_config_pb2.py.
synthtool > Replaced '(^.*$\\n)*' in google/cloud/logging_v2/proto/logging_metrics_pb2.py.
synthtool > Replaced '(^.*$\\n)*' in google/cloud/logging_v2/proto/logging_pb2.py.
synthtool > Replaced '(^.*$\\n)*' in google/cloud/logging_v2/proto/log_entry_pb2.py.
synthtool > Replaced 'channel =.*\n(\\s+)address=.*\n\\s+credentials=.*,\n' in google/cloud/logging_v2/gapic/transports/metrics_service_v2_grpc_transport.py.
synthtool > Replaced 'channel =.*\n(\\s+)address=.*\n\\s+credentials=.*,\n' in google/cloud/logging_v2/gapic/transports/logging_service_v2_grpc_transport.py.
synthtool > Replaced 'channel =.*\n(\\s+)address=.*\n\\s+credentials=.*,\n' in google/cloud/logging_v2/gapic/transports/config_service_v2_grpc_transport.py.
.coveragerc
.flake8
MANIFEST.in
noxfile.py.j2
setup.cfg
Running session blacken
Creating virtualenv using python3.6 in .nox/blacken
pip install black
black google tests docs
reformatted /tmpfs/src/git/autosynth/working_repo/logging/google/cloud/logging_v2/gapic/enums.py
reformatted /tmpfs/src/git/autosynth/working_repo/logging/google/cloud/logging_v2/gapic/config_service_v2_client_config.py
reformatted /tmpfs/src/git/autosynth/working_repo/logging/google/cloud/logging_v2/gapic/logging_service_v2_client_config.py
reformatted /tmpfs/src/git/autosynth/working_repo/logging/google/cloud/logging_v2/gapic/metrics_service_v2_client_config.py
error: cannot format /tmpfs/src/git/autosynth/working_repo/logging/google/cloud/logging_v2/gapic/transports/config_service_v2_grpc_transport.py: cannot use --safe with this file; failed to parse source file with Python 3.6's builtin AST. Re-run with --fast or stop using deprecated Python 2 syntax. AST error message: keyword argument repeated (<unknown>, line 73)
error: cannot format /tmpfs/src/git/autosynth/working_repo/logging/google/cloud/logging_v2/gapic/transports/logging_service_v2_grpc_transport.py: cannot use --safe with this file; failed to parse source file with Python 3.6's builtin AST. Re-run with --fast or stop using deprecated Python 2 syntax. AST error message: keyword argument repeated (<unknown>, line 73)
error: cannot format /tmpfs/src/git/autosynth/working_repo/logging/google/cloud/logging_v2/gapic/transports/metrics_service_v2_grpc_transport.py: cannot use --safe with this file; failed to parse source file with Python 3.6's builtin AST. Re-run with --fast or stop using deprecated Python 2 syntax. AST error message: keyword argument repeated (<unknown>, line 73)
reformatted /tmpfs/src/git/autosynth/working_repo/logging/google/cloud/logging_v2/gapic/logging_service_v2_client.py
reformatted /tmpfs/src/git/autosynth/working_repo/logging/google/cloud/logging_v2/proto/log_entry_pb2_grpc.py
reformatted /tmpfs/src/git/autosynth/working_repo/logging/google/cloud/logging_v2/gapic/metrics_service_v2_client.py
reformatted /tmpfs/src/git/autosynth/working_repo/logging/google/cloud/logging_v2/gapic/config_service_v2_client.py
reformatted /tmpfs/src/git/autosynth/working_repo/logging/google/cloud/logging_v2/proto/logging_config_pb2_grpc.py
reformatted /tmpfs/src/git/autosynth/working_repo/logging/google/cloud/logging_v2/proto/logging_metrics_pb2_grpc.py
reformatted /tmpfs/src/git/autosynth/working_repo/logging/google/cloud/logging_v2/proto/log_entry_pb2.py
reformatted /tmpfs/src/git/autosynth/working_repo/logging/google/cloud/logging_v2/proto/logging_pb2_grpc.py
reformatted /tmpfs/src/git/autosynth/working_repo/logging/google/cloud/logging_v2/proto/logging_metrics_pb2.py
reformatted /tmpfs/src/git/autosynth/working_repo/logging/google/cloud/logging_v2/proto/logging_config_pb2.py
reformatted /tmpfs/src/git/autosynth/working_repo/logging/tests/unit/gapic/v2/test_logging_service_v2_client_v2.py
reformatted /tmpfs/src/git/autosynth/working_repo/logging/tests/unit/gapic/v2/test_config_service_v2_client_v2.py
reformatted /tmpfs/src/git/autosynth/working_repo/logging/google/cloud/logging_v2/proto/logging_pb2.py
reformatted /tmpfs/src/git/autosynth/working_repo/logging/tests/unit/gapic/v2/test_metrics_service_v2_client_v2.py
All done! 💥 💔 💥
18 files reformatted, 52 files left unchanged, 3 files failed to reformat.
Command black google tests docs failed with exit code 123
Session blacken failed.
synthtool > Failed executing nox -s blacken:
None
Traceback (most recent call last):
File "/home/kbuilder/.pyenv/versions/3.6.1/lib/python3.6/runpy.py", line 193, in _run_module_as_main
"__main__", mod_spec)
File "/home/kbuilder/.pyenv/versions/3.6.1/lib/python3.6/runpy.py", line 85, in _run_code
exec(code, run_globals)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/synthtool/__main__.py", line 87, in <module>
main()
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/click/core.py", line 764, in __call__
return self.main(*args, **kwargs)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/click/core.py", line 717, in main
rv = self.invoke(ctx)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/click/core.py", line 956, in invoke
return ctx.invoke(self.callback, **ctx.params)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/click/core.py", line 555, in invoke
return callback(*args, **kwargs)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/synthtool/__main__.py", line 79, in main
spec.loader.exec_module(synth_module) # type: ignore
File "<frozen importlib._bootstrap_external>", line 678, in exec_module
File "<frozen importlib._bootstrap>", line 205, in _call_with_frames_removed
File "/tmpfs/src/git/autosynth/working_repo/logging/synth.py", line 56, in <module>
s.shell.run(["nox", "-s", "blacken"], hide_output=False)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/synthtool/shell.py", line 39, in run
raise exc
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/synthtool/shell.py", line 33, in run
encoding="utf-8",
File "/home/kbuilder/.pyenv/versions/3.6.1/lib/python3.6/subprocess.py", line 418, in run
output=stdout, stderr=stderr)
subprocess.CalledProcessError: Command '['nox', '-s', 'blacken']' returned non-zero exit status 1.
synthtool > Cleaned up 2 temporary directories.
synthtool > Wrote metadata to synth.metadata.
Synthesis failed
```
Google internal developers can see the full log [here](https://sponge/38334508-4b16-4eff-951d-9a2edc2c06fb).
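For context, the blacken failure above is a parse error: a repeated keyword argument is invalid syntax in Python 3, which black's safe mode refuses to format. A hypothetical minimal reproduction of that failure mode:
```python
# Simulate parsing generated code that repeats a keyword argument
src = 'channel = make_channel(address="example", credentials=1, credentials=1)'

try:
    compile(src, "<generated>", "exec")
except SyntaxError as exc:
    print(exc.msg)  # keyword argument repeated (CPython's exact wording may vary)
```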
|
1.0
|
Synthesis failed for logging - Hello! Autosynth couldn't regenerate logging. :broken_heart:
Here's the output from running `synth.py`:
```
Cloning into 'working_repo'...
Switched to branch 'autosynth-logging'
Running synthtool
['/tmpfs/src/git/autosynth/env/bin/python3', '-m', 'synthtool', 'synth.py', '--']
synthtool > Executing /tmpfs/src/git/autosynth/working_repo/logging/synth.py.
synthtool > Ensuring dependencies.
synthtool > Pulling artman image.
latest: Pulling from googleapis/artman
Digest: sha256:6929f343c400122d85818195b18613330a12a014bffc1e08499550d40571479d
Status: Image is up to date for googleapis/artman:latest
synthtool > Cloning googleapis.
synthtool > Running generator for google/logging/artman_logging.yaml.
synthtool > Generated code into /home/kbuilder/.cache/synthtool/googleapis/artman-genfiles/python/logging-v2.
synthtool > Copy: /home/kbuilder/.cache/synthtool/googleapis/google/logging/v2/logging.proto to /home/kbuilder/.cache/synthtool/googleapis/artman-genfiles/python/logging-v2/google/cloud/logging_v2/proto/logging.proto
synthtool > Copy: /home/kbuilder/.cache/synthtool/googleapis/google/logging/v2/logging_config.proto to /home/kbuilder/.cache/synthtool/googleapis/artman-genfiles/python/logging-v2/google/cloud/logging_v2/proto/logging_config.proto
synthtool > Copy: /home/kbuilder/.cache/synthtool/googleapis/google/logging/v2/log_entry.proto to /home/kbuilder/.cache/synthtool/googleapis/artman-genfiles/python/logging-v2/google/cloud/logging_v2/proto/log_entry.proto
synthtool > Copy: /home/kbuilder/.cache/synthtool/googleapis/google/logging/v2/logging_metrics.proto to /home/kbuilder/.cache/synthtool/googleapis/artman-genfiles/python/logging-v2/google/cloud/logging_v2/proto/logging_metrics.proto
synthtool > Placed proto files into /home/kbuilder/.cache/synthtool/googleapis/artman-genfiles/python/logging-v2/google/cloud/logging_v2/proto.
synthtool > Replaced '(^.*$\\n)*' in google/cloud/logging_v2/proto/logging_config_pb2.py.
synthtool > Replaced '(^.*$\\n)*' in google/cloud/logging_v2/proto/logging_metrics_pb2.py.
synthtool > Replaced '(^.*$\\n)*' in google/cloud/logging_v2/proto/logging_pb2.py.
synthtool > Replaced '(^.*$\\n)*' in google/cloud/logging_v2/proto/log_entry_pb2.py.
synthtool > Replaced 'channel =.*\n(\\s+)address=.*\n\\s+credentials=.*,\n' in google/cloud/logging_v2/gapic/transports/metrics_service_v2_grpc_transport.py.
synthtool > Replaced 'channel =.*\n(\\s+)address=.*\n\\s+credentials=.*,\n' in google/cloud/logging_v2/gapic/transports/logging_service_v2_grpc_transport.py.
synthtool > Replaced 'channel =.*\n(\\s+)address=.*\n\\s+credentials=.*,\n' in google/cloud/logging_v2/gapic/transports/config_service_v2_grpc_transport.py.
.coveragerc
.flake8
MANIFEST.in
noxfile.py.j2
setup.cfg
Running session blacken
Creating virtualenv using python3.6 in .nox/blacken
pip install black
black google tests docs
reformatted /tmpfs/src/git/autosynth/working_repo/logging/google/cloud/logging_v2/gapic/enums.py
reformatted /tmpfs/src/git/autosynth/working_repo/logging/google/cloud/logging_v2/gapic/config_service_v2_client_config.py
reformatted /tmpfs/src/git/autosynth/working_repo/logging/google/cloud/logging_v2/gapic/logging_service_v2_client_config.py
reformatted /tmpfs/src/git/autosynth/working_repo/logging/google/cloud/logging_v2/gapic/metrics_service_v2_client_config.py
error: cannot format /tmpfs/src/git/autosynth/working_repo/logging/google/cloud/logging_v2/gapic/transports/config_service_v2_grpc_transport.py: cannot use --safe with this file; failed to parse source file with Python 3.6's builtin AST. Re-run with --fast or stop using deprecated Python 2 syntax. AST error message: keyword argument repeated (<unknown>, line 73)
error: cannot format /tmpfs/src/git/autosynth/working_repo/logging/google/cloud/logging_v2/gapic/transports/logging_service_v2_grpc_transport.py: cannot use --safe with this file; failed to parse source file with Python 3.6's builtin AST. Re-run with --fast or stop using deprecated Python 2 syntax. AST error message: keyword argument repeated (<unknown>, line 73)
error: cannot format /tmpfs/src/git/autosynth/working_repo/logging/google/cloud/logging_v2/gapic/transports/metrics_service_v2_grpc_transport.py: cannot use --safe with this file; failed to parse source file with Python 3.6's builtin AST. Re-run with --fast or stop using deprecated Python 2 syntax. AST error message: keyword argument repeated (<unknown>, line 73)
reformatted /tmpfs/src/git/autosynth/working_repo/logging/google/cloud/logging_v2/gapic/logging_service_v2_client.py
reformatted /tmpfs/src/git/autosynth/working_repo/logging/google/cloud/logging_v2/proto/log_entry_pb2_grpc.py
reformatted /tmpfs/src/git/autosynth/working_repo/logging/google/cloud/logging_v2/gapic/metrics_service_v2_client.py
reformatted /tmpfs/src/git/autosynth/working_repo/logging/google/cloud/logging_v2/gapic/config_service_v2_client.py
reformatted /tmpfs/src/git/autosynth/working_repo/logging/google/cloud/logging_v2/proto/logging_config_pb2_grpc.py
reformatted /tmpfs/src/git/autosynth/working_repo/logging/google/cloud/logging_v2/proto/logging_metrics_pb2_grpc.py
reformatted /tmpfs/src/git/autosynth/working_repo/logging/google/cloud/logging_v2/proto/log_entry_pb2.py
reformatted /tmpfs/src/git/autosynth/working_repo/logging/google/cloud/logging_v2/proto/logging_pb2_grpc.py
reformatted /tmpfs/src/git/autosynth/working_repo/logging/google/cloud/logging_v2/proto/logging_metrics_pb2.py
reformatted /tmpfs/src/git/autosynth/working_repo/logging/google/cloud/logging_v2/proto/logging_config_pb2.py
reformatted /tmpfs/src/git/autosynth/working_repo/logging/tests/unit/gapic/v2/test_logging_service_v2_client_v2.py
reformatted /tmpfs/src/git/autosynth/working_repo/logging/tests/unit/gapic/v2/test_config_service_v2_client_v2.py
reformatted /tmpfs/src/git/autosynth/working_repo/logging/google/cloud/logging_v2/proto/logging_pb2.py
reformatted /tmpfs/src/git/autosynth/working_repo/logging/tests/unit/gapic/v2/test_metrics_service_v2_client_v2.py
All done! 💥 💔 💥
18 files reformatted, 52 files left unchanged, 3 files failed to reformat.
Command black google tests docs failed with exit code 123
Session blacken failed.
synthtool > Failed executing nox -s blacken:
None
Traceback (most recent call last):
File "/home/kbuilder/.pyenv/versions/3.6.1/lib/python3.6/runpy.py", line 193, in _run_module_as_main
"__main__", mod_spec)
File "/home/kbuilder/.pyenv/versions/3.6.1/lib/python3.6/runpy.py", line 85, in _run_code
exec(code, run_globals)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/synthtool/__main__.py", line 87, in <module>
main()
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/click/core.py", line 764, in __call__
return self.main(*args, **kwargs)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/click/core.py", line 717, in main
rv = self.invoke(ctx)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/click/core.py", line 956, in invoke
return ctx.invoke(self.callback, **ctx.params)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/click/core.py", line 555, in invoke
return callback(*args, **kwargs)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/synthtool/__main__.py", line 79, in main
spec.loader.exec_module(synth_module) # type: ignore
File "<frozen importlib._bootstrap_external>", line 678, in exec_module
File "<frozen importlib._bootstrap>", line 205, in _call_with_frames_removed
File "/tmpfs/src/git/autosynth/working_repo/logging/synth.py", line 56, in <module>
s.shell.run(["nox", "-s", "blacken"], hide_output=False)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/synthtool/shell.py", line 39, in run
raise exc
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/synthtool/shell.py", line 33, in run
encoding="utf-8",
File "/home/kbuilder/.pyenv/versions/3.6.1/lib/python3.6/subprocess.py", line 418, in run
output=stdout, stderr=stderr)
subprocess.CalledProcessError: Command '['nox', '-s', 'blacken']' returned non-zero exit status 1.
synthtool > Cleaned up 2 temporary directories.
synthtool > Wrote metadata to synth.metadata.
Synthesis failed
```
Google internal developers can see the full log [here](https://sponge/38334508-4b16-4eff-951d-9a2edc2c06fb).
|
process
|
synthesis failed for logging hello autosynth couldn t regenerate logging broken heart here s the output from running synth py cloning into working repo switched to branch autosynth logging running synthtool synthtool executing tmpfs src git autosynth working repo logging synth py synthtool ensuring dependencies synthtool pulling artman image latest pulling from googleapis artman digest status image is up to date for googleapis artman latest synthtool cloning googleapis synthtool running generator for google logging artman logging yaml synthtool generated code into home kbuilder cache synthtool googleapis artman genfiles python logging synthtool copy home kbuilder cache synthtool googleapis google logging logging proto to home kbuilder cache synthtool googleapis artman genfiles python logging google cloud logging proto logging proto synthtool copy home kbuilder cache synthtool googleapis google logging logging config proto to home kbuilder cache synthtool googleapis artman genfiles python logging google cloud logging proto logging config proto synthtool copy home kbuilder cache synthtool googleapis google logging log entry proto to home kbuilder cache synthtool googleapis artman genfiles python logging google cloud logging proto log entry proto synthtool copy home kbuilder cache synthtool googleapis google logging logging metrics proto to home kbuilder cache synthtool googleapis artman genfiles python logging google cloud logging proto logging metrics proto synthtool placed proto files into home kbuilder cache synthtool googleapis artman genfiles python logging google cloud logging proto synthtool replaced n in google cloud logging proto logging config py synthtool replaced n in google cloud logging proto logging metrics py synthtool replaced n in google cloud logging proto logging py synthtool replaced n in google cloud logging proto log entry py synthtool replaced channel n s address n s credentials n in google cloud logging gapic transports metrics service grpc transport py synthtool replaced channel n s address n s credentials n in google cloud logging gapic transports logging service grpc transport py synthtool replaced channel n s address n s credentials n in google cloud logging gapic transports config service grpc transport py coveragerc manifest in noxfile py setup cfg running session blacken creating virtualenv using in nox blacken pip install black black google tests docs reformatted tmpfs src git autosynth working repo logging google cloud logging gapic enums py reformatted tmpfs src git autosynth working repo logging google cloud logging gapic config service client config py reformatted tmpfs src git autosynth working repo logging google cloud logging gapic logging service client config py reformatted tmpfs src git autosynth working repo logging google cloud logging gapic metrics service client config py error cannot format tmpfs src git autosynth working repo logging google cloud logging gapic transports config service grpc transport py cannot use safe with this file failed to parse source file with python s builtin ast re run with fast or stop using deprecated python syntax ast error message keyword argument repeated line error cannot format tmpfs src git autosynth working repo logging google cloud logging gapic transports logging service grpc transport py cannot use safe with this file failed to parse source file with python s builtin ast re run with fast or stop using deprecated python syntax ast error message keyword argument repeated line error cannot format tmpfs src git 
autosynth working repo logging google cloud logging gapic transports metrics service grpc transport py cannot use safe with this file failed to parse source file with python s builtin ast re run with fast or stop using deprecated python syntax ast error message keyword argument repeated line reformatted tmpfs src git autosynth working repo logging google cloud logging gapic logging service client py reformatted tmpfs src git autosynth working repo logging google cloud logging proto log entry grpc py reformatted tmpfs src git autosynth working repo logging google cloud logging gapic metrics service client py reformatted tmpfs src git autosynth working repo logging google cloud logging gapic config service client py reformatted tmpfs src git autosynth working repo logging google cloud logging proto logging config grpc py reformatted tmpfs src git autosynth working repo logging google cloud logging proto logging metrics grpc py reformatted tmpfs src git autosynth working repo logging google cloud logging proto log entry py reformatted tmpfs src git autosynth working repo logging google cloud logging proto logging grpc py reformatted tmpfs src git autosynth working repo logging google cloud logging proto logging metrics py reformatted tmpfs src git autosynth working repo logging google cloud logging proto logging config py reformatted tmpfs src git autosynth working repo logging tests unit gapic test logging service client py reformatted tmpfs src git autosynth working repo logging tests unit gapic test config service client py reformatted tmpfs src git autosynth working repo logging google cloud logging proto logging py reformatted tmpfs src git autosynth working repo logging tests unit gapic test metrics service client py all done 💥 💔 💥 files reformatted files left unchanged files failed to reformat command black google tests docs failed with exit code session blacken failed synthtool failed executing nox s blacken none traceback most recent call last file home kbuilder pyenv versions lib runpy py line in run module as main main mod spec file home kbuilder pyenv versions lib runpy py line in run code exec code run globals file tmpfs src git autosynth env lib site packages synthtool main py line in main file tmpfs src git autosynth env lib site packages click core py line in call return self main args kwargs file tmpfs src git autosynth env lib site packages click core py line in main rv self invoke ctx file tmpfs src git autosynth env lib site packages click core py line in invoke return ctx invoke self callback ctx params file tmpfs src git autosynth env lib site packages click core py line in invoke return callback args kwargs file tmpfs src git autosynth env lib site packages synthtool main py line in main spec loader exec module synth module type ignore file line in exec module file line in call with frames removed file tmpfs src git autosynth working repo logging synth py line in s shell run hide output false file tmpfs src git autosynth env lib site packages synthtool shell py line in run raise exc file tmpfs src git autosynth env lib site packages synthtool shell py line in run encoding utf file home kbuilder pyenv versions lib subprocess py line in run output stdout stderr stderr subprocess calledprocesserror command returned non zero exit status synthtool cleaned up temporary directories synthtool wrote metadata to synth metadata synthesis failed google internal developers can see the full log
| 1
|
21,546
| 29,865,318,549
|
IssuesEvent
|
2023-06-20 02:56:44
|
cncf/tag-security
|
https://api.github.com/repos/cncf/tag-security
|
closed
|
[Sec Assess WG] Naming and Scope of assessments
|
help wanted assessment-process suggestion inactive
|
This issue was created from results of the Security Assessment Improvement Working Group (https://github.com/cncf/sig-security/issues/167#issuecomment-714514142).
# Naming and Scope of assessments
## Premise
- Assessment is an overloaded term, and can lead to confusion
## Ideas
- Have a better articulation of what is a sec assess.
- Include scope to include additional aspect of code audit related checks/certification
- Add mapping aspects of assessments to compliance frameworks
- Additional suggestion of scope to include related to security testing
- Implement hands-on security testing #394
- Define requirements for hands-on security testing / best practices
- Must be a process for onboarding and approving new "hands-on" testers
## Logistics
- [ ] Contributors (For multiple contributors, 1 lead to coordinate)
- Placeholder_1
- Placeholder_2
- [ ] SIG-Representative
|
1.0
|
[Sec Assess WG] Naming and Scope of assessments - This issue was created from results of the Security Assessment Improvement Working Group (https://github.com/cncf/sig-security/issues/167#issuecomment-714514142).
# Naming and Scope of assessments
## Premise
- Assessment is an overloaded term, and can lead to confusion
## Ideas
- Have a better articulation of what is a sec assess.
- Include scope to include additional aspect of code audit related checks/certification
- Add mapping aspects of assessments to compliance frameworks
- Additional suggestion of scope to include related to security testing
- Implement hands-on security testing #394
- Define requirements for hands-on security testing / best practices
- Must be a process for onboarding and approving new "hands-on" testers
## Logistics
- [ ] Contributors (For multiple contributors, 1 lead to coordinate)
- Placeholder_1
- Placeholder_2
- [ ] SIG-Representative
|
process
|
naming and scope of assessments this issue was created from results of the security assessment improvement working group naming and scope of assessments premise assessment is an overloaded term and can lead to confusion ideas have a better articulation of what is a sec assess include scope to include additional aspect of code audit related checks certification add mapping aspects of assessments to compliance frameworks additional suggestion of scope to include related to security testing implement hands on security testing define requirements for hands on security testing best practices must be a process for onboarding and approving new hands on testers logistics contributors for multiple contributors lead to coordinate placeholder placeholder sig representative
| 1
|
86,862
| 24,975,052,815
|
IssuesEvent
|
2022-11-02 06:53:40
|
google/mediapipe
|
https://api.github.com/repos/google/mediapipe
|
closed
|
objectdetection3d build to android sdk error: fetching 2 tflite files failed
|
type:build/install platform:android stat:awaiting response platform:tflite solution:object detection stalled
|
Hi! I have problems building objectdetection3d for the Android SDK on Ubuntu. The error is:
```
FAILED: Build did NOT complete successfully (0 packages loaded, 0 targets configured)
Fetching @com_google_mediapipe_object_detection_ssd_mobilenetv2_oidv4_fp16_tflite; fetching
Fetching @com_google_mediapipe_object_detection_3d_sneakers_tflite; fetching
```
However, I succeeded in building objectdetectiongpu and a few other applications. I checked the tflite file and found it is declared in third_party/external_files.bzl, which also contains the URL:
` http_file(
name = "com_google_mediapipe_object_detection_ssd_mobilenetv2_oidv4_fp16_tflite",
sha256 = "d0a5255bf8c4f5a0bc4240741a76c41d5e939f7655078f945f50ab53a9375da6",
urls = ["https://storage.googleapis.com/mediapipe-assets/object_detection_ssd_mobilenetv2_oidv4_fp16.tflite?generation=1661875879063676"],
)`
The URL itself works: I can download the tflite file directly from a browser, but the download fails when I build with the bazel command, as described above.
Can anyone tell me how to fix it? Maybe I can copy the tflite files into some directories so that I can skip the download during the bazel build?
Thanks!
|
1.0
|
objectdetection3d build to android sdk error: fetching 2 tflite files failed - Hi! I have problems building objectdetection3d for the Android SDK on Ubuntu. The error is:
```
FAILED: Build did NOT complete successfully (0 packages loaded, 0 targets configured)
Fetching @com_google_mediapipe_object_detection_ssd_mobilenetv2_oidv4_fp16_tflite; fetching
Fetching @com_google_mediapipe_object_detection_3d_sneakers_tflite; fetching
```
However, I succeeded in building objectdetectiongpu and a few other applications. I checked the tflite file and found it is declared in third_party/external_files.bzl, which also contains the URL:
` http_file(
name = "com_google_mediapipe_object_detection_ssd_mobilenetv2_oidv4_fp16_tflite",
sha256 = "d0a5255bf8c4f5a0bc4240741a76c41d5e939f7655078f945f50ab53a9375da6",
urls = ["https://storage.googleapis.com/mediapipe-assets/object_detection_ssd_mobilenetv2_oidv4_fp16.tflite?generation=1661875879063676"],
)`
The URL itself works: I can download the tflite file directly from a browser, but the download fails when I build with the bazel command, as described above.
Can anyone tell me how to fix it? Maybe I can copy the tflite files into some directories so that I can skip the download during the bazel build?
Thanks!
|
non_process
|
build to android sdk error fetching tflite files failed hi i have problems in building to android sdk on ubuntu the error is failed build did not complete successfully packages loaded targets configured fetching com google mediapipe object detection ssd tflite fetching fetching com google mediapipe object detection sneakers tflite fetching however i succeeded in building objectdetectiongpu and a few other applications i checked the tflite file and found it is in third party external files bzl and also there is urls http file name com google mediapipe object detection ssd tflite urls the urls is good that i can download tflite directly from browser but i can not download when i use bazel command to build as i questioned below can anyone tells me how to fix it maybe i can copy the tflite files into some directories so that i can skip the downloading in bazel building thanks
| 0
|
19,909
| 26,367,207,027
|
IssuesEvent
|
2023-01-11 17:29:20
|
alexandervantrijffel/upptimemonitoring
|
https://api.github.com/repos/alexandervantrijffel/upptimemonitoring
|
closed
|
🛑 Deloitte Process X-Ray is down
|
status deloitte-process-x-ray
|
In [`35f8b8f`](https://github.com/alexandervantrijffel/upptimemonitoring/commit/35f8b8fa0ccb921014ac91316477136d88f79175), Deloitte Process X-Ray (https://processxray.deloitte.com/x/process-x-ray) was **down**:
- HTTP code: 404
- Response time: 785 ms
|
1.0
|
🛑 Deloitte Process X-Ray is down - In [`35f8b8f`](https://github.com/alexandervantrijffel/upptimemonitoring/commit/35f8b8fa0ccb921014ac91316477136d88f79175
), Deloitte Process X-Ray (https://processxray.deloitte.com/x/process-x-ray) was **down**:
- HTTP code: 404
- Response time: 785 ms
|
process
|
🛑 deloitte process x ray is down in deloitte process x ray was down http code response time ms
| 1
|
9,877
| 12,886,398,266
|
IssuesEvent
|
2020-07-13 09:25:24
|
dotenv-linter/dotenv-linter
|
https://api.github.com/repos/dotenv-linter/dotenv-linter
|
closed
|
Release v2.1.0
|
discussion help wanted process
|
What should be included in the new release v2.1.0.
Features:
- [x] Windows support (#211, #213, #216)
- [x] Recursive search for `.env` files (#223)
Improvements:
- [x] Optimize integration tests (#218)
- [x] Consider blank lines in UnorderedKey check (#221)
Infrastructure:
- [x] Provide more short way to install dotenv-linter (#220)
---
What do you think should be included in the new release?
/cc @dotenv-linter/core
|
1.0
|
Release v2.1.0 - What should be included in the new release v2.1.0.
Features:
- [x] Windows support (#211, #213, #216)
- [x] Recursive search for `.env` files (#223)
Improvements:
- [x] Optimize integration tests (#218)
- [x] Consider blank lines in UnorderedKey check (#221)
Infrastructure:
- [x] Provide more short way to install dotenv-linter (#220)
---
What do you think should be included in the new release?
/cc @dotenv-linter/core
|
process
|
release what should be included in the new release features windows support recursive search for env files improvements optimize integration tests consider blank lines in unorderedkey check infrastructure provide more short way to install dotenv linter what do you think should be included in the new release cc dotenv linter core
| 1
|
284,938
| 31,017,856,454
|
IssuesEvent
|
2023-08-10 01:03:55
|
amaybaum-dev/legend-depot-demo2
|
https://api.github.com/repos/amaybaum-dev/legend-depot-demo2
|
opened
|
legend-depot-store-mongo-1.7.6-SNAPSHOT.jar: 4 vulnerabilities (highest severity is: 7.5) reachable
|
Mend: dependency security vulnerability
|
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>legend-depot-store-mongo-1.7.6-SNAPSHOT.jar</b></p></summary>
<p></p>
<p>Path to vulnerable library: /home/wss-scanner/.m2/repository/com/fasterxml/jackson/core/jackson-databind/2.10.5.1/jackson-databind-2.10.5.1.jar</p>
<p>
</details>
## Vulnerabilities
| CVE | Severity | <img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS | Dependency | Type | Fixed in (legend-depot-store-mongo version) | Remediation Possible** | Reachability |
| ------------- | ------------- | ----- | ----- | ----- | ------------- | --- | --- |
| [CVE-2022-42004](https://www.mend.io/vulnerability-database/CVE-2022-42004) | <img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png?' width=19 height=20> High | 7.5 | jackson-databind-2.10.5.1.jar | Transitive | N/A* | ❌ |
| [CVE-2022-42003](https://www.mend.io/vulnerability-database/CVE-2022-42003) | <img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png?' width=19 height=20> High | 7.5 | jackson-databind-2.10.5.1.jar | Transitive | N/A* | ❌|<p align="center"><a href="#">[<img src='https://whitesource-resources.whitesourcesoftware.com/viaRed.png' width=19 height=20>](## 'The vulnerability is likely to be reachable.')</a></p> |
| [CVE-2020-36518](https://www.mend.io/vulnerability-database/CVE-2020-36518) | <img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png?' width=19 height=20> High | 7.5 | jackson-databind-2.10.5.1.jar | Transitive | N/A* | ❌|<p align="center"><a href="#">[<img src='https://whitesource-resources.whitesourcesoftware.com/viaRed.png' width=19 height=20>](## 'The vulnerability is likely to be reachable.')</a></p> |
| [CVE-2021-46877](https://www.mend.io/vulnerability-database/CVE-2021-46877) | <img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png?' width=19 height=20> High | 7.5 | jackson-databind-2.10.5.1.jar | Transitive | N/A* | ❌|<p align="center"><a href="#">[<img src='https://whitesource-resources.whitesourcesoftware.com/viaRed.png' width=19 height=20>](## 'The vulnerability is likely to be reachable.')</a></p> |
<p>*For some transitive vulnerabilities, there is no version of direct dependency with a fix. Check the "Details" section below to see if there is a version of transitive dependency where vulnerability is fixed.</p><p>**In some cases, Remediation PR cannot be created automatically for a vulnerability despite the availability of remediation</p>
## Details
<details>
<summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png?' width=19 height=20> CVE-2022-42004</summary>
### Vulnerable Library - <b>jackson-databind-2.10.5.1.jar</b></p>
<p>General data-binding functionality for Jackson: works on core streaming API</p>
<p>Library home page: <a href="http://github.com/FasterXML/jackson">http://github.com/FasterXML/jackson</a></p>
<p>Path to dependency file: /legend-depot-artifacts-purge/pom.xml</p>
<p>Path to vulnerable library: /home/wss-scanner/.m2/repository/com/fasterxml/jackson/core/jackson-databind/2.10.5.1/jackson-databind-2.10.5.1.jar</p>
<p>
Dependency Hierarchy:
- legend-depot-store-mongo-1.7.6-SNAPSHOT.jar (Root Library)
- legend-depot-model-1.7.6-SNAPSHOT.jar
- legend-engine-protocol-pure-4.4.5.jar
- legend-engine-protocol-4.4.5.jar
- :x: **jackson-databind-2.10.5.1.jar** (Vulnerable Library)
<p>Found in base branch: <b>master</b></p>
</p>
<p></p>
### Vulnerability Details
<p>
In FasterXML jackson-databind before 2.13.4, resource exhaustion can occur because of a lack of a check in BeanDeserializer._deserializeFromArray to prevent use of deeply nested arrays. An application is vulnerable only with certain customized choices for deserialization.
<p>Publish Date: 2022-10-02
<p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2022-42004>CVE-2022-42004</a></p>
</p>
<p></p>
### CVSS 3 Score Details (<b>7.5</b>)
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: None
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
<p></p>
### Suggested Fix
<p>
<p>Type: Upgrade version</p>
<p>Release Date: 2022-10-02</p>
<p>Fix Resolution: com.fasterxml.jackson.core:jackson-databind:2.13.4</p>
</p>
<p></p>
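Beyond the version upgrade, a minimal defensive sketch, assuming the codebase can eventually move to a jackson-core that ships `StreamReadConstraints` (2.15+): capping nesting depth at parse time rejects deeply nested payloads before they reach deserializer internals such as `BeanDeserializer._deserializeFromArray`. The depth limit of 200 is an illustrative assumption, not a value taken from this report.
```
import com.fasterxml.jackson.core.JsonFactory;
import com.fasterxml.jackson.core.StreamReadConstraints;
import com.fasterxml.jackson.databind.ObjectMapper;

public class DepthLimitedMapper {
    public static ObjectMapper build() {
        // Fail fast on documents nested deeper than 200 levels (assumed limit),
        // independent of any custom deserialization configuration.
        JsonFactory factory = JsonFactory.builder()
                .streamReadConstraints(StreamReadConstraints.builder()
                        .maxNestingDepth(200)
                        .build())
                .build();
        return new ObjectMapper(factory);
    }
}
```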
</details><details>
<summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png?' width=19 height=20> <img src='https://whitesource-resources.whitesourcesoftware.com/viaRed.png' width=19 height=20> CVE-2022-42003</summary>
### Vulnerable Library - <b>jackson-databind-2.10.5.1.jar</b></p>
<p>General data-binding functionality for Jackson: works on core streaming API</p>
<p>Library home page: <a href="http://github.com/FasterXML/jackson">http://github.com/FasterXML/jackson</a></p>
<p>Path to dependency file: /legend-depot-artifacts-purge/pom.xml</p>
<p>Path to vulnerable library: /home/wss-scanner/.m2/repository/com/fasterxml/jackson/core/jackson-databind/2.10.5.1/jackson-databind-2.10.5.1.jar</p>
<p>
Dependency Hierarchy:
- legend-depot-store-mongo-1.7.6-SNAPSHOT.jar (Root Library)
- legend-depot-model-1.7.6-SNAPSHOT.jar
- legend-engine-protocol-pure-4.4.5.jar
- legend-engine-protocol-4.4.5.jar
- :x: **jackson-databind-2.10.5.1.jar** (Vulnerable Library)
<p>Found in base branch: <b>master</b></p>
</p>
<p></p>
### Reachability Analysis
<p>
This vulnerability is potentially reachable via the call chain below:
```
org.finos.legend.depot.store.mongo.admin.metrics.QueryMetricsMongo (Application)
-> com.fasterxml.jackson.databind.ObjectMapper (Extension)
-> com.fasterxml.jackson.databind.deser.BeanDeserializerFactory (Extension)
-> com.fasterxml.jackson.databind.deser.std.ThrowableDeserializer (Extension)
-> com.fasterxml.jackson.databind.deser.BeanDeserializerBase (Extension)
-> ❌ com.fasterxml.jackson.databind.deser.std.StdDeserializer (Vulnerable Component)
```
</p>
<p></p>
### Vulnerability Details
<p>
In FasterXML jackson-databind before 2.14.0-rc1, resource exhaustion can occur because of a lack of a check in primitive value deserializers to avoid deep wrapper array nesting, when the UNWRAP_SINGLE_VALUE_ARRAYS feature is enabled. Additional fix versions are 2.13.4.1 and 2.12.7.1.
<p>Publish Date: 2022-10-02
<p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2022-42003>CVE-2022-42003</a></p>
</p>
<p></p>
### CVSS 3 Score Details (<b>7.5</b>)
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: None
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
<p></p>
### Suggested Fix
<p>
<p>Type: Upgrade version</p>
<p>Release Date: 2022-10-02</p>
<p>Fix Resolution: com.fasterxml.jackson.core:jackson-databind:2.12.7.1,2.13.4.1</p>
</p>
<p></p>
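Since the report notes the flaw only applies when `UNWRAP_SINGLE_VALUE_ARRAYS` is enabled (it is off by default), a quick audit of the mapper configuration can establish exposure before the upgrade lands. A minimal sketch; the mapper here is a stand-in for whichever instances the application actually constructs:
```
import com.fasterxml.jackson.databind.DeserializationFeature;
import com.fasterxml.jackson.databind.ObjectMapper;

public class UnwrapAudit {
    public static void main(String[] args) {
        ObjectMapper mapper = new ObjectMapper(); // stand-in for the app's mapper
        boolean exposed = mapper.isEnabled(
                DeserializationFeature.UNWRAP_SINGLE_VALUE_ARRAYS);
        System.out.println("UNWRAP_SINGLE_VALUE_ARRAYS enabled: " + exposed);
        // Disabling the feature removes the vulnerable code path entirely,
        // at the cost of no longer unwrapping [value] into value.
        mapper.disable(DeserializationFeature.UNWRAP_SINGLE_VALUE_ARRAYS);
    }
}
```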
</details><details>
<summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png?' width=19 height=20> <img src='https://whitesource-resources.whitesourcesoftware.com/viaRed.png' width=19 height=20> CVE-2020-36518</summary>
### Vulnerable Library - <b>jackson-databind-2.10.5.1.jar</b></p>
<p>General data-binding functionality for Jackson: works on core streaming API</p>
<p>Library home page: <a href="http://github.com/FasterXML/jackson">http://github.com/FasterXML/jackson</a></p>
<p>Path to dependency file: /legend-depot-artifacts-purge/pom.xml</p>
<p>Path to vulnerable library: /home/wss-scanner/.m2/repository/com/fasterxml/jackson/core/jackson-databind/2.10.5.1/jackson-databind-2.10.5.1.jar</p>
<p>
Dependency Hierarchy:
- legend-depot-store-mongo-1.7.6-SNAPSHOT.jar (Root Library)
- legend-depot-model-1.7.6-SNAPSHOT.jar
- legend-engine-protocol-pure-4.4.5.jar
- legend-engine-protocol-4.4.5.jar
- :x: **jackson-databind-2.10.5.1.jar** (Vulnerable Library)
<p>Found in base branch: <b>master</b></p>
</p>
<p></p>
### Reachability Analysis
<p>
This vulnerability is potentially reachable via the call chain below:
```
org.finos.legend.depot.store.mongo.admin.metrics.QueryMetricsMongo (Application)
-> com.fasterxml.jackson.databind.ObjectMapper (Extension)
-> com.fasterxml.jackson.databind.deser.BeanDeserializerFactory (Extension)
-> com.fasterxml.jackson.databind.deser.BasicDeserializerFactory (Extension)
-> com.fasterxml.jackson.databind.deser.std.UntypedObjectDeserializer (Extension)
-> ❌ com.fasterxml.jackson.databind.deser.std.UntypedObjectDeserializer$Vanilla (Vulnerable Component)
```
</p>
<p></p>
### Vulnerability Details
<p>
jackson-databind before 2.13.0 allows a Java StackOverflow exception and denial of service via a large depth of nested objects.
Mend Note: After conducting further research, Mend has determined that all versions of com.fasterxml.jackson.core:jackson-databind up to version 2.13.2 are vulnerable to CVE-2020-36518.
<p>Publish Date: 2022-03-11
<p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2020-36518>CVE-2020-36518</a></p>
</p>
<p></p>
### CVSS 3 Score Details (<b>7.5</b>)
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: None
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
<p></p>
### Suggested Fix
<p>
<p>Type: Upgrade version</p>
<p>Release Date: 2022-03-11</p>
<p>Fix Resolution: com.fasterxml.jackson.core:jackson-databind:2.12.6.1,2.13.2.1</p>
</p>
<p></p>
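For context, the failure mode is a plain `StackOverflowError` from recursive descent over nested objects. A minimal sketch of the input shape on a pre-fix version; the depth constant is an arbitrary assumption, since the real overflow point depends on JVM stack size (`-Xss`):
```
import com.fasterxml.jackson.databind.ObjectMapper;

public class DeepNestingDemo {
    public static void main(String[] args) throws Exception {
        int depth = 100_000; // assumed; actual threshold varies with stack size
        StringBuilder sb = new StringBuilder();
        for (int i = 0; i < depth; i++) sb.append("{\"a\":");
        sb.append('0');
        for (int i = 0; i < depth; i++) sb.append('}');

        try {
            new ObjectMapper().readTree(sb.toString());
        } catch (StackOverflowError e) {
            // Pre-fix versions die here: denial of service on the parsing thread.
            System.out.println("stack overflow at nesting depth " + depth);
        }
    }
}
```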
</details><details>
<summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png?' width=19 height=20> <img src='https://whitesource-resources.whitesourcesoftware.com/viaRed.png' width=19 height=20> CVE-2021-46877</summary>
### Vulnerable Library - <b>jackson-databind-2.10.5.1.jar</b></p>
<p>General data-binding functionality for Jackson: works on core streaming API</p>
<p>Library home page: <a href="http://github.com/FasterXML/jackson">http://github.com/FasterXML/jackson</a></p>
<p>Path to dependency file: /legend-depot-artifacts-purge/pom.xml</p>
<p>Path to vulnerable library: /home/wss-scanner/.m2/repository/com/fasterxml/jackson/core/jackson-databind/2.10.5.1/jackson-databind-2.10.5.1.jar</p>
<p>
Dependency Hierarchy:
- legend-depot-store-mongo-1.7.6-SNAPSHOT.jar (Root Library)
- legend-depot-model-1.7.6-SNAPSHOT.jar
- legend-engine-protocol-pure-4.4.5.jar
- legend-engine-protocol-4.4.5.jar
- :x: **jackson-databind-2.10.5.1.jar** (Vulnerable Library)
<p>Found in base branch: <b>master</b></p>
</p>
<p></p>
### Reachability Analysis
<p>
This vulnerability is potentially reachable via the call chain below:
```
org.finos.legend.depot.server.pure.model.context.services.PureModelContextServiceImpl (Application)
-> com.fasterxml.jackson.databind.json.JsonMapper (Extension)
-> com.fasterxml.jackson.databind.node.ArrayNode (Extension)
-> com.fasterxml.jackson.databind.node.BaseJsonNode (Extension)
-> ❌ com.fasterxml.jackson.databind.node.NodeSerialization (Vulnerable Component)
```
</p>
<p></p>
### Vulnerability Details
<p>
jackson-databind 2.10.x through 2.12.x before 2.12.6 and 2.13.x before 2.13.1 allows attackers to cause a denial of service (2 GB transient heap usage per read) in uncommon situations involving JsonNode JDK serialization.
<p>Publish Date: 2023-03-18
<p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2021-46877>CVE-2021-46877</a></p>
</p>
<p></p>
### CVSS 3 Score Details (<b>7.5</b>)
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: None
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
<p></p>
### Suggested Fix
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://www.cve.org/CVERecord?id=CVE-2021-46877">https://www.cve.org/CVERecord?id=CVE-2021-46877</a></p>
<p>Release Date: 2023-03-18</p>
<p>Fix Resolution: com.fasterxml.jackson.core:jackson-databind:2.12.6,2.13.1</p>
</p>
<p></p>
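The vulnerable path is JDK serialization of `JsonNode` values (the `NodeSerialization` component in the chain above), not ordinary JSON writing. A minimal sketch contrasting the two patterns; it assumes nothing about the application beyond its use of `ObjectMapper`:
```
import com.fasterxml.jackson.databind.JsonNode;
import com.fasterxml.jackson.databind.ObjectMapper;

import java.io.ByteArrayOutputStream;
import java.io.ObjectOutputStream;

public class NodeSerializationDemo {
    public static void main(String[] args) throws Exception {
        ObjectMapper mapper = new ObjectMapper();
        JsonNode node = mapper.readTree("{\"k\":1}");

        // Pattern flagged by the CVE: Java serialization of JsonNode
        // (BaseJsonNode is Serializable via NodeSerialization).
        try (ObjectOutputStream out =
                     new ObjectOutputStream(new ByteArrayOutputStream())) {
            out.writeObject(node);
        }

        // Safer alternative: persist the JSON text and re-parse on read.
        byte[] payload = mapper.writeValueAsBytes(node);
        JsonNode roundTripped = mapper.readTree(payload);
        System.out.println(roundTripped);
    }
}
```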
</details>
|
True
|
legend-depot-store-mongo-1.7.6-SNAPSHOT.jar: 4 vulnerabilities (highest severity is: 7.5) reachable - <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>legend-depot-store-mongo-1.7.6-SNAPSHOT.jar</b></p></summary>
<p></p>
<p>Path to vulnerable library: /home/wss-scanner/.m2/repository/com/fasterxml/jackson/core/jackson-databind/2.10.5.1/jackson-databind-2.10.5.1.jar</p>
<p>
</details>
## Vulnerabilities
| CVE | Severity | <img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS | Dependency | Type | Fixed in (legend-depot-store-mongo version) | Remediation Possible** | Reachability |
| ------------- | ------------- | ----- | ----- | ----- | ------------- | --- | --- |
| [CVE-2022-42004](https://www.mend.io/vulnerability-database/CVE-2022-42004) | <img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png?' width=19 height=20> High | 7.5 | jackson-databind-2.10.5.1.jar | Transitive | N/A* | ❌ |
| [CVE-2022-42003](https://www.mend.io/vulnerability-database/CVE-2022-42003) | <img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png?' width=19 height=20> High | 7.5 | jackson-databind-2.10.5.1.jar | Transitive | N/A* | ❌|<p align="center"><a href="#">[<img src='https://whitesource-resources.whitesourcesoftware.com/viaRed.png' width=19 height=20>](## 'The vulnerability is likely to be reachable.')</a></p> |
| [CVE-2020-36518](https://www.mend.io/vulnerability-database/CVE-2020-36518) | <img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png?' width=19 height=20> High | 7.5 | jackson-databind-2.10.5.1.jar | Transitive | N/A* | ❌|<p align="center"><a href="#">[<img src='https://whitesource-resources.whitesourcesoftware.com/viaRed.png' width=19 height=20>](## 'The vulnerability is likely to be reachable.')</a></p> |
| [CVE-2021-46877](https://www.mend.io/vulnerability-database/CVE-2021-46877) | <img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png?' width=19 height=20> High | 7.5 | jackson-databind-2.10.5.1.jar | Transitive | N/A* | ❌|<p align="center"><a href="#">[<img src='https://whitesource-resources.whitesourcesoftware.com/viaRed.png' width=19 height=20>](## 'The vulnerability is likely to be reachable.')</a></p> |
<p>*For some transitive vulnerabilities, there is no version of the direct dependency with a fix. Check the "Details" section below to see if there is a version of the transitive dependency where the vulnerability is fixed.</p><p>**In some cases, a remediation PR cannot be created automatically for a vulnerability despite the availability of a remediation.</p>
## Details
<details>
<summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png?' width=19 height=20> CVE-2022-42004</summary>
### Vulnerable Library - <b>jackson-databind-2.10.5.1.jar</b></p>
<p>General data-binding functionality for Jackson: works on core streaming API</p>
<p>Library home page: <a href="http://github.com/FasterXML/jackson">http://github.com/FasterXML/jackson</a></p>
<p>Path to dependency file: /legend-depot-artifacts-purge/pom.xml</p>
<p>Path to vulnerable library: /home/wss-scanner/.m2/repository/com/fasterxml/jackson/core/jackson-databind/2.10.5.1/jackson-databind-2.10.5.1.jar</p>
<p>
Dependency Hierarchy:
- legend-depot-store-mongo-1.7.6-SNAPSHOT.jar (Root Library)
- legend-depot-model-1.7.6-SNAPSHOT.jar
- legend-engine-protocol-pure-4.4.5.jar
- legend-engine-protocol-4.4.5.jar
- :x: **jackson-databind-2.10.5.1.jar** (Vulnerable Library)
<p>Found in base branch: <b>master</b></p>
</p>
<p></p>
### Vulnerability Details
<p>
In FasterXML jackson-databind before 2.13.4, resource exhaustion can occur because of a lack of a check in BeanDeserializer._deserializeFromArray to prevent use of deeply nested arrays. An application is vulnerable only with certain customized choices for deserialization.
<p>Publish Date: 2022-10-02
<p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2022-42004>CVE-2022-42004</a></p>
</p>
<p></p>
### CVSS 3 Score Details (<b>7.5</b>)
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: None
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
<p></p>
### Suggested Fix
<p>
<p>Type: Upgrade version</p>
<p>Release Date: 2022-10-02</p>
<p>Fix Resolution: com.fasterxml.jackson.core:jackson-databind:2.13.4</p>
</p>
<p></p>
</details><details>
<summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png?' width=19 height=20> <img src='https://whitesource-resources.whitesourcesoftware.com/viaRed.png' width=19 height=20> CVE-2022-42003</summary>
### Vulnerable Library - <b>jackson-databind-2.10.5.1.jar</b></p>
<p>General data-binding functionality for Jackson: works on core streaming API</p>
<p>Library home page: <a href="http://github.com/FasterXML/jackson">http://github.com/FasterXML/jackson</a></p>
<p>Path to dependency file: /legend-depot-artifacts-purge/pom.xml</p>
<p>Path to vulnerable library: /home/wss-scanner/.m2/repository/com/fasterxml/jackson/core/jackson-databind/2.10.5.1/jackson-databind-2.10.5.1.jar</p>
<p>
Dependency Hierarchy:
- legend-depot-store-mongo-1.7.6-SNAPSHOT.jar (Root Library)
- legend-depot-model-1.7.6-SNAPSHOT.jar
- legend-engine-protocol-pure-4.4.5.jar
- legend-engine-protocol-4.4.5.jar
- :x: **jackson-databind-2.10.5.1.jar** (Vulnerable Library)
<p>Found in base branch: <b>master</b></p>
</p>
<p></p>
### Reachability Analysis
<p>
This vulnerability is potentially reachable via the call chain below:
```
org.finos.legend.depot.store.mongo.admin.metrics.QueryMetricsMongo (Application)
-> com.fasterxml.jackson.databind.ObjectMapper (Extension)
-> com.fasterxml.jackson.databind.deser.BeanDeserializerFactory (Extension)
-> com.fasterxml.jackson.databind.deser.std.ThrowableDeserializer (Extension)
-> com.fasterxml.jackson.databind.deser.BeanDeserializerBase (Extension)
-> ❌ com.fasterxml.jackson.databind.deser.std.StdDeserializer (Vulnerable Component)
```
</p>
<p></p>
### Vulnerability Details
<p>
In FasterXML jackson-databind before 2.14.0-rc1, resource exhaustion can occur because of a lack of a check in primitive value deserializers to avoid deep wrapper array nesting, when the UNWRAP_SINGLE_VALUE_ARRAYS feature is enabled. Additional fix versions are 2.13.4.1 and 2.12.7.1.
<p>Publish Date: 2022-10-02
<p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2022-42003>CVE-2022-42003</a></p>
</p>
<p></p>
### CVSS 3 Score Details (<b>7.5</b>)
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: None
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
<p></p>
### Suggested Fix
<p>
<p>Type: Upgrade version</p>
<p>Release Date: 2022-10-02</p>
<p>Fix Resolution: com.fasterxml.jackson.core:jackson-databind:2.12.7.1,2.13.4.1</p>
</p>
<p></p>
</details><details>
<summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png?' width=19 height=20> <img src='https://whitesource-resources.whitesourcesoftware.com/viaRed.png' width=19 height=20> CVE-2020-36518</summary>
### Vulnerable Library - <b>jackson-databind-2.10.5.1.jar</b></p>
<p>General data-binding functionality for Jackson: works on core streaming API</p>
<p>Library home page: <a href="http://github.com/FasterXML/jackson">http://github.com/FasterXML/jackson</a></p>
<p>Path to dependency file: /legend-depot-artifacts-purge/pom.xml</p>
<p>Path to vulnerable library: /home/wss-scanner/.m2/repository/com/fasterxml/jackson/core/jackson-databind/2.10.5.1/jackson-databind-2.10.5.1.jar</p>
<p>
Dependency Hierarchy:
- legend-depot-store-mongo-1.7.6-SNAPSHOT.jar (Root Library)
- legend-depot-model-1.7.6-SNAPSHOT.jar
- legend-engine-protocol-pure-4.4.5.jar
- legend-engine-protocol-4.4.5.jar
- :x: **jackson-databind-2.10.5.1.jar** (Vulnerable Library)
<p>Found in base branch: <b>master</b></p>
</p>
<p></p>
### Reachability Analysis
<p>
This vulnerability is potentially reachable via the call chain below:
```
org.finos.legend.depot.store.mongo.admin.metrics.QueryMetricsMongo (Application)
-> com.fasterxml.jackson.databind.ObjectMapper (Extension)
-> com.fasterxml.jackson.databind.deser.BeanDeserializerFactory (Extension)
-> com.fasterxml.jackson.databind.deser.BasicDeserializerFactory (Extension)
-> com.fasterxml.jackson.databind.deser.std.UntypedObjectDeserializer (Extension)
-> ❌ com.fasterxml.jackson.databind.deser.std.UntypedObjectDeserializer$Vanilla (Vulnerable Component)
```
</p>
<p></p>
### Vulnerability Details
<p>
jackson-databind before 2.13.0 allows a Java StackOverflow exception and denial of service via a large depth of nested objects.
Mend Note: After conducting further research, Mend has determined that all versions of com.fasterxml.jackson.core:jackson-databind up to version 2.13.2 are vulnerable to CVE-2020-36518.
<p>Publish Date: 2022-03-11
<p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2020-36518>CVE-2020-36518</a></p>
</p>
<p></p>
### CVSS 3 Score Details (<b>7.5</b>)
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: None
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
<p></p>
### Suggested Fix
<p>
<p>Type: Upgrade version</p>
<p>Release Date: 2022-03-11</p>
<p>Fix Resolution: com.fasterxml.jackson.core:jackson-databind:2.12.6.1,2.13.2.1</p>
</p>
<p></p>
</details><details>
<summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png?' width=19 height=20> <img src='https://whitesource-resources.whitesourcesoftware.com/viaRed.png' width=19 height=20> CVE-2021-46877</summary>
### Vulnerable Library - <b>jackson-databind-2.10.5.1.jar</b></p>
<p>General data-binding functionality for Jackson: works on core streaming API</p>
<p>Library home page: <a href="http://github.com/FasterXML/jackson">http://github.com/FasterXML/jackson</a></p>
<p>Path to dependency file: /legend-depot-artifacts-purge/pom.xml</p>
<p>Path to vulnerable library: /home/wss-scanner/.m2/repository/com/fasterxml/jackson/core/jackson-databind/2.10.5.1/jackson-databind-2.10.5.1.jar</p>
<p>
Dependency Hierarchy:
- legend-depot-store-mongo-1.7.6-SNAPSHOT.jar (Root Library)
- legend-depot-model-1.7.6-SNAPSHOT.jar
- legend-engine-protocol-pure-4.4.5.jar
- legend-engine-protocol-4.4.5.jar
- :x: **jackson-databind-2.10.5.1.jar** (Vulnerable Library)
<p>Found in base branch: <b>master</b></p>
</p>
<p></p>
### Reachability Analysis
<p>
This vulnerability is potentially used
```
org.finos.legend.depot.server.pure.model.context.services.PureModelContextServiceImpl (Application)
-> com.fasterxml.jackson.databind.json.JsonMapper (Extension)
-> com.fasterxml.jackson.databind.node.ArrayNode (Extension)
-> com.fasterxml.jackson.databind.node.BaseJsonNode (Extension)
-> ❌ com.fasterxml.jackson.databind.node.NodeSerialization (Vulnerable Component)
```
</p>
<p></p>
### Vulnerability Details
<p>
jackson-databind 2.10.x through 2.12.x before 2.12.6 and 2.13.x before 2.13.1 allows attackers to cause a denial of service (2 GB transient heap usage per read) in uncommon situations involving JsonNode JDK serialization.
<p>Publish Date: 2023-03-18
<p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2021-46877>CVE-2021-46877</a></p>
</p>
<p></p>
### CVSS 3 Score Details (<b>7.5</b>)
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: None
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
<p></p>
### Suggested Fix
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://www.cve.org/CVERecord?id=CVE-2021-46877">https://www.cve.org/CVERecord?id=CVE-2021-46877</a></p>
<p>Release Date: 2023-03-18</p>
<p>Fix Resolution: com.fasterxml.jackson.core:jackson-databind:2.12.6,2.13.1</p>
</p>
<p></p>
</details>
|
non_process
|
legend depot store mongo snapshot jar vulnerabilities highest severity is reachable vulnerable library legend depot store mongo snapshot jar path to vulnerable library home wss scanner repository com fasterxml jackson core jackson databind jackson databind jar vulnerabilities cve severity cvss dependency type fixed in legend depot store mongo version remediation possible reachability high jackson databind jar transitive n a high jackson databind jar transitive n a the vulnerability is likely to be reachable high jackson databind jar transitive n a the vulnerability is likely to be reachable high jackson databind jar transitive n a the vulnerability is likely to be reachable for some transitive vulnerabilities there is no version of direct dependency with a fix check the details section below to see if there is a version of transitive dependency where vulnerability is fixed in some cases remediation pr cannot be created automatically for a vulnerability despite the availability of remediation details cve vulnerable library jackson databind jar general data binding functionality for jackson works on core streaming api library home page a href path to dependency file legend depot artifacts purge pom xml path to vulnerable library home wss scanner repository com fasterxml jackson core jackson databind jackson databind jar dependency hierarchy legend depot store mongo snapshot jar root library legend depot model snapshot jar legend engine protocol pure jar legend engine protocol jar x jackson databind jar vulnerable library found in base branch master vulnerability details in fasterxml jackson databind before resource exhaustion can occur because of a lack of a check in beandeserializer deserializefromarray to prevent use of deeply nested arrays an application is vulnerable only with certain customized choices for deserialization publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction none scope unchanged impact metrics confidentiality impact none integrity impact none availability impact high for more information on scores click a href suggested fix type upgrade version release date fix resolution com fasterxml jackson core jackson databind cve vulnerable library jackson databind jar general data binding functionality for jackson works on core streaming api library home page a href path to dependency file legend depot artifacts purge pom xml path to vulnerable library home wss scanner repository com fasterxml jackson core jackson databind jackson databind jar dependency hierarchy legend depot store mongo snapshot jar root library legend depot model snapshot jar legend engine protocol pure jar legend engine protocol jar x jackson databind jar vulnerable library found in base branch master reachability analysis this vulnerability is potentially used org finos legend depot store mongo admin metrics querymetricsmongo application com fasterxml jackson databind objectmapper extension com fasterxml jackson databind deser beandeserializerfactory extension com fasterxml jackson databind deser std throwabledeserializer extension com fasterxml jackson databind deser beandeserializerbase extension ❌ com fasterxml jackson databind deser std stddeserializer vulnerable component vulnerability details in fasterxml jackson databind before resource exhaustion can occur because of a lack of a check in primitive value deserializers to avoid deep wrapper array nesting when the unwrap single value 
arrays feature is enabled additional fix version in and publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction none scope unchanged impact metrics confidentiality impact none integrity impact none availability impact high for more information on scores click a href suggested fix type upgrade version release date fix resolution com fasterxml jackson core jackson databind cve vulnerable library jackson databind jar general data binding functionality for jackson works on core streaming api library home page a href path to dependency file legend depot artifacts purge pom xml path to vulnerable library home wss scanner repository com fasterxml jackson core jackson databind jackson databind jar dependency hierarchy legend depot store mongo snapshot jar root library legend depot model snapshot jar legend engine protocol pure jar legend engine protocol jar x jackson databind jar vulnerable library found in base branch master reachability analysis this vulnerability is potentially used org finos legend depot store mongo admin metrics querymetricsmongo application com fasterxml jackson databind objectmapper extension com fasterxml jackson databind deser beandeserializerfactory extension com fasterxml jackson databind deser basicdeserializerfactory extension com fasterxml jackson databind deser std untypedobjectdeserializer extension ❌ com fasterxml jackson databind deser std untypedobjectdeserializer vanilla vulnerable component vulnerability details jackson databind before allows a java stackoverflow exception and denial of service via a large depth of nested objects mend note after conducting further research mend has determined that all versions of com fasterxml jackson core jackson databind up to version are vulnerable to cve publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction none scope unchanged impact metrics confidentiality impact none integrity impact none availability impact high for more information on scores click a href suggested fix type upgrade version release date fix resolution com fasterxml jackson core jackson databind cve vulnerable library jackson databind jar general data binding functionality for jackson works on core streaming api library home page a href path to dependency file legend depot artifacts purge pom xml path to vulnerable library home wss scanner repository com fasterxml jackson core jackson databind jackson databind jar dependency hierarchy legend depot store mongo snapshot jar root library legend depot model snapshot jar legend engine protocol pure jar legend engine protocol jar x jackson databind jar vulnerable library found in base branch master reachability analysis this vulnerability is potentially used org finos legend depot server pure model context services puremodelcontextserviceimpl application com fasterxml jackson databind json jsonmapper extension com fasterxml jackson databind node arraynode extension com fasterxml jackson databind node basejsonnode extension ❌ com fasterxml jackson databind node nodeserialization vulnerable component vulnerability details jackson databind x through x before and x before allows attackers to cause a denial of service gb transient heap usage per read in uncommon situations involving jsonnode jdk serialization publish date url a href cvss score details base score metrics exploitability 
metrics attack vector network attack complexity low privileges required none user interaction none scope unchanged impact metrics confidentiality impact none integrity impact none availability impact high for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution com fasterxml jackson core jackson databind
| 0
|
70,282
| 3,321,973,250
|
IssuesEvent
|
2015-11-09 12:02:27
|
cs2103aug2015-t14-2j/main
|
https://api.github.com/repos/cs2103aug2015-t14-2j/main
|
closed
|
Keyboard should autofocus on the textinput when GUI keystroke is pressed
|
priority.low type.enhancement type.story
|
Currently, users have to click on the text input after LSHIFT and RSHIFT are pressed before they can type.
This request is from a user-friendliness perspective.
|
1.0
|
Keyboard should autofocus on the textinput when GUI keystroke is pressed - Currently, users have to click on the text input after LSHIFT and RSHIFT are pressed before they can type.
This request is from a user-friendliness perspective.
|
non_process
|
keyboard should autofocus on the textinput when gui keystroke is pressed currently users have to click on the text input after lshift and rshift are pressed before they can type this request is from a user friendliness perspective
| 0
|
77,497
| 9,582,989,709
|
IssuesEvent
|
2019-05-08 03:14:38
|
itmo-cet-sem/mmorts
|
https://api.github.com/repos/itmo-cet-sem/mmorts
|
opened
|
Define client-server protocol using JSON Schema
|
design
|
Website: https://json-schema.org
Why should we?
1) It will standardize the protocol.
2) We could use code generators to simplify the process of implementing encoders/decoders on both client and server sides. For example, this generator supports both Python and C#: https://app.quicktype.io.
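A minimal sketch of the validation half of this idea, assuming a Java consumer and the networknt `json-schema-validator` library; neither assumption comes from this issue, and in practice quicktype-generated types would replace the hand-written schema string:
```
import com.fasterxml.jackson.databind.ObjectMapper;
import com.networknt.schema.JsonSchema;
import com.networknt.schema.JsonSchemaFactory;
import com.networknt.schema.SpecVersion;
import com.networknt.schema.ValidationMessage;

import java.util.Set;

public class ProtocolCheck {
    public static void main(String[] args) throws Exception {
        // Hypothetical message schema; the real protocol would define many.
        String schemaJson = "{\"type\":\"object\","
                + "\"required\":[\"cmd\"],"
                + "\"properties\":{\"cmd\":{\"type\":\"string\"}}}";

        JsonSchema schema = JsonSchemaFactory
                .getInstance(SpecVersion.VersionFlag.V7)
                .getSchema(schemaJson);

        Set<ValidationMessage> errors =
                schema.validate(new ObjectMapper().readTree("{\"cmd\":\"move\"}"));
        System.out.println(errors.isEmpty() ? "valid" : errors);
    }
}
```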
|
1.0
|
Define client-server protocol using JSON Schema - Website: https://json-schema.org
Why should we?
1) It will standardize the protocol.
2) We could use code generators to simplify the process of implementing encoders/decoders on both client and server sides. For example, this generator supports both Python and C#: https://app.quicktype.io.
|
non_process
|
define client server protocol using json schema website why should we it will standardize the protocol we could use code generators to simplify the process of implementing encoders decoders on both client and server sides for example this generator supports both python and c
| 0
|
63,529
| 3,196,964,181
|
IssuesEvent
|
2015-10-01 00:04:08
|
fusioninventory/fusioninventory-for-glpi
|
https://api.github.com/repos/fusioninventory/fusioninventory-for-glpi
|
closed
|
sql error on location and collect rule
|
Category: Rules Component: For junior contributor Component: Found in version Priority: Normal Status: Resolved Tracker: Bug
|
---
Author Name: **David Durieux** (@ddurieux)
Original Redmine Issue: 2825, http://forge.fusioninventory.org/issues/2825
Original Date: 2014-11-28
Original Assignee: David Durieux
---
None
|
1.0
|
sql error on location and collect rule - ---
Author Name: **David Durieux** (@ddurieux)
Original Redmine Issue: 2825, http://forge.fusioninventory.org/issues/2825
Original Date: 2014-11-28
Original Assignee: David Durieux
---
None
|
non_process
|
sql error on location and collect rule author name david durieux ddurieux original redmine issue original date original assignee david durieux none
| 0
|
199,768
| 15,058,305,746
|
IssuesEvent
|
2021-02-03 23:11:56
|
projectcontour/contour
|
https://api.github.com/repos/projectcontour/contour
|
opened
|
come up with a plan for test improvements
|
area/testing lifecycle/needs-triage
|
We've talked about various ways that Contour's testing could improve. This is a meta-issue to do some R&D and come up with a proposal for a concrete set of improvements to make.
|
1.0
|
come up with a plan for test improvements - We've talked about various ways that Contour's testing could improve. This is a meta-issue to do some R&D and come up with a proposal for a concrete set of improvements to make.
|
non_process
|
come up with a plan for test improvements we ve talked about various ways that contour s testing could improve this is a meta issue to do some r d and come up with a proposal for a concrete set of improvements to make
| 0
|
20,532
| 27,190,076,053
|
IssuesEvent
|
2023-02-19 17:42:38
|
darktable-org/darktable
|
https://api.github.com/repos/darktable-org/darktable
|
opened
|
rework the final-scale and make it possible to have a sharpen after downscaling
|
feature: enhancement priority: high scope: image processing scope: color management release notes: pending
|
For the final-scale there is discussion to happen here.
For the final sharpen there is also discussion to happen here.
I had this proposal:
- Introduce a new hidden module named finalsharpen
- Make this module disabled by default
- This new module has a single parameter strength to allow for setting the amount of sharpening (0 = None)
- A new slider "final sharpening" is introduced into the export module, ranging from 0 (the default) to 100%
- When this slider is > 0 during export the finalsharpen module is enabled
- The actual strength is passed to it using a config key
This issue is to track down the discussion for the implementation of those two issues. Please keep this discussion focused.
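darktable itself is written in C, so purely as a language-neutral illustration of the proposed handoff (every name below is hypothetical): the export path writes the slider value under a config key, and the hidden module counts as enabled only when it reads back a non-zero strength.
```
import java.util.HashMap;
import java.util.Map;

public class FinalSharpenSketch {
    // Stand-in for darktable's config store; the key name is hypothetical.
    static final Map<String, String> CONF = new HashMap<>();

    static void onExport(int sliderPercent) {
        CONF.put("plugins/finalsharpen/strength", Integer.toString(sliderPercent));
    }

    static boolean finalSharpenEnabled() {
        int strength = Integer.parseInt(
                CONF.getOrDefault("plugins/finalsharpen/strength", "0"));
        return strength > 0; // module stays disabled while the slider is at 0
    }

    public static void main(String[] args) {
        onExport(35);
        System.out.println("finalsharpen enabled: " + finalSharpenEnabled());
    }
}
```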
|
1.0
|
rework the final-scale and make it possible to have a sharpen after downscaling - For the final-scale there is discussion to happen here.
For the final sharpen there is also discussion to happen here.
I had this proposal:
- Introduce a new hidden module named finalsharpen
- Make this module disabled by default
- This new module has a single parameter strength to allow for setting the amount of sharpening (0 = None)
- A new slider "final sharpening" is introduced into the export module, ranging from 0 (the default) to 100%
- When this slider is > 0 during export the finalsharpen module is enabled
- The actual strength is passed to it using a config key
This issue is to track down the discussion for the implementation of those two issues. Please keep this discussion focused.
|
process
|
rework the final scale and make it possible to have a sharpen after downscaling for the final scale there is discussion to happen here for the final sharpen there is also discussion to happen here i had this proposal introduce a new hidden module named finalsharpen make this module disabled by default this new module has a single parameter strength to allow for setting the amount of sharpening none a new slider final sharpening is introduced into the export module by default up to when this slider is during export the finalsharpen module is enabled the actual strength is passed to it using config key this issue is to track down the discussion for the implementation of those two issues please keep this discussion focused
| 1
|
14,831
| 18,168,771,484
|
IssuesEvent
|
2021-09-27 17:22:36
|
2i2c-org/team-compass
|
https://api.github.com/repos/2i2c-org/team-compass
|
reopened
|
Re-purpose our `prio` labels to be `impact` instead
|
:label: team-process type: task waiting
|
### Description
I propose that we remove the `prio:` GitHub labels and instead replace them with labels based on **impact** (so, `impact: low / med / high`).
We currently use a few GitHub labels to encode "priority": `prio:low / med / high`. I have found that these labels are under-specified. Priority is very context- and person-dependent, and also changes fairly frequently (e.g. what was low-priority yesterday may be high-priority tomorrow). Moreover, we currently have two different ways to encode priority: one is with the label, and the other is via an issue's placement in our development backlog / ordering on a column.
Instead I think we should use "impact" and define the meaning of this label specific to an issue's functional area. For example:
- the impact of an infrastructure feature might be defined by the number of users that would benefit from it (all users == high, ~half users == med, ~quarter users == low) OR particularly important users could trigger "high" as well.
- the impact of a bug might be defined by whether functionality is critically impaired or just cosmetic, or whether it is complex to reproduce or is reproduced by everyone
- the impact of an administrative task might be defined by whether it is required to get some essential work done, or just general housekeeping
There would still be a degree of subjectivity and judgment there, but I think this is still more concrete than "priority". In addition, the person that applies an "impact" label should ensure that the value/benefit is described in the issue well enough to justify the label.
### Value / benefit
If we instead used "impact" labels, I believe that it would be easier to have a concrete definition for each issue that would be more stable over time. Moreover, impact maps relatively nicely onto our "value/benefit/user story" practice in describing our team issues. Impact would be an important deciding factor in prioritization (in general, we want to work on things that are high-impact!) but it wouldn't **define** the priority, it would just be an important factor for it.
### Implementation details
I propose that we try this out on the `pilot-hubs/` repository for a month and see if it improves the signal-to-noise of the issues, or helps us triage and prioritize more easily.
If nobody objects, I'll plan to remove all `prio:` labels from pilot-hubs, and replace them with "impact" labels (and potentially re-assign labels to issues as needed).
### How to define "impact"?
Here is the definition we can use for impact across various kinds of categories:
- **Features / Enhancements**
- `impact: high`: Will be seen and commonly used by nearly all users OR has been requested by an abnormally large number of users OR is of particular importance to a really important user
- `impact: med`: Will be useful to many users but not an overwhelming amount. Most issues should be in this category.
- `impact: low`: Is useful but not a critical part of workflows OR is a niche use-case that only a few users may need.
- **Bugs**
- `impact: high`: Is disruptive to nearly all users, or critically disruptive to many users (e.g., sessions won't work at all)
- `impact: med`: Is disruptive to our users, but not in a critical way. Most issues should be in this category.
- `impact: low`: Is cosmetic or minimally disruptive, or only affects a small number of users.
- **Team process or admin**
- `impact: high`: Is a crucial component of our team process that will have across-the-board improvements in our practices
- `impact: med`: Is important but not a critical part of our workflow.
- `impact: low`: Needs to be done but is not going to critically affect our team practices any time soon.
### Tasks to complete
- [x] Nobody objects to trying this workflow in pilot-hubs/
- [x] Swap out `prio:` labels for `impact:` labels in `pilot-hubs/`
- [ ] Revisit this practice in October and decide if we should continue it
- [ ] Decide whether to apply it to other repositories
- [ ] Write up in the team compass
### Updates
_No response_
|
1.0
|
Re-purpose our `prio` labels to be `impact` instead - ### Description
I propose that we remove the `prio:` GitHub labels and instead replace them with labels based on **impact** (so, `impact: low / med / high`).
We currently use a few GitHub labels to encode "priority": `prio:low / med / high`. I have found that these labels are under-specified. Priority is very context- and person-dependent, and also changes fairly frequently (e.g. what was low-priority yesterday may be high-priority tomorrow). Moreover, we currently have two different ways to encode priority: one is with the label, and the other is via an issue's placement in our development backlog / ordering on a column.
Instead I think we should use "impact" and define the meaning of this label specific to an issue's functional area. For example:
- the impact of an infrastructure feature might be defined by the number of users that would benefit from it (all users == high, ~half users == med, ~quarter users == low) OR particularly important users could trigger "high" as well.
- the impact of a bug might be defined by whether functionality is critically impaired or just cosmetic, or whether it is complex to reproduce or is reproduced by everyone
- the impact of an administrative task might be defined by whether it is required to get some essential work done, or just general housekeeping
There would still be a degree of subjectivity and judgment there, but I think this is still more concrete than "priority". In addition, the person that applies an "impact" label should ensure that the value/benefit is described in the issue well enough to justify the label.
### Value / benefit
If we instead used "impact" labels, I believe that it would be easier to have a concrete definition for each issue that would be more stable over time. Moreover, impact maps relatively nicely onto our "value/benefit/user story" practice in describing our team issues. Impact would be an important deciding factor in prioritization (in general, we want to work on things that are high-impact!) but it wouldn't **define** the priority, it would just be an important factor for it.
### Implementation details
I propose that we try this out on the `pilot-hubs/` repository for a month and see if it improves the signal-to-noise of the issues, or helps us triage and prioritize more easily.
If nobody objects, I'll plan to remove all `prio:` labels from pilot-hubs, and replace them with "impact" labels (and potentially re-assign labels to issues as needed).
### How to define "impact"?
Here is the definition we can use for impact across various kinds of categories:
- **Features / Enhancements**
- `impact: high`: Will be seen and commonly used by nearly all users OR has been requested by an abnormally large number of users OR is of particular importance to a really important user
- `impact: med`: Will be useful to many users but not an overwhelming amount. Most issues should be in this category.
- `impact: low`: Is useful but not a critical part of workflows OR is a niche use-case that only a few users may need.
- **Bugs**
- `impact: high`: Is disruptive to nearly all users, or critically disruptive to many users (e.g., sessions won't work at all)
- `impact: med`: Is disruptive to our users, but not in a critical way. Most issues should be in this category.
- `impact: low`: Is cosmetic or minimally disruptive, or only affects a small number of users.
- **Team process or admin**
- `impact: high`: Is a crucial component of our team process that will have across-the-board improvements in our practices
- `impact: med`: Is important but not a critical part of our workflow.
- `impact: low`: Needs to be done but is not going to critically affect our team practices any time soon.
### Tasks to complete
- [x] Nobody objects to trying this workflow in pilot-hubs/
- [x] Swap out `prio:` labels for `impact:` labels in `pilot-hubs/`
- [ ] Revisit this practice in October and decide if we should continue it
- [ ] Decide whether to apply it to other repositories
- [ ] Write up in the team compass
### Updates
_No response_
|
process
|
re purpose our prio labels to be impact instead description i propose that we remove the prio github labels and instead replace them with labels based on impact so impact low med high we currently use a few github labels to encode priority prio low med high i have found that these labels are under specified priority is very context and person dependent and also changes fairly frequently e g what is low priority yesterday may be high priority tomorrow moreover we currently have two different ways to encode priority one is with the label and the other is via an issue s placement in our development backlog ordering on a column instead i think we should use impact and define the meaning of this label specific to an issue s functional area for example the impact of an infrastructure feature might be defined by the number of users that would benefit from it all users high half users med quarter users low or particularly important users could trigger high as well the impact of a bug might be defined by whether functionality is critically impaired or just cosmetic or whether it is complex to reproduce or is reproduced by everyone the impact of an administrative task might be defined by whether it is required to get some essential work done or just general housekeeping there would still be a degree of subjectivity and judgment there but i think this is still more concrete than priority in addition the person that applies an impact label should ensure that the value benefit is described in the issue well enough to justify the label value benefit if we instead used impact labels i believe that it would be easier to have a concrete definition for each issue that would be more stable over time moreover impact maps relatively nicely onto our value benefit user story practice in describing our team issues impact would be an important deciding factor in prioritization in general we want to work on things that are high impact but it wouldn t define the priority it would just be an important factor for it implementation details i propose that we try this out on the pilot hubs repository for a month and see if it improves the signal to noise of the issues or helps us triage and prioritize more easily if nobody objects i ll plan to remove all prio labels from pilot hubs and replace them with impact labels and potentially re assign labels to issues as needed how to define impact here is the definition we can use for impact across various kinds of categories features enhancements impact high will be seen and commonly used by nearly all users or has been requested by an abnormally large number of users or is of particular importance to a really important user impact med will be useful to many users but not an overwhelming amount most issues should be in this category impact low is useful but not a critical part of workflows or is a niche use case that only a few users may need bugs impact high is disruptive to nearly all users or critically disruptive to many users e g sessions won t work at all impact med is disruptive to our users but not in a critical way most issues should be in this category impact low is cosmetic or minimally disruptive or only affects a small number of users team process or admin impact high is a crucial component of our team process that will have across the board improvements in our practices impact med is important but not a critical part of our workflow impact low needs to be done but is not going to critically affect our team practices any time soon tasks to complete nobody objects 
to trying this workflow in pilot hubs swap out prio labels for impact labels in pilot hubs revisit this practice in october and decide if we should continue it decide whether to apply it to other repositories write up in the team compass updates no response
| 1
|
254,435
| 19,214,080,844
|
IssuesEvent
|
2021-12-07 07:22:56
|
ErnWong/crystalorb
|
https://api.github.com/repos/ErnWong/crystalorb
|
closed
|
Update readme with examples and related libraries
|
documentation
|
As a Rust game developer interested in netcode programming,
I want the readme to have updated links to other related rust netcode projects
so I get a better awareness of the community, and can better understand which project is most suitable for my next game.
---
- https://github.com/gschup/ggrs
- https://github.com/HouraiTeahouse/backroll-rs
- https://github.com/jamescarterbell/bevy_rollback
- https://github.com/arcana-engine/evoke
- https://github.com/smokku/soldank/tree/581b4f446b2cf5264f4c25f4cc2eaa1c0bfc192a/shared/src/orb
(from #5)
- https://github.com/vilcans/orbgame
(from #12)
- https://github.com/ErnWong/dango-tribute
- https://github.com/hastearcade/snowglobe
|
1.0
|
Update readme with examples and related libraries - As a Rust game developer interested in netcode programming,
I want the readme to have updated links to other related rust netcode projects
so I get a better awareness of the community, and can better understand which project is most suitable for my next game.
---
- https://github.com/gschup/ggrs
- https://github.com/HouraiTeahouse/backroll-rs
- https://github.com/jamescarterbell/bevy_rollback
- https://github.com/arcana-engine/evoke
- https://github.com/smokku/soldank/tree/581b4f446b2cf5264f4c25f4cc2eaa1c0bfc192a/shared/src/orb
(from #5)
- https://github.com/vilcans/orbgame
(from #12)
- https://github.com/ErnWong/dango-tribute
- https://github.com/hastearcade/snowglobe
|
non_process
|
update readme with examples and related libraries as a rust game developer interested in netcode programming i want the readme to have updated links to other related rust netcode projects so i get a better awareness of the community and can better understand which project is most suitable for my next game from from
| 0
|
94,353
| 15,962,365,514
|
IssuesEvent
|
2021-04-16 01:09:20
|
KaterinaOrg/my-bag-of-holding
|
https://api.github.com/repos/KaterinaOrg/my-bag-of-holding
|
opened
|
WS-2017-0247 (Low) detected in ms-0.7.1.tgz
|
security vulnerability
|
## WS-2017-0247 - Low Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>ms-0.7.1.tgz</b></p></summary>
<p>Tiny ms conversion utility</p>
<p>Library home page: <a href="https://registry.npmjs.org/ms/-/ms-0.7.1.tgz">https://registry.npmjs.org/ms/-/ms-0.7.1.tgz</a></p>
<p>Path to dependency file: my-bag-of-holding/package.json</p>
<p>Path to vulnerable library: my-bag-of-holding/node_modules/mocha/node_modules/ms/package.json</p>
<p>
Dependency Hierarchy:
- gulp-sass-1.3.3.tgz (Root Library)
- node-sass-2.1.1.tgz
- mocha-2.5.3.tgz
- debug-2.2.0.tgz
- :x: **ms-0.7.1.tgz** (Vulnerable Library)
<p>Found in base branch: <b>main</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/low_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
Affected versions of this package are vulnerable to Regular Expression Denial of Service (ReDoS).
<p>Publish Date: 2017-04-12
<p>URL: <a href=https://github.com/zeit/ms/commit/305f2ddcd4eff7cc7c518aca6bb2b2d2daad8fef>WS-2017-0247</a></p>
</p>
</details>
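For context on the failure mode, here is a generic catastrophic-backtracking illustration in Java; it is not the actual `ms` pattern, which lives in JavaScript, but the mechanism is the same: nested quantifiers force exponential backtracking on a near-miss input.
```
import java.util.regex.Pattern;

public class RedosDemo {
    public static void main(String[] args) {
        // Classic evil shape: nested quantifiers over the same characters.
        Pattern evil = Pattern.compile("^(a+)+$");
        String nearMiss = "a".repeat(28) + "b"; // fails only at the last char
        long start = System.nanoTime();
        boolean matched = evil.matcher(nearMiss).matches();
        // Runtime grows exponentially with input length; even ~28 'a's can
        // take noticeable time, and a few more effectively hang the thread.
        System.out.printf("matched=%b after %d ms%n", matched,
                (System.nanoTime() - start) / 1_000_000);
    }
}
```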
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 2 Score Details (<b>3.4</b>)</summary>
<p>
Base Score Metrics not available</p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://github.com/vercel/ms/pull/89">https://github.com/vercel/ms/pull/89</a></p>
<p>Release Date: 2017-04-12</p>
<p>Fix Resolution: 2.1.1</p>
</p>
</details>
<p></p>
<!-- <REMEDIATE>{"isOpenPROnVulnerability":true,"isPackageBased":true,"isDefaultBranch":true,"packages":[{"packageType":"javascript/Node.js","packageName":"ms","packageVersion":"0.7.1","packageFilePaths":["/package.json"],"isTransitiveDependency":true,"dependencyTree":"gulp-sass:1.3.3;node-sass:2.1.1;mocha:2.5.3;debug:2.2.0;ms:0.7.1","isMinimumFixVersionAvailable":true,"minimumFixVersion":"2.1.1"}],"baseBranches":["main"],"vulnerabilityIdentifier":"WS-2017-0247","vulnerabilityDetails":"Affected versions of this package are vulnerable to Regular Expression Denial of Service (ReDoS).","vulnerabilityUrl":"https://github.com/zeit/ms/commit/305f2ddcd4eff7cc7c518aca6bb2b2d2daad8fef","cvss2Severity":"low","cvss2Score":"3.4","extraData":{}}</REMEDIATE> -->
|
True
|
WS-2017-0247 (Low) detected in ms-0.7.1.tgz - ## WS-2017-0247 - Low Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>ms-0.7.1.tgz</b></p></summary>
<p>Tiny ms conversion utility</p>
<p>Library home page: <a href="https://registry.npmjs.org/ms/-/ms-0.7.1.tgz">https://registry.npmjs.org/ms/-/ms-0.7.1.tgz</a></p>
<p>Path to dependency file: my-bag-of-holding/package.json</p>
<p>Path to vulnerable library: my-bag-of-holding/node_modules/mocha/node_modules/ms/package.json</p>
<p>
Dependency Hierarchy:
- gulp-sass-1.3.3.tgz (Root Library)
- node-sass-2.1.1.tgz
- mocha-2.5.3.tgz
- debug-2.2.0.tgz
- :x: **ms-0.7.1.tgz** (Vulnerable Library)
<p>Found in base branch: <b>main</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/low_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
Affected versions of this package are vulnerable to Regular Expression Denial of Service (ReDoS).
<p>Publish Date: 2017-04-12
<p>URL: <a href=https://github.com/zeit/ms/commit/305f2ddcd4eff7cc7c518aca6bb2b2d2daad8fef>WS-2017-0247</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 2 Score Details (<b>3.4</b>)</summary>
<p>
Base Score Metrics not available</p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://github.com/vercel/ms/pull/89">https://github.com/vercel/ms/pull/89</a></p>
<p>Release Date: 2017-04-12</p>
<p>Fix Resolution: 2.1.1</p>
</p>
</details>
<p></p>
<!-- <REMEDIATE>{"isOpenPROnVulnerability":true,"isPackageBased":true,"isDefaultBranch":true,"packages":[{"packageType":"javascript/Node.js","packageName":"ms","packageVersion":"0.7.1","packageFilePaths":["/package.json"],"isTransitiveDependency":true,"dependencyTree":"gulp-sass:1.3.3;node-sass:2.1.1;mocha:2.5.3;debug:2.2.0;ms:0.7.1","isMinimumFixVersionAvailable":true,"minimumFixVersion":"2.1.1"}],"baseBranches":["main"],"vulnerabilityIdentifier":"WS-2017-0247","vulnerabilityDetails":"Affected versions of this package are vulnerable to Regular Expression Denial of Service (ReDoS).","vulnerabilityUrl":"https://github.com/zeit/ms/commit/305f2ddcd4eff7cc7c518aca6bb2b2d2daad8fef","cvss2Severity":"low","cvss2Score":"3.4","extraData":{}}</REMEDIATE> -->
|
non_process
|
ws low detected in ms tgz ws low severity vulnerability vulnerable library ms tgz tiny ms conversion utility library home page a href path to dependency file my bag of holding package json path to vulnerable library my bag of holding node modules mocha node modules ms package json dependency hierarchy gulp sass tgz root library node sass tgz mocha tgz debug tgz x ms tgz vulnerable library found in base branch main vulnerability details affected versions of this package are vulnerable to regular expression denial of service redos publish date url a href cvss score details base score metrics not available suggested fix type upgrade version origin a href release date fix resolution isopenpronvulnerability true ispackagebased true isdefaultbranch true packages istransitivedependency true dependencytree gulp sass node sass mocha debug ms isminimumfixversionavailable true minimumfixversion basebranches vulnerabilityidentifier ws vulnerabilitydetails affected versions of this package are vulnerable to regular expression denial of service redos vulnerabilityurl
| 0
|
14,475
| 17,596,685,414
|
IssuesEvent
|
2021-08-17 06:31:35
|
GoogleCloudPlatform/fda-mystudies
|
https://api.github.com/repos/GoogleCloudPlatform/fda-mystudies
|
closed
|
[iOS] Issues related to 'Text' response type validation in iOS platform
|
Bug P2 iOS Process: Fixed Process: Tested QA Process: Tested dev
|
Steps:
1. Configure a 'text' response type for below scenarios
2. Publish updates from SB
3. Check the validation from iOS mobile
SB Configuration for 'Text' | Actual result in iOS | Expected result
-- | -- | --
Allow numbers | a) Allowing combination of Alphanumeric b) Allowing combination of numeric-special characters c) Allowing combination of alphanumeric-special characters | Should accept only numbers
Allow alphabets | a) Allowing combination of Alphanumeric b) Allowing combination of alphabet-special characters c) Allowing combination of alphanumeric-special characters | Should accept only alphabets
Allow alphabets and numbers | a) Allowing combination of numeric-special characters b) Allowing combination of alphabet-special characters c) Allowing combination of alphanumeric-special characters | Should accept only combination of alphanumeric
Allow special characters | a) Allowing combination of numeric-special characters b) Allowing combination of alphabet-special characters c) Allowing combination of alphanumeric-special characters | Should accept only special characters
Issue observed for both Question step and form step
Issue not observed for 'Disallow' scenarios and other scenarios
Issue not observed in 'Android' platform
|
3.0
|
[iOS] Issues related to 'Text' response type validation in iOS platform - Steps:
1. Configure a 'text' response type for below scenarios
2. Publish updates from SB
3. Check the validation from iOS mobile
SB Configuration for 'Text' | Actual result in iOS | Expected result
-- | -- | --
Allow numbers | a) Allowing combination of Alphanumeric b) Allowing combination of numeric-special characters c) Allowing combination of alphanumeric-special characters | Should accept only numbers
Allow alphabets | a) Allowing combination of Alphanumeric b) Allowing combination of alphabet-special characters c) Allowing combination of alphanumeric-special characters | Should accept only alphabets
Allow alphabets and numbers | a) Allowing combination of numeric-special characters b) Allowing combination of alphabet-special characters c) Allowing combination of alphanumeric-special characters | Should accept only combination of alphanumeric
Allow special characters | a) Allowing combination of numeric-special characters b) Allowing combination of alphabet-special characters c) Allowing combination of alphanumeric-special characters | Should accept only special characters
Issue observed for both Question step and form step
Issue not observed for 'Disallow' scenarios and other scenarios
Issue not observed in 'Android' platform
|
process
|
issues related to text response type validation in ios platform steps configure a text response type for below scenarios publish updates from sb check the validation from ios mobile sb configuration for text actual result in ios expected result allow numbers a allowing combination of alphanumeric b allowing combination of numeric special characters c allowing combination of alphanumeric special characters should accept only numbers allow alphabets a allowing combination of alphanumeric b allowing combination of alphabet special characters c allowing combination of alphanumeric special characters should accept only alphabets allow alphabets and numbers a allowing combination of numeric special characters b allowing combination of alphabet special characters c allowing combination of alphanumeric special characters should accept only combination of alphanumeric allow special characters a allowing combination of numeric special characters b allowing combination of alphabet special characters c allowing combination of alphanumeric special characters should accept only special characters issue observed for both question step and form step issue not observed for disallow scenarios and other scenarios issue not observed in android platform
| 1
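For reference, the expected-result column of the table above can be captured in a few anchored regular expressions. This is a hypothetical sketch, not the MyStudies iOS code; the rule names and the reading of "special characters" as non-alphanumeric are assumptions.

```python
import re

# Hypothetical validation matrix for the 'Text' response type described
# in the record above; each rule accepts only its own character class.
RULES = {
    "allow_numbers": re.compile(r"[0-9]+"),
    "allow_alphabets": re.compile(r"[A-Za-z]+"),
    "allow_alphanumeric": re.compile(r"[A-Za-z0-9]+"),
    # "Special characters" taken here to mean anything non-alphanumeric.
    "allow_special": re.compile(r"[^A-Za-z0-9\s]+"),
}

def is_valid(rule: str, answer: str) -> bool:
    """Accept the answer only if every character matches the configured rule."""
    return bool(RULES[rule].fullmatch(answer))

assert is_valid("allow_numbers", "123")
assert not is_valid("allow_numbers", "12a")    # alphanumeric mix rejected
assert not is_valid("allow_alphabets", "ab!")  # alphabet + special rejected
```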
|
2,886
| 5,849,148,110
|
IssuesEvent
|
2017-05-10 22:50:23
|
ncbo/bioportal-project
|
https://api.github.com/repos/ncbo/bioportal-project
|
closed
|
DMTO: submission 1.0 failed to parse
|
in progress ontology processing problem
|
Version 1.0 of the [Diabetes Mellitus Treatment Ontology](http://bioportal.bioontology.org/ontologies/DMTO) (DMTO) failed to parse.
Pertinent snippet from the parsing log file:
```
2017-03-30T16:23:53 [main] INFO o.s.n.o.OntologyParserCommand - Parse result: false
2017-03-30T16:23:53 [main] INFO o.s.n.o.OntologyParserCommand - Output triples in: {}/srv/ncbo/repository/DMTO/2/owlapi.xrdf
2017-03-30T16:23:53 [main] INFO o.s.n.o.OntologyParserCommand - Finished parsing!
I, [2017-03-30T16:23:53.092004 #28987] INFO -- : OWLAPI Java command: parsing finished successfully.
E, [2017-03-30T16:23:53.092323 #28987] ERROR -- : LinkedData::Parser::OWLAPIParserException: OWLAPI java command exited with 0. Output file /srv/ncbo/repository/DMTO/2/owlapi.xrdf cannot be found.
/srv/ncbo/ncbo_cron/vendor/bundle/ruby/2.3.0/bundler/gems/ontologies_linked_data-9fb051182d58/lib/ontologies_linked_data/parser/owlapi.rb:125:in `block in call_owlapi_java_command'
/usr/local/rbenv/versions/2.3.3/lib/ruby/2.3.0/open3.rb:205:in `popen_run'
/usr/local/rbenv/versions/2.3.3/lib/ruby/2.3.0/open3.rb:95:in `popen3'
/srv/ncbo/ncbo_cron/vendor/bundle/ruby/2.3.0/bundler/gems/ontologies_linked_data-9fb051182d58/lib/ontologies_linked_data/parser/owlapi.rb:76:in `call_owlapi_java_command'
/srv/ncbo/ncbo_cron/vendor/bundle/ruby/2.3.0/bundler/gems/ontologies_linked_data-9fb051182d58/lib/ontologies_linked_data/parser/owlapi.rb:153:in `parse'
/srv/ncbo/ncbo_cron/vendor/bundle/ruby/2.3.0/bundler/gems/ontologies_linked_data-9fb051182d58/lib/ontologies_linked_data/models/ontology_submission.rb:430:in `generate_rdf'
/srv/ncbo/ncbo_cron/vendor/bundle/ruby/2.3.0/bundler/gems/ontologies_linked_data-9fb051182d58/lib/ontologies_linked_data/models/ontology_submission.rb:860:in `process_submission'
/srv/ncbo/ncbo_cron/lib/ncbo_cron/ontology_submission_parser.rb:178:in `process_submission'
./bin/ncbo_ontology_process:98:in `block in <main>'
./bin/ncbo_ontology_process:81:in `each'
./bin/ncbo_ontology_process:81:in `<main>'
```
|
1.0
|
DMTO: submission 1.0 failed to parse - Version 1.0 of the [Diabetes Mellitus Treatment Ontology](http://bioportal.bioontology.org/ontologies/DMTO) (DMTO) failed to parse.
Pertinent snippet from the parsing log file:
```
2017-03-30T16:23:53 [main] INFO o.s.n.o.OntologyParserCommand - Parse result: false
2017-03-30T16:23:53 [main] INFO o.s.n.o.OntologyParserCommand - Output triples in: {}/srv/ncbo/repository/DMTO/2/owlapi.xrdf
2017-03-30T16:23:53 [main] INFO o.s.n.o.OntologyParserCommand - Finished parsing!
I, [2017-03-30T16:23:53.092004 #28987] INFO -- : OWLAPI Java command: parsing finished successfully.
E, [2017-03-30T16:23:53.092323 #28987] ERROR -- : LinkedData::Parser::OWLAPIParserException: OWLAPI java command exited with 0. Output file /srv/ncbo/repository/DMTO/2/owlapi.xrdf cannot be found.
/srv/ncbo/ncbo_cron/vendor/bundle/ruby/2.3.0/bundler/gems/ontologies_linked_data-9fb051182d58/lib/ontologies_linked_data/parser/owlapi.rb:125:in `block in call_owlapi_java_command'
/usr/local/rbenv/versions/2.3.3/lib/ruby/2.3.0/open3.rb:205:in `popen_run'
/usr/local/rbenv/versions/2.3.3/lib/ruby/2.3.0/open3.rb:95:in `popen3'
/srv/ncbo/ncbo_cron/vendor/bundle/ruby/2.3.0/bundler/gems/ontologies_linked_data-9fb051182d58/lib/ontologies_linked_data/parser/owlapi.rb:76:in `call_owlapi_java_command'
/srv/ncbo/ncbo_cron/vendor/bundle/ruby/2.3.0/bundler/gems/ontologies_linked_data-9fb051182d58/lib/ontologies_linked_data/parser/owlapi.rb:153:in `parse'
/srv/ncbo/ncbo_cron/vendor/bundle/ruby/2.3.0/bundler/gems/ontologies_linked_data-9fb051182d58/lib/ontologies_linked_data/models/ontology_submission.rb:430:in `generate_rdf'
/srv/ncbo/ncbo_cron/vendor/bundle/ruby/2.3.0/bundler/gems/ontologies_linked_data-9fb051182d58/lib/ontologies_linked_data/models/ontology_submission.rb:860:in `process_submission'
/srv/ncbo/ncbo_cron/lib/ncbo_cron/ontology_submission_parser.rb:178:in `process_submission'
./bin/ncbo_ontology_process:98:in `block in <main>'
./bin/ncbo_ontology_process:81:in `each'
./bin/ncbo_ontology_process:81:in `<main>'
```
|
process
|
dmto submission failed to parse version of the dmto failed to parse pertinent snippet from the parsing log file info o s n o ontologyparsercommand parse result false info o s n o ontologyparsercommand output triples in srv ncbo repository dmto owlapi xrdf info o s n o ontologyparsercommand finished parsing i info owlapi java command parsing finished successfully e error linkeddata parser owlapiparserexception owlapi java command exited with output file srv ncbo repository dmto owlapi xrdf cannot be found srv ncbo ncbo cron vendor bundle ruby bundler gems ontologies linked data lib ontologies linked data parser owlapi rb in block in call owlapi java command usr local rbenv versions lib ruby rb in popen run usr local rbenv versions lib ruby rb in srv ncbo ncbo cron vendor bundle ruby bundler gems ontologies linked data lib ontologies linked data parser owlapi rb in call owlapi java command srv ncbo ncbo cron vendor bundle ruby bundler gems ontologies linked data lib ontologies linked data parser owlapi rb in parse srv ncbo ncbo cron vendor bundle ruby bundler gems ontologies linked data lib ontologies linked data models ontology submission rb in generate rdf srv ncbo ncbo cron vendor bundle ruby bundler gems ontologies linked data lib ontologies linked data models ontology submission rb in process submission srv ncbo ncbo cron lib ncbo cron ontology submission parser rb in process submission bin ncbo ontology process in block in bin ncbo ontology process in each bin ncbo ontology process in
| 1
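The failure mode in the log above — a zero exit code but a missing output file — is why the parser has to check for the artifact explicitly rather than trusting the exit status. A minimal Python sketch of that pattern, with a hypothetical command and the output path taken from the log:

```python
import subprocess
from pathlib import Path

# Minimal sketch: a zero exit code alone is not enough; the expected
# output artifact must also exist (the situation in the log above).
def run_parser(cmd, expected_output: Path) -> None:
    result = subprocess.run(cmd, capture_output=True, text=True)
    if result.returncode != 0:
        raise RuntimeError(f"parser exited with {result.returncode}: {result.stderr}")
    if not expected_output.exists():
        raise RuntimeError(
            f"parser exited with 0 but output file {expected_output} cannot be found"
        )

# Hypothetical invocation mirroring the log:
# run_parser(["java", "-jar", "owlapi-wrapper.jar", "dmto.owl"],
#            Path("/srv/ncbo/repository/DMTO/2/owlapi.xrdf"))
```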
|
11,391
| 14,226,733,670
|
IssuesEvent
|
2020-11-17 23:39:10
|
panther-labs/panther
|
https://api.github.com/repos/panther-labs/panther
|
opened
|
Create RFC for Compliance reporting coverage
|
p1 team:data processing
|
### Description
Create RFC for Compliance reporting for various frameworks (MITRE, CIS, PCI, HIPAA, etc.). This will help our users understand how much coverage they are achieving for a given compliance framework.
### Acceptance Criteria
- Understanding of backend work required to feed data into heat map of framework
- Understanding of frontend work to display heat map in overview page
|
1.0
|
Create RFC for Compliance reporting coverage - ### Description
Create RFC for Compliance reporting for various frameworks (MITRE, CIS, PCI, HIPAA, etc.). This will help our users understand how much coverage they are achieving for a given compliance framework.
### Acceptance Criteria
- Understanding of backend work required to feed data into heat map of framework
- Understanding of frontend work to display heat map in overview page
|
process
|
create rfc for compliance reporting coverage description create rfc for compliance reporting for various frameworks mitre cis pci hipaa etc this will help our users understand how much coverage they are achieving for a given compliance framework acceptance criteria understanding of backend work required to feed data into heat map of framework understanding of frontend work to display heat map in overview page
| 1
|
33,186
| 4,817,785,622
|
IssuesEvent
|
2016-11-04 14:41:59
|
Pliohub/Plio
|
https://api.github.com/repos/Pliohub/Plio
|
closed
|
[Testlio][Mobile][User Settings] The 'User Settings' window is not fully available in landscape mode, because there is no way to scroll the page to view all the options.
|
enhancement resolved testlio
|
**Environment:**
Device and OS: Nexus 5X Android 7.0
Testable App version: app.pliohub.com
Network: Wi-Fi
Location: Ukraine
Reproducibility rate: 3/3
**Steps to reproduce:**
1. Go to URL>Sign in.
2. Open the 'User Settings' window>Observe everything is displayed correctly.
3. Now change your device to landscape mode.
**Expected result:**
User should be able to use the 'User Settings' window in landscape mode, so all the options there should be available.
**Actual result:**
The 'User Settings' window is not fully displayed in landscape mode and there is no way to scroll the page down to access to all the options.<p><img src="https://testlio.s3.amazonaws.com/issue/145747/a/a63f794a-25b9-918d-abfd-77af1d4009bc-medium.jpg"/><br><a href="https://testlio.s3.amazonaws.com/issue/145747/a/73e686e1-dd21-0b2e-17e5-0b7cd9b280d2/50337547be535748d26f14dcae705d95.mp4">#26947_Vide.mp4</a><br></p>
|
1.0
|
[Testlio][Mobile][User Settings] The 'User Settings' window is not fully available in landscape mode, because there is no way to scroll the page to view all the options. - **Environment:**
Device and OS: Nexus 5X Android 7.0
Testable App version: app.pliohub.com
Network: Wi-Fi
Location: Ukraine
Reproducibility rate: 3/3
**Steps to reproduce:**
1. Go to URL>Sign in.
2. Open the 'User Settings' window>Observe everything is displayed correctly.
3. Now change your device to landscape mode.
**Expected result:**
User should be able to use the 'User Settings' window in landscape mode, so all the options there should be available.
**Actual result:**
The 'User Settings' window is not fully displayed in landscape mode and there is no way to scroll the page down to access to all the options.<p><img src="https://testlio.s3.amazonaws.com/issue/145747/a/a63f794a-25b9-918d-abfd-77af1d4009bc-medium.jpg"/><br><a href="https://testlio.s3.amazonaws.com/issue/145747/a/73e686e1-dd21-0b2e-17e5-0b7cd9b280d2/50337547be535748d26f14dcae705d95.mp4">#26947_Vide.mp4</a><br></p>
|
non_process
|
the user settings window is not fully available in landscape mode because there is no way to scroll the page to view all the options environment device and os nexus android testable app version app pliohub com network wi fi location ukraine reproducibility rate steps to reproduce go to url sign in open the user settings window observe everything is displayed correctly now change your device to landscape mode expected result user should be able to use the user settings window in landscape mode so all the options there should be available actual result the user settings window is not fully displayed in landscape mode and there is no way to scroll the page down to access to all the options img src href
| 0
|
184,309
| 14,286,868,531
|
IssuesEvent
|
2020-11-23 15:40:21
|
theupdateframework/tuf
|
https://api.github.com/repos/theupdateframework/tuf
|
closed
|
Add a method to create server subprocesses only on unused local ports
|
testing
|
**Description of issue or feature request**:
It will be useful to make sure we are starting the server subprocesses only on unused local ports.
This will save us issues when the server is started on used local ports, and given that more tests using server subprocesses could be added in the future, we have an even better stimulus to create such a method.
It's preferable if we find a way to do this atomically, e.g. check if the port is busy **and** start the server almost instantly, because we want to minimize the possibility that the port could be taken between the port validation and server startup.
**Current behavior**:
There is no such functionality in TUF and sometimes I am getting `[Errno 98] Address already in use` or `ConnectionRefusedError: [Errno 111]`.
**Expected behavior**:
Hopefully, none of the above errors will ever appear after we add this functionality.
|
1.0
|
Add a method to create server subprocesses only on unused local ports - **Description of issue or feature request**:
It will be useful to make sure we are starting the server subprocesses only on unused local ports.
This will save us issues when the server is started on used local ports, and given that more tests using server subprocesses could be added in the future, we have an even better stimulus to create such a method.
It's preferable if we find a way to do this atomically, e.g. check if the port is busy **and** start the server almost instantly, because we want to minimize the possibility that the port could be taken between the port validation and server startup.
**Current behavior**:
There is no such functionality in TUF and sometimes I am getting `[Errno 98] Address already in use` or `ConnectionRefusedError: [Errno 111]`.
**Expected behavior**:
Hopefully, none of the above errors will ever appear after we add this functionality.
|
non_process
|
add a method to create server subprocesses only on unused local ports description of issue or feature request it will be useful to make sure we are starting the server subprocesses only on unused local ports this will save us issues when the server is started on used local ports and given that more tests using server subprocesses could be added in the future we have an even better stimulus to create such a method it s preferable if we find a way to do this atomically e g check if the port is busy and start the server almost instantly because we want to minimize the possibility that the port could be taken between the port validation and server startup current behavior there is no such functionality in tuf and sometimes i am getting address already in use or connectionrefusederror expected behavior hopefully none of the above errors will ever appear after we add this functionality
| 0
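A minimal sketch of the atomic approach the record above asks for, using only Python's standard library: bind to port 0 so the OS picks an unused ephemeral port, then read the assigned port back. There is no gap between the port check and the bind, so the race the record describes cannot occur.

```python
from http.server import HTTPServer, SimpleHTTPRequestHandler

# Ask the OS for an unused port by binding to port 0; the bind and the
# "is this port free?" check happen in one atomic step.
server = HTTPServer(("localhost", 0), SimpleHTTPRequestHandler)
port = server.server_address[1]  # the port the OS actually assigned
print(f"serving on unused port {port}")
# server.serve_forever()  # run in a subprocess/thread in the test harness
```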
|
587,478
| 17,617,237,724
|
IssuesEvent
|
2021-08-18 11:17:44
|
stackabletech/kafka-operator
|
https://api.github.com/repos/stackabletech/kafka-operator
|
closed
|
Add update functionality and status conditions
|
type/enhancement priority/high
|
The conditions are required for the TestCluster in `integration-test-commons` and as of now are used within the installation / update functionality.
This needs to be added in order to be able to write proper integration tests.
|
1.0
|
Add update functionality and status conditions - The conditions are required for the TestCluster in `integration-test-commons` and as of now are used within the installation / update functionality.
This needs to be added in order to be able to write proper integration tests.
|
non_process
|
add update functionality and status conditions the conditions are required for the testcluster in integration test commons and as of now are used within the installation update functionality this needs to be added in order to be able to write proper integration tests
| 0
|
36,907
| 5,095,345,507
|
IssuesEvent
|
2017-01-03 14:56:36
|
cockroachdb/cockroach
|
https://api.github.com/repos/cockroachdb/cockroach
|
closed
|
github.com/cockroachdb/cockroach/pkg/sql/pgbench/cmd/pgbenchsetup: (unknown) failed under stress
|
Robot test-failure
|
SHA: https://github.com/cockroachdb/cockroach/commits/0e5edcff3f9f4f616f64a31f2020e665e411bedb
Parameters:
```
COCKROACH_PROPOSER_EVALUATED_KV=true
TAGS=
GOFLAGS=
```
Stress build found a failed test: https://teamcity.cockroachdb.com/viewLog.html?buildId=103516&tab=buildLog
```
docker: Error response from daemon: containerd: container did not start before the specified timeout.
```
|
1.0
|
github.com/cockroachdb/cockroach/pkg/sql/pgbench/cmd/pgbenchsetup: (unknown) failed under stress - SHA: https://github.com/cockroachdb/cockroach/commits/0e5edcff3f9f4f616f64a31f2020e665e411bedb
Parameters:
```
COCKROACH_PROPOSER_EVALUATED_KV=true
TAGS=
GOFLAGS=
```
Stress build found a failed test: https://teamcity.cockroachdb.com/viewLog.html?buildId=103516&tab=buildLog
```
docker: Error response from daemon: containerd: container did not start before the specified timeout.
```
|
non_process
|
github com cockroachdb cockroach pkg sql pgbench cmd pgbenchsetup unknown failed under stress sha parameters cockroach proposer evaluated kv true tags goflags stress build found a failed test docker error response from daemon containerd container did not start before the specified timeout
| 0
|
903
| 3,367,777,086
|
IssuesEvent
|
2015-11-22 13:33:02
|
pwittchen/kirai
|
https://api.github.com/repos/pwittchen/kirai
|
opened
|
Release 1.4.0
|
release process
|
**Initial release notes**:
- added support for formatting text in terminal (bold, underline, color and background color)
- created separate packages for classes responsible for html formatting and terminal formatting
- updated README.md and prepared more code samples
**Things to do**:
|
1.0
|
Release 1.4.0 - **Initial release notes**:
- added support for formatting text in terminal (bold, underline, color and background color)
- created separate packages for classes responsible for html formatting and terminal formatting
- updated README.md and prepared more code samples
**Things to do**:
|
process
|
release initial release notes added support for formatting text in terminal bold underline color and background color created separate packages for classes responsible for html formatting and terminal formatting updated readme md and prepared more code samples things to do
| 1
|
71,581
| 3,362,506,878
|
IssuesEvent
|
2015-11-20 06:23:34
|
xcat2/xcat-core
|
https://api.github.com/repos/xcat2/xcat-core
|
closed
|
[FVT]:2.11: xdsh <node> "configraid delete_raid=all" hangs when delete raid 0
|
priority:high status:pending type:bug
|
xCAT 2.11 on rh7.1
[root@~]# lsxcatd -v
Version 2.11 (git commit 2295560c5048801e80477a18e5267bcdb06a0814, built Thu Oct 15 19:30:47 EDT 2015)
[root@ ~]# xdsh p8euh diskdiscover
p8euh: --------------------------------------------------------------------------
p8euh: PCI_ID PCI_SLOT_NAME Resource_Path Device Description Status
p8euh: ------ ------------- ------------- ------ ----------- ----------------
p8euh: 1014:034a 0001:08:00.0 0:0:0:0 sg0 0 Array Member Active
p8euh: 1014:034a 0001:08:00.0 0:0:1:0 sg1 0 Array Member Active
p8euh: -------------------------------------------------------------------
p8euh: Get ipr RAID arrays by PCI_SLOT_NAME: 0001:08:00.0
p8euh: -------------------------------------------------------------------
p8euh: Name PCI/SCSI Location Description Status
p8euh: ------ ------------------------- ------------------------- -----------------
p8euh: sda 0001:08:00.0/0:2:0:0 RAID 0 Disk Array Optimized
[root@ ~]# xdsh p8euh "configraid delete_raid=all"
<======hang here.
Here is the log
[I]: Round 1: delete_ipr_array, "0 0:2:0:0"
[W]: Round 1: Array 0 is un-deletable at present.
[W]: Round 1: Array 0:2:0:0 is un-deletable at present.
[I]: Round 1: All remaining target arrayes are un-deletable now.
[I]: Round 1: Wait for these un-deletable arrayes deletable with tryCnt=360, tryInt=60.
/usr/bin/raidutils: line 1356: check_ipr_device_status: command not found
[S]: Wait for device status at time "2015-10-20 21:55:12": status[0]="",status[0:2:0:0]="", expect: "grep -sq -E 'Optimized'".
/usr/bin/raidutils: line 1356: check_ipr_device_status: command not found
[S]: Wait for device status at time "2015-10-20 21:56:12": status[0]="",status[0:2:0:0]="", expect: "grep -sq -E 'Optimized'".
/usr/bin/raidutils: line 1356: check_ipr_device_status: command not found
[S]: Wait for device status at time "2015-10-20 21:57:12": status[0]="",status[0:2:0:0]="", expect: "grep -sq -E 'Optimized'".
/usr/bin/raidutils: line 1356: check_ipr_device_status: command not found
|
1.0
|
[FVT]:2.11: xdsh <node> "configraid delete_raid=all" hangs when delete raid 0 - xCAT 2.11 on rh7.1
[root@~]# lsxcatd -v
Version 2.11 (git commit 2295560c5048801e80477a18e5267bcdb06a0814, built Thu Oct 15 19:30:47 EDT 2015)
[root@ ~]# xdsh p8euh diskdiscover
p8euh: --------------------------------------------------------------------------
p8euh: PCI_ID PCI_SLOT_NAME Resource_Path Device Description Status
p8euh: ------ ------------- ------------- ------ ----------- ----------------
p8euh: 1014:034a 0001:08:00.0 0:0:0:0 sg0 0 Array Member Active
p8euh: 1014:034a 0001:08:00.0 0:0:1:0 sg1 0 Array Member Active
p8euh: -------------------------------------------------------------------
p8euh: Get ipr RAID arrays by PCI_SLOT_NAME: 0001:08:00.0
p8euh: -------------------------------------------------------------------
p8euh: Name PCI/SCSI Location Description Status
p8euh: ------ ------------------------- ------------------------- -----------------
p8euh: sda 0001:08:00.0/0:2:0:0 RAID 0 Disk Array Optimized
[root@ ~]# xdsh p8euh "configraid delete_raid=all"
<======hang here.
Here is the log
[I]: Round 1: delete_ipr_array, "0 0:2:0:0"
[W]: Round 1: Array 0 is un-deletable at present.
[W]: Round 1: Array 0:2:0:0 is un-deletable at present.
[I]: Round 1: All remaining target arrayes are un-deletable now.
[I]: Round 1: Wait for these un-deletable arrayes deletable with tryCnt=360, tryInt=60.
/usr/bin/raidutils: line 1356: check_ipr_device_status: command not found
[S]: Wait for device status at time "2015-10-20 21:55:12": status[0]="",status[0:2:0:0]="", expect: "grep -sq -E 'Optimized'".
/usr/bin/raidutils: line 1356: check_ipr_device_status: command not found
[S]: Wait for device status at time "2015-10-20 21:56:12": status[0]="",status[0:2:0:0]="", expect: "grep -sq -E 'Optimized'".
/usr/bin/raidutils: line 1356: check_ipr_device_status: command not found
[S]: Wait for device status at time "2015-10-20 21:57:12": status[0]="",status[0:2:0:0]="", expect: "grep -sq -E 'Optimized'".
/usr/bin/raidutils: line 1356: check_ipr_device_status: command not found
|
non_process
|
xdsh configraid delete raid all hangs when delete raid xcat on lsxcatd v version git commit built thu oct edt xdsh diskdiscover pci id pci slot name resource path device description status array member active array member active get ipr raid arrays by pci slot name name pci scsi location description status sda raid disk array optimized xdsh configraid delete raid all hang here here is the log round delete ipr array round array is un deletable at present round array is un deletable at present round all remaining target arrayes are un deletable now round wait for these un deletable arrayes deletable with trycnt tryint usr bin raidutils line check ipr device status command not found wait for device status at time status status expect grep sq e optimized usr bin raidutils line check ipr device status command not found wait for device status at time status status expect grep sq e optimized usr bin raidutils line check ipr device status command not found wait for device status at time status status expect grep sq e optimized usr bin raidutils line check ipr device status command not found
| 0
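The hang in the record above comes from a bounded wait loop (tryCnt=360, tryInt=60, i.e. up to six hours) whose status check errors out every iteration because the shell function `check_ipr_device_status` is undefined, so the expected state can never be observed. A hypothetical Python rendering of such a retry loop, with the tryCnt/tryInt parameters from the log:

```python
import time

# Hypothetical rendering of the raidutils wait loop: poll a status
# function at most try_cnt times, try_int seconds apart, instead of
# appearing to hang for hours when the check can never succeed.
def wait_for_status(get_status, expected: str, try_cnt: int = 360, try_int: int = 60) -> bool:
    for _attempt in range(try_cnt):
        if get_status() == expected:
            return True
        time.sleep(try_int)
    return False  # caller should fail loudly instead of waiting silently

# Hypothetical usage, mirroring the "Optimized" check in the log:
# ok = wait_for_status(lambda: read_array_status("0:2:0:0"), "Optimized",
#                      try_cnt=3, try_int=1)
```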
|
55,981
| 13,731,837,481
|
IssuesEvent
|
2020-10-05 02:34:06
|
tensorflow/tensorflow
|
https://api.github.com/repos/tensorflow/tensorflow
|
opened
|
Tensorflow does not work because of Cudnn library issue
|
type:build/install
|
<em>Please make sure that this is a build/installation issue. As per our [GitHub Policy](https://github.com/tensorflow/tensorflow/blob/master/ISSUES.md), we only address code/doc bugs, performance issues, feature requests and build/installation issues on GitHub. tag:build_template</em>
**System information**
- OS Platform and Distribution (e.g., Linux Ubuntu 16.04): Windows
- Mobile device (e.g. iPhone 8, Pixel 2, Samsung Galaxy) if the issue happens on mobile device:
- TensorFlow installed from (source or binary): Source
- TensorFlow version: 1.15.0
- Python version: 3.7
- Installed using virtualenv? pip? conda?: pip
- Bazel version (if compiling from source):
- GCC/Compiler version (if compiling from source):
- CUDA/cuDNN version: 10.0 and 7.4
- GPU model and memory: RTX 2060 6GB VRAM
**Describe the problem**
I first installed tf 2.3.0 with CUDA version: 10.1 and CUDnn version: 7.6. I had to work with some old code of mine so I uninstalled the previous CUDA, deleted all the files, and did a fresh install of the CUDA and CUDnn versions mentioned above along with tf 1.15.0. When I try to run the TensorFlow python program I keep getting the same error. It has something to do with source compilation of the previous CUDnn version (7.6.0), I cannot seem to find how to undo this, I tried reinstalling anaconda as well and checked all the environment variable paths of my system, all of them seem to point to the right directories.
**Provide the exact sequence of commands / steps that you executed before running into the problem**
I ran the file via command line using python main.py.
**Any other info / logs**
Include any logs or source code that would be helpful to diagnose the problem. If including tracebacks, please include the full traceback. Large logs and files should be attached.
EPOCH 1 ...
2020-10-04 16:15:46.208813: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library cudnn64_7.dll
2020-10-04 16:15:47.780566: E tensorflow/stream_executor/cuda/cuda_dnn.cc:319] Loaded runtime CuDNN library: 7.4.2 but source was compiled with: 7.6.0. CuDNN library major and minor version needs to match or have higher minor version in case of CuDNN 7.0 or later version. If using a binary install, upgrade your CuDNN library. If building from sources, make sure the library loaded at runtime is compatible with the version specified during compile configuration.
2020-10-04 16:15:47.793953: E tensorflow/stream_executor/cuda/cuda_dnn.cc:319] Loaded runtime CuDNN library: 7.4.2 but source was compiled with: 7.6.0. CuDNN library major and minor version needs to match or have higher minor version in case of CuDNN 7.0 or later version. If using a binary install, upgrade your CuDNN library. If building from sources, make sure the library loaded at runtime is compatible with the version specified during compile configuration.
Traceback (most recent call last):
File "C:\Users\manas\anaconda3\envs\tensorflow1-gpu\lib\site-packages\tensorflow_core\python\client\session.py", line 1365, in _do_call
return fn(*args)
File "C:\Users\manas\anaconda3\envs\tensorflow1-gpu\lib\site-packages\tensorflow_core\python\client\session.py", line 1350, in _run_fn
target_list, run_metadata)
File "C:\Users\manas\anaconda3\envs\tensorflow1-gpu\lib\site-packages\tensorflow_core\python\client\session.py", line 1443, in _call_tf_sessionrun
run_metadata)
tensorflow.python.framework.errors_impl.UnknownError: 2 root error(s) found.
(0) Unknown: Failed to get convolution algorithm. This is probably because cuDNN failed to initialize, so try looking to see if a warning log message was printed above.
[[{{node conv1_1/Conv2D}}]]
[[Mean/_73]]
(1) Unknown: Failed to get convolution algorithm. This is probably because cuDNN failed to initialize, so try looking to see if a warning log message was printed above.
[[{{node conv1_1/Conv2D}}]]
0 successful operations.
0 derived errors ignored.
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "main_test.py", line 202, in <module>
run()
File "main_test.py", line 193, in run
correct_label, keep_prob, learning_rate)
File "main_test.py", line 146, in train_nn
feed_dict={input_image: image, correct_label: label, keep_prob: 0.5, learning_rate: 0.0009})
File "C:\Users\manas\anaconda3\envs\tensorflow1-gpu\lib\site-packages\tensorflow_core\python\client\session.py", line 956, in run
run_metadata_ptr)
File "C:\Users\manas\anaconda3\envs\tensorflow1-gpu\lib\site-packages\tensorflow_core\python\client\session.py", line 1180, in _run
feed_dict_tensor, options, run_metadata)
File "C:\Users\manas\anaconda3\envs\tensorflow1-gpu\lib\site-packages\tensorflow_core\python\client\session.py", line 1359, in _do_run
run_metadata)
File "C:\Users\manas\anaconda3\envs\tensorflow1-gpu\lib\site-packages\tensorflow_core\python\client\session.py", line 1384, in _do_call
raise type(e)(node_def, op, message)
tensorflow.python.framework.errors_impl.UnknownError: 2 root error(s) found.
(0) Unknown: Failed to get convolution algorithm. This is probably because cuDNN failed to initialize, so try looking to see if a warning log message was printed above.
[[node conv1_1/Conv2D (defined at C:\Users\manas\anaconda3\envs\tensorflow1-gpu\lib\site-packages\tensorflow_core\python\framework\ops.py:1748) ]]
[[Mean/_73]]
(1) Unknown: Failed to get convolution algorithm. This is probably because cuDNN failed to initialize, so try looking to see if a warning log message was printed above.
[[node conv1_1/Conv2D (defined at C:\Users\manas\anaconda3\envs\tensorflow1-gpu\lib\site-packages\tensorflow_core\python\framework\ops.py:1748) ]]
0 successful operations.
0 derived errors ignored.
Original stack trace for 'conv1_1/Conv2D':
File "main_test.py", line 202, in <module>
run()
File "main_test.py", line 184, in run
input_image, keep_prob, vgg_layer3_out, vgg_layer4_out, vgg_layer7_out = load_vgg(sess, vgg_path)
File "main_test.py", line 36, in load_vgg
tf.saved_model.loader.load(sess, [vgg_tag], vgg_path)
File "C:\Users\manas\anaconda3\envs\tensorflow1-gpu\lib\site-packages\tensorflow_core\python\util\deprecation.py", line 324, in new_func
return func(*args, **kwargs)
File "C:\Users\manas\anaconda3\envs\tensorflow1-gpu\lib\site-packages\tensorflow_core\python\saved_model\loader_impl.py", line 269, in load
return loader.load(sess, tags, import_scope, **saver_kwargs)
File "C:\Users\manas\anaconda3\envs\tensorflow1-gpu\lib\site-packages\tensorflow_core\python\saved_model\loader_impl.py", line 422, in load
**saver_kwargs)
File "C:\Users\manas\anaconda3\envs\tensorflow1-gpu\lib\site-packages\tensorflow_core\python\saved_model\loader_impl.py", line 352, in load_graph
meta_graph_def, import_scope=import_scope, **saver_kwargs)
File "C:\Users\manas\anaconda3\envs\tensorflow1-gpu\lib\site-packages\tensorflow_core\python\training\saver.py", line 1477, in _import_meta_graph_with_return_elements
**kwargs))
File "C:\Users\manas\anaconda3\envs\tensorflow1-gpu\lib\site-packages\tensorflow_core\python\framework\meta_graph.py", line 809, in import_scoped_meta_graph_with_return_elements
return_elements=return_elements)
File "C:\Users\manas\anaconda3\envs\tensorflow1-gpu\lib\site-packages\tensorflow_core\python\util\deprecation.py", line 507, in new_func
return func(*args, **kwargs)
File "C:\Users\manas\anaconda3\envs\tensorflow1-gpu\lib\site-packages\tensorflow_core\python\framework\importer.py", line 405, in import_graph_def
producer_op_list=producer_op_list)
File "C:\Users\manas\anaconda3\envs\tensorflow1-gpu\lib\site-packages\tensorflow_core\python\framework\importer.py", line 517, in _import_graph_def_internal
_ProcessNewOps(graph)
File "C:\Users\manas\anaconda3\envs\tensorflow1-gpu\lib\site-packages\tensorflow_core\python\framework\importer.py", line 243, in _ProcessNewOps
for new_op in graph._add_new_tf_operations(compute_devices=False): # pylint: disable=protected-access
File "C:\Users\manas\anaconda3\envs\tensorflow1-gpu\lib\site-packages\tensorflow_core\python\framework\ops.py", line 3561, in _add_new_tf_operations
for c_op in c_api_util.new_tf_operations(self)
File "C:\Users\manas\anaconda3\envs\tensorflow1-gpu\lib\site-packages\tensorflow_core\python\framework\ops.py", line 3561, in <listcomp>
for c_op in c_api_util.new_tf_operations(self)
File "C:\Users\manas\anaconda3\envs\tensorflow1-gpu\lib\site-packages\tensorflow_core\python\framework\ops.py", line 3451, in _create_op_from_tf_operation
ret = Operation(c_op, self)
File "C:\Users\manas\anaconda3\envs\tensorflow1-gpu\lib\site-packages\tensorflow_core\python\framework\ops.py", line 1748, in __init__
self._traceback = tf_stack.extract_stack()
|
1.0
|
Tensorflow does not work because of Cudnn library issue - <em>Please make sure that this is a build/installation issue. As per our [GitHub Policy](https://github.com/tensorflow/tensorflow/blob/master/ISSUES.md), we only address code/doc bugs, performance issues, feature requests and build/installation issues on GitHub. tag:build_template</em>
**System information**
- OS Platform and Distribution (e.g., Linux Ubuntu 16.04): Windows
- Mobile device (e.g. iPhone 8, Pixel 2, Samsung Galaxy) if the issue happens on mobile device:
- TensorFlow installed from (source or binary): Source
- TensorFlow version: 1.15.0
- Python version: 3.7
- Installed using virtualenv? pip? conda?: pip
- Bazel version (if compiling from source):
- GCC/Compiler version (if compiling from source):
- CUDA/cuDNN version: 10.0 and 7.4
- GPU model and memory: RTX 2060 6GB VRAM
**Describe the problem**
I first installed tf 2.3.0 with CUDA version: 10.1 and CUDnn version: 7.6. I had to work with some old code of mine so I uninstalled the previous CUDA, deleted all the files, and did a fresh install of the CUDA and CUDnn versions mentioned above along with tf 1.15.0. When I try to run the TensorFlow python program I keep getting the same error. It has something to do with source compilation of the previous CUDnn version (7.6.0), I cannot seem to find how to undo this, I tried reinstalling anaconda as well and checked all the environment variable paths of my system, all of them seem to point to the right directories.
**Provide the exact sequence of commands / steps that you executed before running into the problem**
I ran the file via command line using python main.py.
**Any other info / logs**
Include any logs or source code that would be helpful to diagnose the problem. If including tracebacks, please include the full traceback. Large logs and files should be attached.
EPOCH 1 ...
2020-10-04 16:15:46.208813: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library cudnn64_7.dll
2020-10-04 16:15:47.780566: E tensorflow/stream_executor/cuda/cuda_dnn.cc:319] Loaded runtime CuDNN library: 7.4.2 but source was compiled with: 7.6.0. CuDNN library major and minor version needs to match or have higher minor version in case of CuDNN 7.0 or later version. If using a binary install, upgrade your CuDNN library. If building from sources, make sure the library loaded at runtime is compatible with the version specified during compile configuration.
2020-10-04 16:15:47.793953: E tensorflow/stream_executor/cuda/cuda_dnn.cc:319] Loaded runtime CuDNN library: 7.4.2 but source was compiled with: 7.6.0. CuDNN library major and minor version needs to match or have higher minor version in case of CuDNN 7.0 or later version. If using a binary install, upgrade your CuDNN library. If building from sources, make sure the library loaded at runtime is compatible with the version specified during compile configuration.
Traceback (most recent call last):
File "C:\Users\manas\anaconda3\envs\tensorflow1-gpu\lib\site-packages\tensorflow_core\python\client\session.py", line 1365, in _do_call
return fn(*args)
File "C:\Users\manas\anaconda3\envs\tensorflow1-gpu\lib\site-packages\tensorflow_core\python\client\session.py", line 1350, in _run_fn
target_list, run_metadata)
File "C:\Users\manas\anaconda3\envs\tensorflow1-gpu\lib\site-packages\tensorflow_core\python\client\session.py", line 1443, in _call_tf_sessionrun
run_metadata)
tensorflow.python.framework.errors_impl.UnknownError: 2 root error(s) found.
(0) Unknown: Failed to get convolution algorithm. This is probably because cuDNN failed to initialize, so try looking to see if a warning log message was printed above.
[[{{node conv1_1/Conv2D}}]]
[[Mean/_73]]
(1) Unknown: Failed to get convolution algorithm. This is probably because cuDNN failed to initialize, so try looking to see if a warning log message was printed above.
[[{{node conv1_1/Conv2D}}]]
0 successful operations.
0 derived errors ignored.
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "main_test.py", line 202, in <module>
run()
File "main_test.py", line 193, in run
correct_label, keep_prob, learning_rate)
File "main_test.py", line 146, in train_nn
feed_dict={input_image: image, correct_label: label, keep_prob: 0.5, learning_rate: 0.0009})
File "C:\Users\manas\anaconda3\envs\tensorflow1-gpu\lib\site-packages\tensorflow_core\python\client\session.py", line 956, in run
run_metadata_ptr)
File "C:\Users\manas\anaconda3\envs\tensorflow1-gpu\lib\site-packages\tensorflow_core\python\client\session.py", line 1180, in _run
feed_dict_tensor, options, run_metadata)
File "C:\Users\manas\anaconda3\envs\tensorflow1-gpu\lib\site-packages\tensorflow_core\python\client\session.py", line 1359, in _do_run
run_metadata)
File "C:\Users\manas\anaconda3\envs\tensorflow1-gpu\lib\site-packages\tensorflow_core\python\client\session.py", line 1384, in _do_call
raise type(e)(node_def, op, message)
tensorflow.python.framework.errors_impl.UnknownError: 2 root error(s) found.
(0) Unknown: Failed to get convolution algorithm. This is probably because cuDNN failed to initialize, so try looking to see if a warning log message was printed above.
[[node conv1_1/Conv2D (defined at C:\Users\manas\anaconda3\envs\tensorflow1-gpu\lib\site-packages\tensorflow_core\python\framework\ops.py:1748) ]]
[[Mean/_73]]
(1) Unknown: Failed to get convolution algorithm. This is probably because cuDNN failed to initialize, so try looking to see if a warning log message was printed above.
[[node conv1_1/Conv2D (defined at C:\Users\manas\anaconda3\envs\tensorflow1-gpu\lib\site-packages\tensorflow_core\python\framework\ops.py:1748) ]]
0 successful operations.
0 derived errors ignored.
Original stack trace for 'conv1_1/Conv2D':
File "main_test.py", line 202, in <module>
run()
File "main_test.py", line 184, in run
input_image, keep_prob, vgg_layer3_out, vgg_layer4_out, vgg_layer7_out = load_vgg(sess, vgg_path)
File "main_test.py", line 36, in load_vgg
tf.saved_model.loader.load(sess, [vgg_tag], vgg_path)
File "C:\Users\manas\anaconda3\envs\tensorflow1-gpu\lib\site-packages\tensorflow_core\python\util\deprecation.py", line 324, in new_func
return func(*args, **kwargs)
File "C:\Users\manas\anaconda3\envs\tensorflow1-gpu\lib\site-packages\tensorflow_core\python\saved_model\loader_impl.py", line 269, in load
return loader.load(sess, tags, import_scope, **saver_kwargs)
File "C:\Users\manas\anaconda3\envs\tensorflow1-gpu\lib\site-packages\tensorflow_core\python\saved_model\loader_impl.py", line 422, in load
**saver_kwargs)
File "C:\Users\manas\anaconda3\envs\tensorflow1-gpu\lib\site-packages\tensorflow_core\python\saved_model\loader_impl.py", line 352, in load_graph
meta_graph_def, import_scope=import_scope, **saver_kwargs)
File "C:\Users\manas\anaconda3\envs\tensorflow1-gpu\lib\site-packages\tensorflow_core\python\training\saver.py", line 1477, in _import_meta_graph_with_return_elements
**kwargs))
File "C:\Users\manas\anaconda3\envs\tensorflow1-gpu\lib\site-packages\tensorflow_core\python\framework\meta_graph.py", line 809, in import_scoped_meta_graph_with_return_elements
return_elements=return_elements)
File "C:\Users\manas\anaconda3\envs\tensorflow1-gpu\lib\site-packages\tensorflow_core\python\util\deprecation.py", line 507, in new_func
return func(*args, **kwargs)
File "C:\Users\manas\anaconda3\envs\tensorflow1-gpu\lib\site-packages\tensorflow_core\python\framework\importer.py", line 405, in import_graph_def
producer_op_list=producer_op_list)
File "C:\Users\manas\anaconda3\envs\tensorflow1-gpu\lib\site-packages\tensorflow_core\python\framework\importer.py", line 517, in _import_graph_def_internal
_ProcessNewOps(graph)
File "C:\Users\manas\anaconda3\envs\tensorflow1-gpu\lib\site-packages\tensorflow_core\python\framework\importer.py", line 243, in _ProcessNewOps
for new_op in graph._add_new_tf_operations(compute_devices=False): # pylint: disable=protected-access
File "C:\Users\manas\anaconda3\envs\tensorflow1-gpu\lib\site-packages\tensorflow_core\python\framework\ops.py", line 3561, in _add_new_tf_operations
for c_op in c_api_util.new_tf_operations(self)
File "C:\Users\manas\anaconda3\envs\tensorflow1-gpu\lib\site-packages\tensorflow_core\python\framework\ops.py", line 3561, in <listcomp>
for c_op in c_api_util.new_tf_operations(self)
File "C:\Users\manas\anaconda3\envs\tensorflow1-gpu\lib\site-packages\tensorflow_core\python\framework\ops.py", line 3451, in _create_op_from_tf_operation
ret = Operation(c_op, self)
File "C:\Users\manas\anaconda3\envs\tensorflow1-gpu\lib\site-packages\tensorflow_core\python\framework\ops.py", line 1748, in __init__
self._traceback = tf_stack.extract_stack()
|
non_process
|
tensorflow does not work because of cudnn library issue please make sure that this is a build installation issue as per our we only address code doc bugs performance issues feature requests and build installation issues on github tag build template system information os platform and distribution e g linux ubuntu windows mobile device e g iphone pixel samsung galaxy if the issue happens on mobile device tensorflow installed from source or binary source tensorflow version python version installed using virtualenv pip conda pip bazel version if compiling from source gcc compiler version if compiling from source cuda cudnn version and gpu model and memory rtx vram describe the problem i first installed tf with cuda version and cudnn version i had to work with some old code of mine so i uninstalled the previous cuda deleted all the files and did a fresh install of the cuda and cudnn versions mentioned above along with tf when i try to run the tensorflow python program i keep getting the same error it has something to do with source compilation of the previous cudnn version i cannot seem to find how to undo this i tried reinstalling anaconda as well and checked all the environment variable paths of my system all of them seem to point to the right directories provide the exact sequence of commands steps that you executed before running into the problem i ran the file via command line using python main py any other info logs include any logs or source code that would be helpful to diagnose the problem if including tracebacks please include the full traceback large logs and files should be attached epoch i tensorflow stream executor platform default dso loader cc successfully opened dynamic library dll e tensorflow stream executor cuda cuda dnn cc loaded runtime cudnn library but source was compiled with cudnn library major and minor version needs to match or have higher minor version in case of cudnn or later version if using a binary install upgrade your cudnn library if building from sources make sure the library loaded at runtime is compatible with the version specified during compile configuration e tensorflow stream executor cuda cuda dnn cc loaded runtime cudnn library but source was compiled with cudnn library major and minor version needs to match or have higher minor version in case of cudnn or later version if using a binary install upgrade your cudnn library if building from sources make sure the library loaded at runtime is compatible with the version specified during compile configuration traceback most recent call last file c users manas envs gpu lib site packages tensorflow core python client session py line in do call return fn args file c users manas envs gpu lib site packages tensorflow core python client session py line in run fn target list run metadata file c users manas envs gpu lib site packages tensorflow core python client session py line in call tf sessionrun run metadata tensorflow python framework errors impl unknownerror root error s found unknown failed to get convolution algorithm this is probably because cudnn failed to initialize so try looking to see if a warning log message was printed above unknown failed to get convolution algorithm this is probably because cudnn failed to initialize so try looking to see if a warning log message was printed above successful operations derived errors ignored during handling of the above exception another exception occurred traceback most recent call last file main test py line in run file main test py line in run correct label 
keep prob learning rate file main test py line in train nn feed dict input image image correct label label keep prob learning rate file c users manas envs gpu lib site packages tensorflow core python client session py line in run run metadata ptr file c users manas envs gpu lib site packages tensorflow core python client session py line in run feed dict tensor options run metadata file c users manas envs gpu lib site packages tensorflow core python client session py line in do run run metadata file c users manas envs gpu lib site packages tensorflow core python client session py line in do call raise type e node def op message tensorflow python framework errors impl unknownerror root error s found unknown failed to get convolution algorithm this is probably because cudnn failed to initialize so try looking to see if a warning log message was printed above unknown failed to get convolution algorithm this is probably because cudnn failed to initialize so try looking to see if a warning log message was printed above successful operations derived errors ignored original stack trace for file main test py line in run file main test py line in run input image keep prob vgg out vgg out vgg out load vgg sess vgg path file main test py line in load vgg tf saved model loader load sess vgg path file c users manas envs gpu lib site packages tensorflow core python util deprecation py line in new func return func args kwargs file c users manas envs gpu lib site packages tensorflow core python saved model loader impl py line in load return loader load sess tags import scope saver kwargs file c users manas envs gpu lib site packages tensorflow core python saved model loader impl py line in load saver kwargs file c users manas envs gpu lib site packages tensorflow core python saved model loader impl py line in load graph meta graph def import scope import scope saver kwargs file c users manas envs gpu lib site packages tensorflow core python training saver py line in import meta graph with return elements kwargs file c users manas envs gpu lib site packages tensorflow core python framework meta graph py line in import scoped meta graph with return elements return elements return elements file c users manas envs gpu lib site packages tensorflow core python util deprecation py line in new func return func args kwargs file c users manas envs gpu lib site packages tensorflow core python framework importer py line in import graph def producer op list producer op list file c users manas envs gpu lib site packages tensorflow core python framework importer py line in import graph def internal processnewops graph file c users manas envs gpu lib site packages tensorflow core python framework importer py line in processnewops for new op in graph add new tf operations compute devices false pylint disable protected access file c users manas envs gpu lib site packages tensorflow core python framework ops py line in add new tf operations for c op in c api util new tf operations self file c users manas envs gpu lib site packages tensorflow core python framework ops py line in for c op in c api util new tf operations self file c users manas envs gpu lib site packages tensorflow core python framework ops py line in create op from tf operation ret operation c op self file c users manas envs gpu lib site packages tensorflow core python framework ops py line in init self traceback tf stack extract stack
| 0
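The root cause in the record above is a cuDNN runtime (7.4.2) older than the version TensorFlow was compiled against (7.6.0). A hedged sketch of how one might verify the runtime cuDNN version directly via ctypes, independent of TensorFlow; the library name is an assumption for Windows cuDNN 7 (on Linux it would be something like `libcudnn.so.7`), and the decoding follows cuDNN's documented `CUDNN_MAJOR*1000 + CUDNN_MINOR*100 + CUDNN_PATCHLEVEL` scheme.

```python
import ctypes

# Query the cuDNN runtime version to compare against the version
# TensorFlow was compiled with (7.6.0 in the log above).
cudnn = ctypes.CDLL("cudnn64_7.dll")  # assumed Windows cuDNN 7 DLL name
cudnn.cudnnGetVersion.restype = ctypes.c_size_t
version = cudnn.cudnnGetVersion()  # e.g. 7402 for 7.4.2, 7600 for 7.6.0
major, minor, patch = version // 1000, (version % 1000) // 100, version % 100
print(f"runtime cuDNN {major}.{minor}.{patch}")
```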
|
523,909
| 15,191,949,952
|
IssuesEvent
|
2021-02-15 20:57:49
|
apcountryman/picolibrary
|
https://api.github.com/repos/apcountryman/picolibrary
|
opened
|
Add I2C ping
|
priority-normal status-awaiting_development type-feature
|
Add I2C ping (`::picolibrary::I2C::ping()`). This function will allow the user to check if a device is nonresponsive. This function should have the following signature:
```c++
template<typename Controller>
auto ping( Controller & controller, Address address, Operation operation ) noexcept -> Result<Void, Error_Code>
```
|
1.0
|
Add I2C ping - Add I2C ping (`::picolibrary::I2C::ping()`). This function will allow the user to check if a device is nonresponsive. This function should have the following signature:
```c++
template<typename Controller>
auto ping( Controller & controller, Address address, Operation operation ) noexcept -> Result<Void, Error_Code>
```
|
non_process
|
add ping add ping picolibrary ping this function will allow the user to check if a device is nonresponsive this function should have the following signature c template auto ping controller controller address address operation operation noexcept result
| 0
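For illustration, a rough Python analog of the ping described above, for a Linux I2C bus rather than picolibrary's C++ API. This assumes the third-party `smbus2` package; a device that ACKs a one-byte read is considered responsive.

```python
from smbus2 import SMBus  # third-party package, assumed installed

# Rough analog of the I2C ping above: probe a device address and report
# whether it responds.
def i2c_ping(bus_number: int, address: int) -> bool:
    try:
        with SMBus(bus_number) as bus:
            bus.read_byte(address)  # raises OSError if the device NACKs
        return True
    except OSError:
        return False

# print(i2c_ping(1, 0x48))  # e.g. probe a sensor at address 0x48 on bus 1
```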
|
18,483
| 24,550,777,761
|
IssuesEvent
|
2022-10-12 12:27:00
|
GoogleCloudPlatform/fda-mystudies
|
https://api.github.com/repos/GoogleCloudPlatform/fda-mystudies
|
closed
|
[iOS] Hamburger menu icon > Sections are non functional in the following scenario
|
Bug P1 iOS Process: Fixed Process: Tested QA Process: Tested dev
|
Steps:
1. Install the app
2. Sign up or login to the app
3. Click on Hamburger menu icon
4. Minimize the app / lock the mobile
5. Maximize/unlock the mobile
6. Enter the passcode
7. Try to click on any section present on the hamburger menu
AR: All the sections are non functional
ER: All the sections should be functional and the participant should navigate to proper sections when they click on each section

|
3.0
|
[iOS] Hamburger menu icon > Sections are non functional in the following scenario - Steps:
1. Install the app
2. Sign up or login to the app
3. Click on Hamburger menu icon
4. Minimize the app / lock the mobile
5. Maximize/unlock the mobile
6. Enter the passcode
7. Try to click on any section present on the hamburger menu
AR: All the sections are non functional
ER: All the sections should be functional and the participant should navigate to proper sections when they click on each section

|
process
|
hamburger menu icon sections are non functional in the following scenario steps install the app sign up or login to the app click on hamburger menu icon minimize the app lock the mobile maximize unlock the mobile enter the passcode try to click on any section present on the hamburger menu ar all the sections are non functional er all the sections should be functional and the participant should navigate to proper sections when they click on each section
| 1
|
4,689
| 7,525,433,376
|
IssuesEvent
|
2018-04-13 10:33:09
|
ODiogoSilva/assemblerflow
|
https://api.github.com/repos/ODiogoSilva/assemblerflow
|
opened
|
Check for forbidden characters in groovy variables for configs
|
process
|
Characters such as "-" should be forbidden in parameter names
|
1.0
|
Check for forbidden characters in groovy variables for configs - Characters such as "-" should be forbidden in parameter names
|
process
|
check for forbidden characters in groovy variables for configs characters such as should be forbidden in parameter names
| 1
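A minimal sketch of the check proposed in the record above, assuming Groovy-identifier rules for parameter names (the exact set of forbidden characters is not specified in the record):

```python
import re

# Accept only parameter names that are valid Groovy identifiers, so
# characters such as "-" are rejected before they reach the configs.
VALID_PARAM = re.compile(r"[A-Za-z_][A-Za-z0-9_]*")

def check_param_name(name: str) -> None:
    if not VALID_PARAM.fullmatch(name):
        raise ValueError(f"forbidden character in parameter name: {name!r}")

check_param_name("min_coverage")    # ok
# check_param_name("min-coverage")  # raises ValueError
```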
|
18,788
| 11,049,204,806
|
IssuesEvent
|
2019-12-09 23:00:48
|
cityofaustin/atd-vz-data
|
https://api.github.com/repos/cityofaustin/atd-vz-data
|
opened
|
VZV: Contributing factors
|
Need: 3-Could Have Project: Vision Zero Viewer Service: PM Workgroup: VZ
|
Look into a "contributing factors" visualization. This type of attribution may be difficult to determine using CRIS quantitative data. I did find this 2016 Vision Zero article from KUT referencing contributing factors for traffic fatalities in Austin:
https://www.kut.org/post/what-exactly-vision-zero-plan-and-why-does-austin-need-one
|
1.0
|
VZV: Contributing factors - Look into a "contributing factors" visualization. This type of attribution may be difficult to determine using CRIS quantitative data. I did find this 2016 Vision Zero article from KUT referencing contributing factors for traffic fatalities in Austin:
https://www.kut.org/post/what-exactly-vision-zero-plan-and-why-does-austin-need-one
|
non_process
|
vzv contributing factors look into a contributing factors visualization this type of attribution may be difficult to determine using cris quantitative data i did find this vision zero article from kut referencing contributing factors for traffic fatalities in austin
| 0
|
10,956
| 13,758,456,260
|
IssuesEvent
|
2020-10-07 00:05:48
|
qgis/QGIS
|
https://api.github.com/repos/qgis/QGIS
|
closed
|
Processing's batch process performs badly when adding a large number of files
|
Bug Processing
|
When auto-filling the batch table with a large number of files, QGIS will freeze for a *long* time (e.g. on one machine, 3000 files = +/-20 minutes :scream: ).
Here's a hotspot graph output that indicates a couple of bottlenecks:

|
1.0
|
Processing's batch process performs badly when adding a large number of files - When auto-filling the batch table with a large number of files, QGIS will freeze for a *long* time (e.g. on one machine, 3000 files = +/-20 minutes :scream: ).
Here's a hotspot graph output that indicates a couple of bottlenecks:

|
process
|
processing s batch process performs badly when adding a large number of files when auto filling the batch table with a large number of files qgis will freeze for a long time e g on one machine files minutes scream here s a hotspot graph output that indicates a couple of bottlenecks
| 1
|
4,239
| 7,187,109,699
|
IssuesEvent
|
2018-02-02 02:59:51
|
Great-Hill-Corporation/quickBlocks
|
https://api.github.com/repos/Great-Hill-Corporation/quickBlocks
|
closed
|
Geth traces are different than Parity traces.
|
monitors-all status-inprocess type-enhancement
|
I know the exact format of the data I need. It's identical to that provided by Parity. It will be brutally slow.
|
1.0
|
Geth traces are different than Parity traces. - I know the exact format of the data I need. It's identical to that provided by Parity. It will be brutally slow.
|
process
|
geth traces are different than parity traces i know the exact format of the data i need it s identical to that provided by parity it will be brutally slow
| 1
|
3,001
| 5,996,505,373
|
IssuesEvent
|
2017-06-03 14:57:55
|
rg3/youtube-dl
|
https://api.github.com/repos/rg3/youtube-dl
|
closed
|
[Youtube] Metadata extracted from video title does not get written to extracted mp3 file
|
bug postprocessors
|
## Please follow the guide below
- You will be asked some questions and requested to provide some information, please read them **carefully** and answer honestly
- Put an `x` into all the boxes [ ] relevant to your _issue_ (like that [x])
- Use _Preview_ tab to see how your issue will actually look like
---
### Make sure you are using the _latest_ version: run `youtube-dl --version` and ensure your version is _2016.10.25_. If it's not read [this FAQ entry](https://github.com/rg3/youtube-dl/blob/master/README.md#how-do-i-update-youtube-dl) and update. Issues with outdated version will be rejected.
- [x] I've **verified** and **I assure** that I'm running youtube-dl **2016.10.25**
### Before submitting an _issue_ make sure you have:
- [x] At least skimmed through [README](https://github.com/rg3/youtube-dl/blob/master/README.md) and **most notably** [FAQ](https://github.com/rg3/youtube-dl#faq) and [BUGS](https://github.com/rg3/youtube-dl#bugs) sections
- [x] [Searched](https://github.com/rg3/youtube-dl/search?type=Issues) the bugtracker for similar issues including closed ones
### What is the purpose of your _issue_?
- [x] Bug report (encountered problems with youtube-dl)
- [ ] Site support request (request for adding support for a new site)
- [ ] Feature request (request for a new functionality)
- [x] Question
- [ ] Other
---
### The following sections concretize particular purposed issues, you can erase any section (the contents between triple ---) not applicable to your _issue_
---
### If the purpose of this _issue_ is a _bug report_, _site support request_ or you are not completely sure provide the full verbose output as follows:
Add `-v` flag to **your command line** you run youtube-dl with, copy the **whole** output and insert it here. It should look similar to one below (replace it with **your** log inserted between triple ```):
```
$ youtube-dl.exe https://www.youtube.com/watch?v=MQnyoAyxi8M -x --embed-thumbnail -o %(title)s.%(ext)s -i --metadata-from-title "%(artist)s - %(title)s" --add-metadata --audio-format "mp3" --audio-quality 0 -v
[debug] System config: []
[debug] User config: []
[debug] Command-line args: ['https://www.youtube.com/watch?v=MQnyoAyxi8M', '-x', '--embed-thumbnail', '-o', '%(title)s.%(ext)s', '-i', '--metadata-from-title', '%(artist)s - %(title)s', '--add-metadata', '--audio-format', 'mp3', '--audio-quality', '0', '-v']
[debug] Encodings: locale cp1252, fs mbcs, out cp437, pref cp1252
[debug] youtube-dl version 2016.10.25
[debug] Python version 3.4.4 - Windows-10-10.0.14393
[debug] exe versions: ffmpeg N-82080-g6969bed, ffprobe N-82080-g6969bed
[debug] Proxy map: {}
[youtube] MQnyoAyxi8M: Downloading webpage
[youtube] MQnyoAyxi8M: Downloading video info webpage
[youtube] MQnyoAyxi8M: Extracting video information
[youtube] MQnyoAyxi8M: Downloading MPD manifest
[youtube] MQnyoAyxi8M: Downloading thumbnail ...
[youtube] MQnyoAyxi8M: Writing thumbnail to: SRNO - Stay Sane (ft. Naaz).jpg
[debug] Invoking downloader on 'https://r1---sn-p5qlsnsr.googlevideo.com/videoplayback?id=3109f2a00cb18bc3&itag=251&source=youtube&requiressl=yes&pl=21&nh=IgpwcjA2LmxnYTA3KgkxMjcuMC4wLjE&ms=au&mv=m&mm=31&mn=sn-p5qlsnsr&initcwndbps=2887500&ratebypass=yes&mime=audio/webm&gir=yes&clen=3331531&lmt=1477323058844747&dur=194.001&upn=OyrV2DB99Sk&key=dg_yt0&signature=3AA78091F9D875574BAB0F9FF056721F7E641695.7E475D5D225E9606C26F8DA80DA895C913629954&mt=1477345239&ip=209.222.15.233&ipbits=0&expire=1477367062&sparams=ip,ipbits,expire,id,itag,source,requiressl,pl,nh,ms,mv,mm,mn,initcwndbps,ratebypass,mime,gir,clen,lmt,dur'
[download] Destination: SRNO - Stay Sane (ft. Naaz).webm
[download] 100% of 3.18MiB in 00:02
[fromtitle] parsed artist: SRNO
[fromtitle] parsed title: Stay Sane (ft. Naaz)
[ffmpeg] Adding metadata to 'SRNO - Stay Sane (ft. Naaz).webm'
[debug] ffmpeg command line: ffmpeg -y -i 'file:SRNO - Stay Sane (ft. Naaz).webm' -c copy -metadata 'description=Majestic Casual - Experience music in a new way.
● Spotify ▸ http://smarturl.it/majesticcspotify
● Facebook ▸ https://facebook.com/majesticcasual
● SoundCloud ▸ http://soundcloud.com/majesticcasual
● Instagram ▸ https://instagram.com/majesticcasual
● Twitter ▸ https://twitter.com/majesticcasual
● Snapchat ▸ '"'"'majesticcasual'"'"'
● Stream on Spotify ▸ http://smarturl.it/staysaneSPOTIFY
● Listen on SoundCloud ▸http://smarturl.it/staysaneSC
● Free Download ▸ http://smarturl.it/staysaneDL
Follow SRNO
● https://facebook.com/prodbysrno
● https://soundcloud.com/srno
● https://twitter.com/prodbysrno
Follow Naaz
● https://facebook.com/bitsofnaaz
● https://soundcloud.com/bitsofnaaz
● https://twitter.com/bitsofnaaz
Picture © WeliWaca Film Gallery
● https://www.facebook.com/weliwacagallery/' -metadata date=20161024 -metadata 'comment=Majestic Casual - Experience music in a new way.
● Spotify ▸ http://smarturl.it/majesticcspotify
● Facebook ▸ https://facebook.com/majesticcasual
● SoundCloud ▸ http://soundcloud.com/majesticcasual
● Instagram ▸ https://instagram.com/majesticcasual
● Twitter ▸ https://twitter.com/majesticcasual
● Snapchat ▸ '"'"'majesticcasual'"'"'
● Stream on Spotify ▸ http://smarturl.it/staysaneSPOTIFY
● Listen on SoundCloud ▸http://smarturl.it/staysaneSC
● Free Download ▸ http://smarturl.it/staysaneDL
Follow SRNO
● https://facebook.com/prodbysrno
● https://soundcloud.com/srno
● https://twitter.com/prodbysrno
Follow Naaz
● https://facebook.com/bitsofnaaz
● https://soundcloud.com/bitsofnaaz
● https://twitter.com/bitsofnaaz
Picture © WeliWaca Film Gallery
● https://www.facebook.com/weliwacagallery/' -metadata 'purl=https://www.youtube.com/watch?v=MQnyoAyxi8M' -metadata artist=SRNO -metadata 'title=Stay Sane (ft. Naaz)' 'file:SRNO - Stay Sane (ft. Naaz).temp.webm'
[debug] ffmpeg command line: ffprobe -show_streams 'file:SRNO - Stay Sane (ft. Naaz).webm'
[ffmpeg] Destination: SRNO - Stay Sane (ft. Naaz).mp3
[debug] ffmpeg command line: ffmpeg -y -i 'file:SRNO - Stay Sane (ft. Naaz).webm' -vn -acodec libmp3lame -q:a 0 'file:SRNO - Stay Sane (ft. Naaz).mp3'
Deleting original file SRNO - Stay Sane (ft. Naaz).webm (pass -k to keep)
[ffmpeg] Adding thumbnail to "SRNO - Stay Sane (ft. Naaz).mp3"
[debug] ffmpeg command line: ffmpeg -y -i 'file:SRNO - Stay Sane (ft. Naaz).mp3' -i 'file:SRNO - Stay Sane (ft. Naaz).jpg' -c copy -map 0 -map 1 -metadata:s:v 'title="Album cover"' -metadata:s:v 'comment="Cover (Front)"' 'file:SRNO - Stay Sane (ft. Naaz).temp.mp3'
```
---
### Description of your _issue_, suggested solution and other information
When trying to add metadata to a song extracted from a youtube video, the program correctly detects the artist and song name. FFMPEG shows it is writing metadata (artist and song title) but when the audio is extracted using -x, the resulting mp3 file does not have the artist and song title embedded in the metadata. I think the metadata is lost when the audio is extracted using -x. I did a search and people have been successful at achieving this in threads such as [(http://stackoverflow.com/questions/30376172/downloading-youtube-mp3-metadata-encoding-issue-python-youtube-dl-ffmpeg)] where the user is using a different approach (using a python application), but the command is similar. Is it possible to have youtube-dl command ffmpeg to write metadata to the extracted mp3 file rather than writing metadata to the downloaded webm video then converting to mp3? Or am I missing or incorrectly using an argument?
Thanks in advance
|
1.0
|
[Youtube] Metadata extracted from video title does not get written to extracted mp3 file - ## Please follow the guide below
- You will be asked some questions and requested to provide some information, please read them **carefully** and answer honestly
- Put an `x` into all the boxes [ ] relevant to your _issue_ (like that [x])
- Use _Preview_ tab to see how your issue will actually look like
---
### Make sure you are using the _latest_ version: run `youtube-dl --version` and ensure your version is _2016.10.25_. If it's not read [this FAQ entry](https://github.com/rg3/youtube-dl/blob/master/README.md#how-do-i-update-youtube-dl) and update. Issues with outdated version will be rejected.
- [x] I've **verified** and **I assure** that I'm running youtube-dl **2016.10.25**
### Before submitting an _issue_ make sure you have:
- [x] At least skimmed through [README](https://github.com/rg3/youtube-dl/blob/master/README.md) and **most notably** [FAQ](https://github.com/rg3/youtube-dl#faq) and [BUGS](https://github.com/rg3/youtube-dl#bugs) sections
- [x] [Searched](https://github.com/rg3/youtube-dl/search?type=Issues) the bugtracker for similar issues including closed ones
### What is the purpose of your _issue_?
- [x] Bug report (encountered problems with youtube-dl)
- [ ] Site support request (request for adding support for a new site)
- [ ] Feature request (request for a new functionality)
- [x] Question
- [ ] Other
---
### The following sections concretize particular purposed issues, you can erase any section (the contents between triple ---) not applicable to your _issue_
---
### If the purpose of this _issue_ is a _bug report_, _site support request_ or you are not completely sure provide the full verbose output as follows:
Add `-v` flag to **your command line** you run youtube-dl with, copy the **whole** output and insert it here. It should look similar to one below (replace it with **your** log inserted between triple ```):
```
$ youtube-dl.exe https://www.youtube.com/watch?v=MQnyoAyxi8M -x --embed-thumbnail -o %(title)s.%(ext)s -i --metadata-from-title "%(artist)s - %(title)s" --add-metadata --audio-format "mp3" --audio-quality 0 -v
[debug] System config: []
[debug] User config: []
[debug] Command-line args: ['https://www.youtube.com/watch?v=MQnyoAyxi8M', '-x', '--embed-thumbnail', '-o', '%(title)s.%(ext)s', '-i', '--metadata-from-title', '%(artist)s - %(title)s', '--add-metadata', '--audio-format', 'mp3', '--audio-quality', '0', '-v']
[debug] Encodings: locale cp1252, fs mbcs, out cp437, pref cp1252
[debug] youtube-dl version 2016.10.25
[debug] Python version 3.4.4 - Windows-10-10.0.14393
[debug] exe versions: ffmpeg N-82080-g6969bed, ffprobe N-82080-g6969bed
[debug] Proxy map: {}
[youtube] MQnyoAyxi8M: Downloading webpage
[youtube] MQnyoAyxi8M: Downloading video info webpage
[youtube] MQnyoAyxi8M: Extracting video information
[youtube] MQnyoAyxi8M: Downloading MPD manifest
[youtube] MQnyoAyxi8M: Downloading thumbnail ...
[youtube] MQnyoAyxi8M: Writing thumbnail to: SRNO - Stay Sane (ft. Naaz).jpg
[debug] Invoking downloader on 'https://r1---sn-p5qlsnsr.googlevideo.com/videoplayback?id=3109f2a00cb18bc3&itag=251&source=youtube&requiressl=yes&pl=21&nh=IgpwcjA2LmxnYTA3KgkxMjcuMC4wLjE&ms=au&mv=m&mm=31&mn=sn-p5qlsnsr&initcwndbps=2887500&ratebypass=yes&mime=audio/webm&gir=yes&clen=3331531&lmt=1477323058844747&dur=194.001&upn=OyrV2DB99Sk&key=dg_yt0&signature=3AA78091F9D875574BAB0F9FF056721F7E641695.7E475D5D225E9606C26F8DA80DA895C913629954&mt=1477345239&ip=209.222.15.233&ipbits=0&expire=1477367062&sparams=ip,ipbits,expire,id,itag,source,requiressl,pl,nh,ms,mv,mm,mn,initcwndbps,ratebypass,mime,gir,clen,lmt,dur'
[download] Destination: SRNO - Stay Sane (ft. Naaz).webm
[download] 100% of 3.18MiB in 00:02
[fromtitle] parsed artist: SRNO
[fromtitle] parsed title: Stay Sane (ft. Naaz)
[ffmpeg] Adding metadata to 'SRNO - Stay Sane (ft. Naaz).webm'
[debug] ffmpeg command line: ffmpeg -y -i 'file:SRNO - Stay Sane (ft. Naaz).webm' -c copy -metadata 'description=Majestic Casual - Experience music in a new way.
● Spotify ▸ http://smarturl.it/majesticcspotify
● Facebook ▸ https://facebook.com/majesticcasual
● SoundCloud ▸ http://soundcloud.com/majesticcasual
● Instagram ▸ https://instagram.com/majesticcasual
● Twitter ▸ https://twitter.com/majesticcasual
● Snapchat ▸ '"'"'majesticcasual'"'"'
● Stream on Spotify ▸ http://smarturl.it/staysaneSPOTIFY
● Listen on SoundCloud ▸http://smarturl.it/staysaneSC
● Free Download ▸ http://smarturl.it/staysaneDL
Follow SRNO
● https://facebook.com/prodbysrno
● https://soundcloud.com/srno
● https://twitter.com/prodbysrno
Follow Naaz
● https://facebook.com/bitsofnaaz
● https://soundcloud.com/bitsofnaaz
● https://twitter.com/bitsofnaaz
Picture © WeliWaca Film Gallery
● https://www.facebook.com/weliwacagallery/' -metadata date=20161024 -metadata 'comment=Majestic Casual - Experience music in a new way.
● Spotify ▸ http://smarturl.it/majesticcspotify
● Facebook ▸ https://facebook.com/majesticcasual
● SoundCloud ▸ http://soundcloud.com/majesticcasual
● Instagram ▸ https://instagram.com/majesticcasual
● Twitter ▸ https://twitter.com/majesticcasual
● Snapchat ▸ '"'"'majesticcasual'"'"'
● Stream on Spotify ▸ http://smarturl.it/staysaneSPOTIFY
● Listen on SoundCloud ▸http://smarturl.it/staysaneSC
● Free Download ▸ http://smarturl.it/staysaneDL
Follow SRNO
● https://facebook.com/prodbysrno
● https://soundcloud.com/srno
● https://twitter.com/prodbysrno
Follow Naaz
● https://facebook.com/bitsofnaaz
● https://soundcloud.com/bitsofnaaz
● https://twitter.com/bitsofnaaz
Picture © WeliWaca Film Gallery
● https://www.facebook.com/weliwacagallery/' -metadata 'purl=https://www.youtube.com/watch?v=MQnyoAyxi8M' -metadata artist=SRNO -metadata 'title=Stay Sane (ft. Naaz)' 'file:SRNO - Stay Sane (ft. Naaz).temp.webm'
[debug] ffmpeg command line: ffprobe -show_streams 'file:SRNO - Stay Sane (ft. Naaz).webm'
[ffmpeg] Destination: SRNO - Stay Sane (ft. Naaz).mp3
[debug] ffmpeg command line: ffmpeg -y -i 'file:SRNO - Stay Sane (ft. Naaz).webm' -vn -acodec libmp3lame -q:a 0 'file:SRNO - Stay Sane (ft. Naaz).mp3'
Deleting original file SRNO - Stay Sane (ft. Naaz).webm (pass -k to keep)
[ffmpeg] Adding thumbnail to "SRNO - Stay Sane (ft. Naaz).mp3"
[debug] ffmpeg command line: ffmpeg -y -i 'file:SRNO - Stay Sane (ft. Naaz).mp3' -i 'file:SRNO - Stay Sane (ft. Naaz).jpg' -c copy -map 0 -map 1 -metadata:s:v 'title="Album cover"' -metadata:s:v 'comment="Cover (Front)"' 'file:SRNO - Stay Sane (ft. Naaz).temp.mp3'
```
---
### Description of your _issue_, suggested solution and other information
When trying to add metadata to a song extracted from a youtube video, the program correctly detects the artist and song name. FFMPEG shows it is writing metadata (artist and song title) but when the audio is extracted using -x, the resulting mp3 file does not have the artist and song title embedded in the metadata. I think the metadata is lost when the audio is extracted using -x. I did a search and people have been successful at achieving this in threads such as [(http://stackoverflow.com/questions/30376172/downloading-youtube-mp3-metadata-encoding-issue-python-youtube-dl-ffmpeg)] where the user is using a different approach (using a python application), but the command is similar. Is it possible to have youtube-dl command ffmpeg to write metadata to the extracted mp3 file rather than writing metadata to the downloaded webm video then converting to mp3? Or am I missing or incorrectly using an argument?
Thanks in advance
|
process
|
metadata extracted from video title does not get written to extracted file please follow the guide below you will be asked some questions and requested to provide some information please read them carefully and answer honestly put an x into all the boxes relevant to your issue like that use preview tab to see how your issue will actually look like make sure you are using the latest version run youtube dl version and ensure your version is if it s not read and update issues with outdated version will be rejected i ve verified and i assure that i m running youtube dl before submitting an issue make sure you have at least skimmed through and most notably and sections the bugtracker for similar issues including closed ones what is the purpose of your issue bug report encountered problems with youtube dl site support request request for adding support for a new site feature request request for a new functionality question other the following sections concretize particular purposed issues you can erase any section the contents between triple not applicable to your issue if the purpose of this issue is a bug report site support request or you are not completely sure provide the full verbose output as follows add v flag to your command line you run youtube dl with copy the whole output and insert it here it should look similar to one below replace it with your log inserted between triple youtube dl exe x embed thumbnail o title s ext s i metadata from title artist s title s add metadata audio format audio quality v system config user config command line args encodings locale fs mbcs out pref youtube dl version python version windows exe versions ffmpeg n ffprobe n proxy map downloading webpage downloading video info webpage extracting video information downloading mpd manifest downloading thumbnail writing thumbnail to srno stay sane ft naaz jpg invoking downloader on destination srno stay sane ft naaz webm of in parsed artist srno parsed title stay sane ft naaz adding metadata to srno stay sane ft naaz webm ffmpeg command line ffmpeg y i file srno stay sane ft naaz webm c copy metadata description majestic casual experience music in a new way ● spotify ▸ ● facebook ▸ ● soundcloud ▸ ● instagram ▸ ● twitter ▸ ● snapchat ▸ majesticcasual ● stream on spotify ▸ ● listen on soundcloud ▸ ● free download ▸ follow srno ● ● ● follow naaz ● ● ● picture © weliwaca film gallery ● metadata date metadata comment majestic casual experience music in a new way ● spotify ▸ ● facebook ▸ ● soundcloud ▸ ● instagram ▸ ● twitter ▸ ● snapchat ▸ majesticcasual ● stream on spotify ▸ ● listen on soundcloud ▸ ● free download ▸ follow srno ● ● ● follow naaz ● ● ● picture © weliwaca film gallery ● metadata purl metadata artist srno metadata title stay sane ft naaz file srno stay sane ft naaz temp webm ffmpeg command line ffprobe show streams file srno stay sane ft naaz webm destination srno stay sane ft naaz ffmpeg command line ffmpeg y i file srno stay sane ft naaz webm vn acodec q a file srno stay sane ft naaz deleting original file srno stay sane ft naaz webm pass k to keep adding thumbnail to srno stay sane ft naaz ffmpeg command line ffmpeg y i file srno stay sane ft naaz i file srno stay sane ft naaz jpg c copy map map metadata s v title album cover metadata s v comment cover front file srno stay sane ft naaz temp description of your issue suggested solution and other information when trying to add metadata to a song extracted from a youtube video the program correctly detects the artist and song name ffmpeg shows it is writing metadata artist and song title but when the audio is extracted using x the resulting file does not have the artist and song title embedded in the metadata i think the metadata is lost when the audio is extracted using x i did a search and people have been successful at achieving this in threads such as where the user is using a different approach using a python application but the command is similar is it possible to have youtube dl command ffmpeg to write metadata to the extracted file rather than writing metadata to the downloaded webm video then converting to or am i missing or incorrectly using an argument thanks in advance
| 1
|
16,996
| 22,358,270,735
|
IssuesEvent
|
2022-06-15 17:43:22
|
googleapis/repo-automation-bots
|
https://api.github.com/repos/googleapis/repo-automation-bots
|
closed
|
sync-repo-settings: debug the edge cases that are responding with exceptions in sync-repo-settings
|
type: process bot: sync-repo-settings
|
To unblock failures we're seeing in our task queues, I've added back some error handling to `sync-repo-settings`:
https://github.com/googleapis/repo-automation-bots/pull/722
I haven't enumerated a list of which repos throw these specific errors, someone should take on the work to do this and see if we can remove additional catches.
@sofisl points out that GitHub's error codes might not be granular enough to truly identify certain types of errors, we should also consider whether we could look at more specific components of the payload to decide the type of error.
CC: @sofisl, @JustinBeckwith
|
1.0
|
sync-repo-settings: debug the edge cases that are responding with exceptions in sync-repo-settings - To unblock failures we're seeing in our task queues, I've added back some error handling to `sync-repo-settings`:
https://github.com/googleapis/repo-automation-bots/pull/722
I haven't enumerated a list of which repos throw these specific errors, someone should take on the work to do this and see if we can remove additional catches.
@sofisl points out that GitHub's error codes might not be granular enough to truly identify certain types of errors, we should also consider whether we could look at more specific components of the payload to decide the type of error.
CC: @sofisl, @JustinBeckwith
|
process
|
sync repo settings debug the edge cases that are responding with exceptions in sync repo settings to unblock failures we re seeing in our task queues i ve added back some error handling to sync repo settings i haven t enumerated a list of which repos throw these specific errors someone should take on the work to do this and see if we can remove additional catches sofisl points out that github s error codes might not be granular enough to truly identify certain types of errors we should also consider whether we could look at more specific components of the payload to decide the type of error cc sofisl justinbeckwith
| 1
|
23,556
| 4,027,909,732
|
IssuesEvent
|
2016-05-18 02:18:11
|
start-jsk/jsk_apc
|
https://api.github.com/repos/start-jsk/jsk_apc
|
closed
|
Test for SIB
|
test
|
We need tests for Segmentation In Bin. Without tests, it takes time to reproduce bugs.
|
1.0
|
Test for SIB - We need tests for Segmentation In Bin. Without tests, it takes time to reproduce bugs.
|
non_process
|
test for sib we need tests for segmentation in bin without tests it takes time to reproduce bugs
| 0
|
105,250
| 13,171,948,133
|
IssuesEvent
|
2020-08-11 17:33:44
|
FordLabs/PeopleMover
|
https://api.github.com/repos/FordLabs/PeopleMover
|
closed
|
Calendar Date Dropdown - Part 6: Making a new assignment for a person can affect their future assignments
|
:stop_sign: ! Design Needed Enhancement
|
# Feature Request
## Context
Not sure if this needed. The revert button on future dates may work just fine for this and having the pop-up may be annoying for people who want to play out two different versions of the future on days right after each other. Wait and see how calendar and reverting options are used.
## Story
**As a** People Allocator
**I want** to know about a person's future reassignments when making a new one
**So** that I can decide if I still want that person to move in the future
----
## Acceptance Criteria
**Given** Person 1 is on product A on day 1
**When** I move person 1 from product A to product C on day 3, and then I move person 1 from product A to product B on day 2
**Then** I see a prompt to let me either cancel the reassignment scheduled for day 3 (so person 1 remains on product B), or maintain the reassignment for day 3 (so person 1 will continue to shift to product C).
## Notes
This prompt will appear and apply to ALL reassignments (cancel or maintain) for 30 days into the future for that person.
|
1.0
|
Calendar Date Dropdown - Part 6: Making a new assignment for a person can affect their future assignments - # Feature Request
## Context
Not sure if this needed. The revert button on future dates may work just fine for this and having the pop-up may be annoying for people who want to play out two different versions of the future on days right after each other. Wait and see how calendar and reverting options are used.
## Story
**As a** People Allocator
**I want** to know about a person's future reassignments when making a new one
**So** that I can decide if I still want that person to move in the future
----
## Acceptance Criteria
**Given** Person 1 is on product A on day 1
**When** I move person 1 from product A to product C on day 3, and then I move person 1 from product A to product B on day 2
**Then** I see a prompt to let me either cancel the reassignment scheduled for day 3 (so person 1 remains on product B), or maintain the reassignment for day 3 (so person 1 will continue to shift to product C).
## Notes
This prompt will appear and apply to ALL reassignments (cancel or maintain) for 30 days into the future for that person.
|
non_process
|
calendar date dropdown part making a new assignment for a person can affect their future assignments feature request context not sure if this needed the revert button on future dates may work just fine for this and having the pop up may be annoying for people who want to play out two different versions of the future on days right after each other wait and see how calendar and reverting options are used story as a people allocator i want to know about a person s future reassignments when making a new one so that i can decide if i still want that person to move in the future acceptance criteria given person is on product a on day when i move person from product a to product c on day and then i move person from product a to product b on day then i see a prompt to let me either cancel the reassignment scheduled for day so person remains on product b or maintain the reassignment for day so person will continue to shift to product c notes this prompt will appear and apply to all reassignments cancel or maintain for days into the future for that person
| 0
|
625,961
| 19,783,327,367
|
IssuesEvent
|
2022-01-18 01:29:59
|
robotframework/robotframework
|
https://api.github.com/repos/robotframework/robotframework
|
closed
|
New `format` option to `Log To Console` to control alignment, fill characters, and so on
|
enhancement priority: medium good first issue acknowledge
|
In some cases its look nice to print msg on center of screen with some padding
default values for new arguments
align=left
padding=' '
example use after changes
Log To Console    ${msg}    align=center    padding=*
current workaround but it`s not easy to get actual screen width
Log To Console    ${msg.center(78, '*')}
links
https://www.programiz.com/python-programming/methods/string/center
https://docs.python.org/3/library/stdtypes.html#str.center
|
1.0
|
New `format` option to `Log To Console` to control alignment, fill characters, and so on - In some cases its look nice to print msg on center of screen with some padding
default values for new arguments
align=left
padding=' '
example use after changes
Log To Console    ${msg}    align=center    padding=*
current workaround but it`s not easy to get actual screen width
Log To Console    ${msg.center(78, '*')}
links
https://www.programiz.com/python-programming/methods/string/center
https://docs.python.org/3/library/stdtypes.html#str.center
|
non_process
|
new format option to log to console to control alignment fill characters and so on in some cases its look nice to print msg on center of screen with some padding default values for new arguments align left padding example use after changes log to console msg align center padding current workaround but it s not easy to get actual screen width log to console msg center links
| 0
|
19,178
| 6,678,625,802
|
IssuesEvent
|
2017-10-05 14:48:15
|
osresearch/heads
|
https://api.github.com/repos/osresearch/heads
|
opened
|
NERF: make the Makefile.nerf board independent
|
buildsystem enhancement nerf
|
The `Makefile.nerf` has lots of R630 specific offsets and names. These should be moved to the per-board config file so that other boards are easier to support.
|
1.0
|
NERF: make the Makefile.nerf board independent - The `Makefile.nerf` has lots of R630 specific offsets and names. These should be moved to the per-board config file so that other boards are easier to support.
|
non_process
|
nerf make the makefile nerf board independent the makefile nerf has lots of specific offsets and names these should be moved to the per board config file so that other boards are easier to support
| 0
|
70,933
| 15,158,492,205
|
IssuesEvent
|
2021-02-12 01:18:13
|
jgeraigery/augment-watson-services-to-whatsapp
|
https://api.github.com/repos/jgeraigery/augment-watson-services-to-whatsapp
|
opened
|
CVE-2020-36242 (High) detected in cryptography-3.1.1-cp27-cp27mu-manylinux2010_x86_64.whl
|
security vulnerability
|
## CVE-2020-36242 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>cryptography-3.1.1-cp27-cp27mu-manylinux2010_x86_64.whl</b></p></summary>
<p>cryptography is a package which provides cryptographic recipes and primitives to Python developers.</p>
<p>Library home page: <a href="https://files.pythonhosted.org/packages/ed/b8/79858c68bafa7517c20859334ad270fe0c174a65c1ab80a9b8b377e7584b/cryptography-3.1.1-cp27-cp27mu-manylinux2010_x86_64.whl">https://files.pythonhosted.org/packages/ed/b8/79858c68bafa7517c20859334ad270fe0c174a65c1ab80a9b8b377e7584b/cryptography-3.1.1-cp27-cp27mu-manylinux2010_x86_64.whl</a></p>
<p>Path to dependency file: augment-watson-services-to-whatsapp/backend-for-whatsapp/requirements.txt</p>
<p>Path to vulnerable library: augment-watson-services-to-whatsapp/backend-for-whatsapp/requirements.txt</p>
<p>
Dependency Hierarchy:
- pyOpenSSL-19.1.0-py2.py3-none-any.whl (Root Library)
- :x: **cryptography-3.1.1-cp27-cp27mu-manylinux2010_x86_64.whl** (Vulnerable Library)
<p>Found in base branch: <b>master</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
In the cryptography package before 3.3.2 for Python, certain sequences of update calls to symmetrically encrypt multi-GB values could result in an integer overflow and buffer overflow, as demonstrated by the Fernet class.
<p>Publish Date: 2021-02-07
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-36242>CVE-2020-36242</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>9.1</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: High
- Integrity Impact: None
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://github.com/pyca/cryptography/blob/master/CHANGELOG.rst">https://github.com/pyca/cryptography/blob/master/CHANGELOG.rst</a></p>
<p>Release Date: 2021-02-07</p>
<p>Fix Resolution: cryptography - 3.3.2</p>
</p>
</details>
<p></p>
<!-- <REMEDIATE>{"isOpenPROnVulnerability":true,"isPackageBased":true,"isDefaultBranch":true,"packages":[{"packageType":"Python","packageName":"cryptography","packageVersion":"3.1.1","packageFilePaths":["/backend-for-whatsapp/requirements.txt"],"isTransitiveDependency":true,"dependencyTree":"pyOpenSSL:19.1.0;cryptography:3.1.1","isMinimumFixVersionAvailable":true,"minimumFixVersion":"cryptography - 3.3.2"}],"baseBranches":["master"],"vulnerabilityIdentifier":"CVE-2020-36242","vulnerabilityDetails":"In the cryptography package before 3.3.2 for Python, certain sequences of update calls to symmetrically encrypt multi-GB values could result in an integer overflow and buffer overflow, as demonstrated by the Fernet class.","vulnerabilityUrl":"https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-36242","cvss3Severity":"high","cvss3Score":"9.1","cvss3Metrics":{"A":"High","AC":"Low","PR":"None","S":"Unchanged","C":"High","UI":"None","AV":"Network","I":"None"},"extraData":{}}</REMEDIATE> -->
|
True
|
CVE-2020-36242 (High) detected in cryptography-3.1.1-cp27-cp27mu-manylinux2010_x86_64.whl - ## CVE-2020-36242 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>cryptography-3.1.1-cp27-cp27mu-manylinux2010_x86_64.whl</b></p></summary>
<p>cryptography is a package which provides cryptographic recipes and primitives to Python developers.</p>
<p>Library home page: <a href="https://files.pythonhosted.org/packages/ed/b8/79858c68bafa7517c20859334ad270fe0c174a65c1ab80a9b8b377e7584b/cryptography-3.1.1-cp27-cp27mu-manylinux2010_x86_64.whl">https://files.pythonhosted.org/packages/ed/b8/79858c68bafa7517c20859334ad270fe0c174a65c1ab80a9b8b377e7584b/cryptography-3.1.1-cp27-cp27mu-manylinux2010_x86_64.whl</a></p>
<p>Path to dependency file: augment-watson-services-to-whatsapp/backend-for-whatsapp/requirements.txt</p>
<p>Path to vulnerable library: augment-watson-services-to-whatsapp/backend-for-whatsapp/requirements.txt</p>
<p>
Dependency Hierarchy:
- pyOpenSSL-19.1.0-py2.py3-none-any.whl (Root Library)
- :x: **cryptography-3.1.1-cp27-cp27mu-manylinux2010_x86_64.whl** (Vulnerable Library)
<p>Found in base branch: <b>master</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
In the cryptography package before 3.3.2 for Python, certain sequences of update calls to symmetrically encrypt multi-GB values could result in an integer overflow and buffer overflow, as demonstrated by the Fernet class.
<p>Publish Date: 2021-02-07
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-36242>CVE-2020-36242</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>9.1</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: High
- Integrity Impact: None
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://github.com/pyca/cryptography/blob/master/CHANGELOG.rst">https://github.com/pyca/cryptography/blob/master/CHANGELOG.rst</a></p>
<p>Release Date: 2021-02-07</p>
<p>Fix Resolution: cryptography - 3.3.2</p>
</p>
</details>
<p></p>
<!-- <REMEDIATE>{"isOpenPROnVulnerability":true,"isPackageBased":true,"isDefaultBranch":true,"packages":[{"packageType":"Python","packageName":"cryptography","packageVersion":"3.1.1","packageFilePaths":["/backend-for-whatsapp/requirements.txt"],"isTransitiveDependency":true,"dependencyTree":"pyOpenSSL:19.1.0;cryptography:3.1.1","isMinimumFixVersionAvailable":true,"minimumFixVersion":"cryptography - 3.3.2"}],"baseBranches":["master"],"vulnerabilityIdentifier":"CVE-2020-36242","vulnerabilityDetails":"In the cryptography package before 3.3.2 for Python, certain sequences of update calls to symmetrically encrypt multi-GB values could result in an integer overflow and buffer overflow, as demonstrated by the Fernet class.","vulnerabilityUrl":"https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-36242","cvss3Severity":"high","cvss3Score":"9.1","cvss3Metrics":{"A":"High","AC":"Low","PR":"None","S":"Unchanged","C":"High","UI":"None","AV":"Network","I":"None"},"extraData":{}}</REMEDIATE> -->
|
non_process
|
cve high detected in cryptography whl cve high severity vulnerability vulnerable library cryptography whl cryptography is a package which provides cryptographic recipes and primitives to python developers library home page a href path to dependency file augment watson services to whatsapp backend for whatsapp requirements txt path to vulnerable library augment watson services to whatsapp backend for whatsapp requirements txt dependency hierarchy pyopenssl none any whl root library x cryptography whl vulnerable library found in base branch master vulnerability details in the cryptography package before for python certain sequences of update calls to symmetrically encrypt multi gb values could result in an integer overflow and buffer overflow as demonstrated by the fernet class publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction none scope unchanged impact metrics confidentiality impact high integrity impact none availability impact high for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution cryptography isopenpronvulnerability true ispackagebased true isdefaultbranch true packages istransitivedependency true dependencytree pyopenssl cryptography isminimumfixversionavailable true minimumfixversion cryptography basebranches vulnerabilityidentifier cve vulnerabilitydetails in the cryptography package before for python certain sequences of update calls to symmetrically encrypt multi gb values could result in an integer overflow and buffer overflow as demonstrated by the fernet class vulnerabilityurl
| 0
|
20,317
| 26,960,423,948
|
IssuesEvent
|
2023-02-08 17:44:43
|
googleapis/java-iam
|
https://api.github.com/repos/googleapis/java-iam
|
closed
|
Dependency Dashboard
|
type: process api: iam priority: p4
|
This issue lists Renovate updates and detected dependencies. Read the [Dependency Dashboard](https://docs.renovatebot.com/key-concepts/dashboard/) docs to learn more.
## Repository problems
These problems occurred while renovating this repository.
- WARN: RepoCacheS3.getCacheFolder() - appending missing trailing slash to pathname
## Open
These updates have all been created already. Click a checkbox below to force a retry/rebase of any.
- [ ] <!-- rebase-branch=renovate/org.apache.maven.plugins-maven-project-info-reports-plugin-3.x -->[build(deps): update dependency org.apache.maven.plugins:maven-project-info-reports-plugin to v3.4.2](../pull/595)
- [ ] <!-- rebase-branch=renovate/com.google.cloud-google-iam-policy-parent-1.x -->[chore(deps): update dependency com.google.cloud:google-iam-policy-parent to v1.8.0](../pull/602)
- [ ] <!-- rebase-branch=renovate/com.google.cloud-google-cloud-shared-dependencies-3.x -->[deps: update dependency com.google.cloud:google-cloud-shared-dependencies to v3.2.0](../pull/599)
- [ ] <!-- rebase-all-open-prs -->**Click on this checkbox to rebase all open PRs at once**
## Detected dependencies
<details><summary>github-actions</summary>
<blockquote>
<details><summary>.github/workflows/approve-readme.yaml</summary>
- `actions/github-script v6`
</details>
<details><summary>.github/workflows/auto-release.yaml</summary>
- `actions/github-script v6`
</details>
<details><summary>.github/workflows/ci.yaml</summary>
- `actions/checkout v3`
- `actions/setup-java v3`
- `actions/checkout v3`
- `actions/setup-java v3`
- `actions/checkout v3`
- `actions/setup-java v3`
- `actions/checkout v3`
- `actions/setup-java v3`
- `actions/checkout v3`
- `actions/setup-java v3`
</details>
</blockquote>
</details>
<details><summary>maven</summary>
<blockquote>
<details><summary>google-iam-policy/pom.xml</summary>
- `com.google.cloud:google-iam-policy-parent 1.7.1-SNAPSHOT`
- `junit:junit 4.13.2`
</details>
<details><summary>pom.xml</summary>
- `com.google.cloud:google-cloud-shared-config 1.5.5`
- `com.google.cloud:google-cloud-shared-dependencies 3.1.1`
- `junit:junit 4.13.2`
- `org.apache.maven.plugins:maven-project-info-reports-plugin 3.4.1`
- `org.apache.maven.plugins:maven-javadoc-plugin 3.4.1`
</details>
</blockquote>
</details>
---
- [ ] <!-- manual job -->Check this box to trigger a request for Renovate to run again on this repository
|
1.0
|
Dependency Dashboard - This issue lists Renovate updates and detected dependencies. Read the [Dependency Dashboard](https://docs.renovatebot.com/key-concepts/dashboard/) docs to learn more.
## Repository problems
These problems occurred while renovating this repository.
- WARN: RepoCacheS3.getCacheFolder() - appending missing trailing slash to pathname
## Open
These updates have all been created already. Click a checkbox below to force a retry/rebase of any.
- [ ] <!-- rebase-branch=renovate/org.apache.maven.plugins-maven-project-info-reports-plugin-3.x -->[build(deps): update dependency org.apache.maven.plugins:maven-project-info-reports-plugin to v3.4.2](../pull/595)
- [ ] <!-- rebase-branch=renovate/com.google.cloud-google-iam-policy-parent-1.x -->[chore(deps): update dependency com.google.cloud:google-iam-policy-parent to v1.8.0](../pull/602)
- [ ] <!-- rebase-branch=renovate/com.google.cloud-google-cloud-shared-dependencies-3.x -->[deps: update dependency com.google.cloud:google-cloud-shared-dependencies to v3.2.0](../pull/599)
- [ ] <!-- rebase-all-open-prs -->**Click on this checkbox to rebase all open PRs at once**
## Detected dependencies
<details><summary>github-actions</summary>
<blockquote>
<details><summary>.github/workflows/approve-readme.yaml</summary>
- `actions/github-script v6`
</details>
<details><summary>.github/workflows/auto-release.yaml</summary>
- `actions/github-script v6`
</details>
<details><summary>.github/workflows/ci.yaml</summary>
- `actions/checkout v3`
- `actions/setup-java v3`
- `actions/checkout v3`
- `actions/setup-java v3`
- `actions/checkout v3`
- `actions/setup-java v3`
- `actions/checkout v3`
- `actions/setup-java v3`
- `actions/checkout v3`
- `actions/setup-java v3`
</details>
</blockquote>
</details>
<details><summary>maven</summary>
<blockquote>
<details><summary>google-iam-policy/pom.xml</summary>
- `com.google.cloud:google-iam-policy-parent 1.7.1-SNAPSHOT`
- `junit:junit 4.13.2`
</details>
<details><summary>pom.xml</summary>
- `com.google.cloud:google-cloud-shared-config 1.5.5`
- `com.google.cloud:google-cloud-shared-dependencies 3.1.1`
- `junit:junit 4.13.2`
- `org.apache.maven.plugins:maven-project-info-reports-plugin 3.4.1`
- `org.apache.maven.plugins:maven-javadoc-plugin 3.4.1`
</details>
</blockquote>
</details>
---
- [ ] <!-- manual job -->Check this box to trigger a request for Renovate to run again on this repository
|
process
|
dependency dashboard this issue lists renovate updates and detected dependencies read the docs to learn more repository problems these problems occurred while renovating this repository warn getcachefolder appending missing trailing slash to pathname open these updates have all been created already click a checkbox below to force a retry rebase of any pull pull pull click on this checkbox to rebase all open prs at once detected dependencies github actions github workflows approve readme yaml actions github script github workflows auto release yaml actions github script github workflows ci yaml actions checkout actions setup java actions checkout actions setup java actions checkout actions setup java actions checkout actions setup java actions checkout actions setup java maven google iam policy pom xml com google cloud google iam policy parent snapshot junit junit pom xml com google cloud google cloud shared config com google cloud google cloud shared dependencies junit junit org apache maven plugins maven project info reports plugin org apache maven plugins maven javadoc plugin check this box to trigger a request for renovate to run again on this repository
| 1
|
18,156
| 24,192,990,105
|
IssuesEvent
|
2022-09-23 19:42:18
|
B2o5T/graphql-eslint
|
https://api.github.com/repos/B2o5T/graphql-eslint
|
closed
|
Type error in `RuleDocsInfo`
|
kind/enhancement process/candidate
|
### Issue workflow progress
<!-- PLEASE DO NOT REMOVE THIS SECTION -->
_Progress of the issue based on the [Contributor Workflow](https://github.com/the-guild-org/Stack/blob/master/CONTRIBUTING.md#a-typical-contributor-workflow)_
- [ ] 1. The issue provides a reproduction available on GitHub, Stackblitz or CodeSandbox
> Please make sure the graphql-eslint version under `package.json` matches yours.
- [ ] 2. A failing test has been provided
- [ ] 3. A local solution has been provided
- [ ] 4. A pull request is pending review
---
**Describe the bug**
Hi 👋
This is just a small thing that I discovered when I wanted to add urls to our rules and TS didn't like it with the `strict` flag on, because `url` wasn't part of `RuleDocsInfo`.
I took a closer look and discovered that the problem is caused by `docs` being optional, and therefore the type returned by `Omit<Rule.RuleMetaData['docs'], 'category'>` is `{}`.
```
export type RuleDocsInfo<T> = {
  docs: Omit<Rule.RuleMetaData['docs'], 'category'> & { <<<<<
    category: CategoryType | CategoryType[];
    requiresSchema?: true;
    requiresSiblings?: true;
    ...
  };
};
```
**To Reproduce**

**Expected behavior**
I think you wanted to keep all the properties from `docs` except for `category`, so that should be happening.
**Proposed solution**
[Adding](https://www.typescriptlang.org/play?#code/JYOwLgpgTgZghgYwgAgEoFcA2ECyExwAicByA3gFDLXIAmA9ggM4D8AXOVTdwPQBUfZAAco9AG7BaEJsjAALFEzn0oYOtIRRgQsMHohk9GLIXIoWFKBMoA2uewzQUgB4BdABRywYIUzY8eaUxQMAA6FQBzHgZmHntpHgBKZD4eLm5qKSZNbV19dmQmMC0QCOQAH2R0ECkYUAhaAG50jP5BJiEIBGA66WtkBThaUDLqqShkAHc5YAQ5fvjkYBlgooalg3lbeMcaiDdPb19-QKZg8HCoKJimOItb5NSW7gQSCAiVAE8CopKyyrGEDqIAazQyNDaSxk03wCgmW2QAAMAET7SA1JjIjios4hNhQLr0AC2RIge1oyMRwlEnVUnw2yDgyBsCH0dQi6CgJD0Bjq2A8Xh8fgCQRCl2ujFu6CY0AAtBzJBAeKyQOzOSMAMRosnDUqylVqrl5ECyvnSZJkuAAIwcCwsKTS4OoBNZJJ1DQKVvo9GwcAMAL2wNBzwhAkKnW6vRkCIAqqgADKMtTTWbzBEwLCYOiMdCk8Dc-TIV4GK0oRBIJgy2gOkPUTmYH7FEYVKqB+pNWvISEdLo9YB9GFbCY7It+sz4TkGJjoCIRaTGmTuWpwLBgaP0ZDwTAypbGYnAbwNR6Op3T2fznme72+-2t2rtsE0AC+LVJlbgc9YHDIzLfTA-EAAJK0BwvwjK4oFNqUyBPi2gJBh23B1M41rYAUyKslIyItsiKaQB0iAQNhAb3iCiE0HIcBMAAyjOn4LlePoQGOJFAg+FAvhQYCfJ0yAAHIQJMAAqPEoAAvMgADyRIHgAPKgEAAI7oMABK0PJFh4AQxAEAAfDYADkNwGa4AA0yAGa8kAfFAnwGbpQA) `Required` solves it.

**Environment:**
- OS: MacOS Monterey
- `@graphql-eslint/eslint-plugin`: 3.10.7
- Node.js: 16
|
1.0
|
Type error in `RuleDocsInfo` - ### Issue workflow progress
<!-- PLEASE DO NOT REMOVE THIS SECTION -->
_Progress of the issue based on the [Contributor Workflow](https://github.com/the-guild-org/Stack/blob/master/CONTRIBUTING.md#a-typical-contributor-workflow)_
- [ ] 1. The issue provides a reproduction available on GitHub, Stackblitz or CodeSandbox
> Please make sure the graphql-eslint version under `package.json` matches yours.
- [ ] 2. A failing test has been provided
- [ ] 3. A local solution has been provided
- [ ] 4. A pull request is pending review
---
**Describe the bug**
Hi 👋
This is just a small thing that I discovered when I wanted to add urls to our rules and TS didn't like it with the `strict` flag on, because `url` wasn't part of `RuleDocsInfo`.
I took a closer look and discovered that the problem is caused by `docs` being optional, and therefore the type returned by `Omit<Rule.RuleMetaData['docs'], 'category'>` is `{}`.
```
export type RuleDocsInfo<T> = {
  docs: Omit<Rule.RuleMetaData['docs'], 'category'> & { <<<<<
    category: CategoryType | CategoryType[];
    requiresSchema?: true;
    requiresSiblings?: true;
    ...
  };
};
```
**To Reproduce**

**Expected behavior**
I think you wanted to keep all the properties from `docs` except for `category`, so that should be happening.
**Proposed solution**
[Adding](https://www.typescriptlang.org/play?#code/JYOwLgpgTgZghgYwgAgEoFcA2ECyExwAicByA3gFDLXIAmA9ggM4D8AXOVTdwPQBUfZAAco9AG7BaEJsjAALFEzn0oYOtIRRgQsMHohk9GLIXIoWFKBMoA2uewzQUgB4BdABRywYIUzY8eaUxQMAA6FQBzHgZmHntpHgBKZD4eLm5qKSZNbV19dmQmMC0QCOQAH2R0ECkYUAhaAG50jP5BJiEIBGA66WtkBThaUDLqqShkAHc5YAQ5fvjkYBlgooalg3lbeMcaiDdPb19-QKZg8HCoKJimOItb5NSW7gQSCAiVAE8CopKyyrGEDqIAazQyNDaSxk03wCgmW2QAAMAET7SA1JjIjios4hNhQLr0AC2RIge1oyMRwlEnVUnw2yDgyBsCH0dQi6CgJD0Bjq2A8Xh8fgCQRCl2ujFu6CY0AAtBzJBAeKyQOzOSMAMRosnDUqylVqrl5ECyvnSZJkuAAIwcCwsKTS4OoBNZJJ1DQKVvo9GwcAMAL2wNBzwhAkKnW6vRkCIAqqgADKMtTTWbzBEwLCYOiMdCk8Dc-TIV4GK0oRBIJgy2gOkPUTmYH7FEYVKqB+pNWvISEdLo9YB9GFbCY7It+sz4TkGJjoCIRaTGmTuWpwLBgaP0ZDwTAypbGYnAbwNR6Op3T2fznme72+-2t2rtsE0AC+LVJlbgc9YHDIzLfTA-EAAJK0BwvwjK4oFNqUyBPi2gJBh23B1M41rYAUyKslIyItsiKaQB0iAQNhAb3iCiE0HIcBMAAyjOn4LlePoQGOJFAg+FAvhQYCfJ0yAAHIQJMAAqPEoAAvMgADyRIHgAPKgEAAI7oMABK0PJFh4AQxAEAAfDYADkNwGa4AA0yAGa8kAfFAnwGbpQA) `Required` solves it.

**Environment:**
- OS: MacOS Monterey
- `@graphql-eslint/eslint-plugin`: 3.10.7
- Node.js: 16
|
process
|
type error in ruledocsinfo issue workflow progress progress of the issue based on the the issue provides a reproduction available on github stackblitz or codesandbox please make sure the graphql eslint version under package json matches yours a failing test has been provided a local solution has been provided a pull request is pending review describe the bug hi 👋 this is just a small thing that i discovered when i wanted to add urls to our rules and ts didn t like it with the strict flag on because url wasn t part of ruledocsinfo i took a closer look and discovered that the problem is caused by docs being optional and therefore the type returned by omit is export type ruledocsinfo docs omit category categorytype categorytype requiresschema true requiressiblings true to reproduce expected behavior i think you wanted to keep all the properties from docs except for category so that should be happening proposed solution required solves it environment os macos monterey graphql eslint eslint plugin node js
| 1
|
2,979
| 5,965,488,694
|
IssuesEvent
|
2017-05-30 11:45:34
|
Hurence/logisland
|
https://api.github.com/repos/Hurence/logisland
|
closed
|
add multiGet elastic search processor
|
feature processor
|
# Expected behavior and actual behavior.
# Steps to reproduce the problem.
# Specifications like the version of the project, operating system, or hardware.
|
1.0
|
add multiGet elastic search processor - # Expected behavior and actual behavior.
# Steps to reproduce the problem.
# Specifications like the version of the project, operating system, or hardware.
|
process
|
add multiget elastic search processor expected behavior and actual behavior steps to reproduce the problem specifications like the version of the project operating system or hardware
| 1
|
142,763
| 19,102,990,749
|
IssuesEvent
|
2021-11-30 01:52:43
|
Nehamaefi/Efigit
|
https://api.github.com/repos/Nehamaefi/Efigit
|
closed
|
CVE-2019-12814 (Medium) detected in jackson-databind-2.9.4.jar - autoclosed
|
security vulnerability
|
## CVE-2019-12814 - Medium Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>jackson-databind-2.9.4.jar</b></p></summary>
<p>General data-binding functionality for Jackson: works on core streaming API</p>
<p>Library home page: <a href="http://github.com/FasterXML/jackson">http://github.com/FasterXML/jackson</a></p>
<p>Path to dependency file: Efigit/apps/rest-showcase/pom.xml</p>
<p>Path to vulnerable library: /root/.m2/repository/com/fasterxml/jackson/core/jackson-databind/2.9.4/jackson-databind-2.9.4.jar,epository/com/fasterxml/jackson/core/jackson-databind/2.9.4/jackson-databind-2.9.4.jar</p>
<p>
Dependency Hierarchy:
- :x: **jackson-databind-2.9.4.jar** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/Nehamaefi/Efigit/commit/72c969e9db891da76ca4ea40803b5d450c212b27">72c969e9db891da76ca4ea40803b5d450c212b27</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
A Polymorphic Typing issue was discovered in FasterXML jackson-databind 2.x through 2.9.9. When Default Typing is enabled (either globally or for a specific property) for an externally exposed JSON endpoint and the service has JDOM 1.x or 2.x jar in the classpath, an attacker can send a specifically crafted JSON message that allows them to read arbitrary local files on the server.
<p>Publish Date: 2019-06-19
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2019-12814>CVE-2019-12814</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>5.9</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: High
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: High
- Integrity Impact: None
- Availability Impact: None
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://github.com/FasterXML/jackson-databind/issues/2341">https://github.com/FasterXML/jackson-databind/issues/2341</a></p>
<p>Release Date: 2019-06-19</p>
<p>Fix Resolution: 2.7.9.6, 2.8.11.4, 2.9.9.1, 2.10.0</p>
</p>
</details>
<p></p>
***
<!-- REMEDIATE-OPEN-PR-START -->
- [ ] Check this box to open an automated fix PR
<!-- REMEDIATE-OPEN-PR-END -->
<!-- <REMEDIATE>{"isOpenPROnVulnerability":false,"isPackageBased":true,"isDefaultBranch":true,"packages":[{"packageType":"Java","groupId":"com.fasterxml.jackson.core","packageName":"jackson-databind","packageVersion":"2.9.4","isTransitiveDependency":false,"dependencyTree":"com.fasterxml.jackson.core:jackson-databind:2.9.4","isMinimumFixVersionAvailable":true,"minimumFixVersion":"2.7.9.6, 2.8.11.4, 2.9.9.1, 2.10.0"}],"vulnerabilityIdentifier":"CVE-2019-12814","vulnerabilityDetails":"A Polymorphic Typing issue was discovered in FasterXML jackson-databind 2.x through 2.9.9. When Default Typing is enabled (either globally or for a specific property) for an externally exposed JSON endpoint and the service has JDOM 1.x or 2.x jar in the classpath, an attacker can send a specifically crafted JSON message that allows them to read arbitrary local files on the server.","vulnerabilityUrl":"https://vuln.whitesourcesoftware.com/vulnerability/CVE-2019-12814","cvss3Severity":"medium","cvss3Score":"5.9","cvss3Metrics":{"A":"None","AC":"High","PR":"None","S":"Unchanged","C":"High","UI":"None","AV":"Network","I":"None"},"extraData":{}}</REMEDIATE> -->
|
True
|
CVE-2019-12814 (Medium) detected in jackson-databind-2.9.4.jar - autoclosed - ## CVE-2019-12814 - Medium Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>jackson-databind-2.9.4.jar</b></p></summary>
<p>General data-binding functionality for Jackson: works on core streaming API</p>
<p>Library home page: <a href="http://github.com/FasterXML/jackson">http://github.com/FasterXML/jackson</a></p>
<p>Path to dependency file: Efigit/apps/rest-showcase/pom.xml</p>
<p>Path to vulnerable library: /root/.m2/repository/com/fasterxml/jackson/core/jackson-databind/2.9.4/jackson-databind-2.9.4.jar,epository/com/fasterxml/jackson/core/jackson-databind/2.9.4/jackson-databind-2.9.4.jar</p>
<p>
Dependency Hierarchy:
- :x: **jackson-databind-2.9.4.jar** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/Nehamaefi/Efigit/commit/72c969e9db891da76ca4ea40803b5d450c212b27">72c969e9db891da76ca4ea40803b5d450c212b27</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
A Polymorphic Typing issue was discovered in FasterXML jackson-databind 2.x through 2.9.9. When Default Typing is enabled (either globally or for a specific property) for an externally exposed JSON endpoint and the service has JDOM 1.x or 2.x jar in the classpath, an attacker can send a specifically crafted JSON message that allows them to read arbitrary local files on the server.
<p>Publish Date: 2019-06-19
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2019-12814>CVE-2019-12814</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>5.9</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: High
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: High
- Integrity Impact: None
- Availability Impact: None
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://github.com/FasterXML/jackson-databind/issues/2341">https://github.com/FasterXML/jackson-databind/issues/2341</a></p>
<p>Release Date: 2019-06-19</p>
<p>Fix Resolution: 2.7.9.6, 2.8.11.4, 2.9.9.1, 2.10.0</p>
</p>
</details>
<p></p>
***
<!-- REMEDIATE-OPEN-PR-START -->
- [ ] Check this box to open an automated fix PR
<!-- REMEDIATE-OPEN-PR-END -->
<!-- <REMEDIATE>{"isOpenPROnVulnerability":false,"isPackageBased":true,"isDefaultBranch":true,"packages":[{"packageType":"Java","groupId":"com.fasterxml.jackson.core","packageName":"jackson-databind","packageVersion":"2.9.4","isTransitiveDependency":false,"dependencyTree":"com.fasterxml.jackson.core:jackson-databind:2.9.4","isMinimumFixVersionAvailable":true,"minimumFixVersion":"2.7.9.6, 2.8.11.4, 2.9.9.1, 2.10.0"}],"vulnerabilityIdentifier":"CVE-2019-12814","vulnerabilityDetails":"A Polymorphic Typing issue was discovered in FasterXML jackson-databind 2.x through 2.9.9. When Default Typing is enabled (either globally or for a specific property) for an externally exposed JSON endpoint and the service has JDOM 1.x or 2.x jar in the classpath, an attacker can send a specifically crafted JSON message that allows them to read arbitrary local files on the server.","vulnerabilityUrl":"https://vuln.whitesourcesoftware.com/vulnerability/CVE-2019-12814","cvss3Severity":"medium","cvss3Score":"5.9","cvss3Metrics":{"A":"None","AC":"High","PR":"None","S":"Unchanged","C":"High","UI":"None","AV":"Network","I":"None"},"extraData":{}}</REMEDIATE> -->
|
non_process
|
cve medium detected in jackson databind jar autoclosed cve medium severity vulnerability vulnerable library jackson databind jar general data binding functionality for jackson works on core streaming api library home page a href path to dependency file efigit apps rest showcase pom xml path to vulnerable library root repository com fasterxml jackson core jackson databind jackson databind jar epository com fasterxml jackson core jackson databind jackson databind jar dependency hierarchy x jackson databind jar vulnerable library found in head commit a href vulnerability details a polymorphic typing issue was discovered in fasterxml jackson databind x through when default typing is enabled either globally or for a specific property for an externally exposed json endpoint and the service has jdom x or x jar in the classpath an attacker can send a specifically crafted json message that allows them to read arbitrary local files on the server publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity high privileges required none user interaction none scope unchanged impact metrics confidentiality impact high integrity impact none availability impact none for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution check this box to open an automated fix pr isopenpronvulnerability false ispackagebased true isdefaultbranch true packages vulnerabilityidentifier cve vulnerabilitydetails a polymorphic typing issue was discovered in fasterxml jackson databind x through when default typing is enabled either globally or for a specific property for an externally exposed json endpoint and the service has jdom x or x jar in the classpath an attacker can send a specifically crafted json message that allows them to read arbitrary local files on the server vulnerabilityurl
| 0
|
20,270
| 26,900,029,999
|
IssuesEvent
|
2023-02-06 15:04:05
|
AvaloniaUI/Avalonia
|
https://api.github.com/repos/AvaloniaUI/Avalonia
|
closed
|
Underline, Strikethrough et al. Broken in TextBlock
|
bug release-blocker area-textprocessing
|
**Describe the bug**
Looks like the underline, strikethrough, and baseline text decorations are broken in current master for TextBlock.
**To Reproduce**
Steps to reproduce the behavior:
1. Go to ControlCatalog TextBlock page
**Expected behavior**
The mentioned decorations should render through the middle of the text or below it.
**Screenshots**

**Desktop (please complete the following information):**
- OS: Windows 10
- Version Master f38b7b223e889d38ba3e04b6aa18275ce79b3ffb
**Additional context**
Add any other context about the problem here.
|
1.0
|
Underline, Strikethrough et al. Broken in TextBlock - **Describe the bug**
Looks like the underline, strikethrough, and baseline text decorations are broken in current master for TextBlock.
**To Reproduce**
Steps to reproduce the behavior:
1. Go to ControlCatalog TextBlock page
**Expected behavior**
The mentioned decorations should render through the middle of the text or below it.
**Screenshots**

**Desktop (please complete the following information):**
- OS: Windows 10
- Version Master f38b7b223e889d38ba3e04b6aa18275ce79b3ffb
**Additional context**
Add any other context about the problem here.
|
process
|
underline strikethrough et al broken in textblock describe the bug looks like underline strikethrough and baseline text decoration is broken in current master for textblock to reproduce steps to reproduce the behavior go to controlcatalog textblock page expected behavior mentioned decorations should render in the middle or below text screenshots desktop please complete the following information os windows version master additional context add any other context about the problem here
| 1
|
81,579
| 10,150,219,836
|
IssuesEvent
|
2019-08-05 17:04:28
|
OpenLiberty/open-liberty
|
https://api.github.com/repos/OpenLiberty/open-liberty
|
closed
|
Introspector for Overlay Containers
|
design-issue
|
Add an introspector that shows the currently active root overlay containers, including the URLs of the base container, the URLs of the directory overlay container, and a listing of the data stored in the overlay container's non-persistent cache. (Adaptable containers implement a non-persistent cache on overlay containers.)
The listing of active overlay containers provides a top-level view of root application containers.
The listing of data from the non-persistent cache shows key generated application data, since each application has a root adaptable container, which in turn has a root overlay container with a non-persistent cache inside it.
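As a rough sketch of the shape such an introspector could take (hedged: the `OverlayContainer` interface below is an illustrative stand-in, not Liberty's actual SPI, and the output format is invented):
```java
import java.io.PrintWriter;
import java.net.URL;
import java.util.List;
import java.util.Map;

// Hypothetical stand-ins for the real adaptable/overlay container types.
interface OverlayContainer {
    List<URL> getBaseURLs();                      // URLs of the base container
    List<URL> getOverlayURLs();                   // URLs of the directory overlay
    Map<String, Object> getNonPersistentCache();  // per-application generated data
}

final class OverlayIntrospector {
    // Emit one line per root overlay container, then its cached entries.
    static void introspect(List<OverlayContainer> roots, PrintWriter out) {
        for (OverlayContainer c : roots) {
            out.println("base=" + c.getBaseURLs() + " overlay=" + c.getOverlayURLs());
            c.getNonPersistentCache()
             .forEach((key, value) -> out.println("  cache " + key + " -> " + String.valueOf(value)));
        }
    }
}
```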
|
1.0
|
Introspector for Overlay Containers - Add an introspector that shows the currently active root overlay containers, including the URLs of the base container, the URLs of the directory overlay container, and a listing of the data stored in the overlay container's non-persistent cache. (Adaptable containers implement a non-persistent cache on overlay containers.)
The listing of active overlay containers provides a top-level view of root application containers.
The listing of data from the non-persistent cache shows key generated application data, since each application has a root adaptable container, which in turn has a root overlay container with a non-persistent cache inside it.
|
non_process
|
introspector for overlay containers add an introspector which will show currently active root overlay containers including the urls of the base container and the urls of the directory overlay container and including a listing of data stored in the non persistent cache of the overlay container adaptable containers implement a non persistent cache on overlay containers the listing of active overlay containers provides a top level view of root application containers the listing of data from the non persistent cache shows key application data which has been generated for applications since each application has a root adaptable container which has a root overlay container and a non persistent cache within the overlay container
| 0
|
122,800
| 4,845,398,482
|
IssuesEvent
|
2016-11-10 08:02:29
|
myui/hivemall
|
https://api.github.com/repos/myui/hivemall
|
opened
|
IP-clearance (SGA) from TD contributors
|
discussions high-priority
|
To Hivemall contributors from the current and past Treasure Data's employees (@naritta @L3Sota @takuti @amaya382 @NaokiStones @ryukobayashi )
[In the process](http://incubator.apache.org/guides/mentor.html#initial-ip-clearance) of Apache Incubation, we need to clear all IPs of Hivemall.
https://issues.apache.org/jira/browse/HIVEMALL-9
Instead of collecting [SGAs](https://www.apache.org/licenses/software-grant-template.pdf) from each one of you, I'm considering collecting the [attached SGA](https://github.com/myui/hivemall/files/582724/software-grant-td.pdf) from Treasure Data, assuming the company holds all copyrights of your contributions to Hivemall made during your work at Treasure Data.
If you agree with it, please reply in this thread. _I need your reply to confirm it, as evidence._
Alternatively, you can become a committer of Hivemall by signing the I-CLA [3], though [community voting is required](http://incubator.apache.org/guides/ppmc.html) to become a committer. BTW, I'll invite some of you as committers, because there are some [pending PRs](https://github.com/myui/hivemall/pulls) that are not yet merged and need your help to get merged.
Thanks in advance,
Makoto
|
1.0
|
IP-clearance (SGA) from TD contributors - To Hivemall contributors from the current and past Treasure Data's employees (@naritta @L3Sota @takuti @amaya382 @NaokiStones @ryukobayashi )
[In the process](http://incubator.apache.org/guides/mentor.html#initial-ip-clearance) of Apache Incubation, we need to clear all IPs of Hivemall.
https://issues.apache.org/jira/browse/HIVEMALL-9
Instead of collecting [SGAs](https://www.apache.org/licenses/software-grant-template.pdf) from each one of you, I'm considering collecting the [attached SGA](https://github.com/myui/hivemall/files/582724/software-grant-td.pdf) from Treasure Data, assuming the company holds all copyrights of your contributions to Hivemall made during your work at Treasure Data.
If you agree with it, please reply in this thread. _I need your reply to confirm it, as evidence._
Alternatively, you can become a committer of Hivemall by signing the I-CLA [3], though [community voting is required](http://incubator.apache.org/guides/ppmc.html) to become a committer. BTW, I'll invite some of you as committers, because there are some [pending PRs](https://github.com/myui/hivemall/pulls) that are not yet merged and need your help to get merged.
Thanks in advance,
Makoto
|
non_process
|
ip clearance sga from td contributors to hivemall contributors from the current and past treasure data s employees naritta takuti naokistones ryukobayashi of apache incubation we need to clear all ips of hivemall instead of collecting from each one of you i m considering to collect the from treasure data assuming the company holds all copyrights of your contributions to hivemall for your works at treasure data if you agree with it please reply in this thread i need your reply to confirm it for an evidence alternatively you can be a committer of hivemall signing to i cla though to be a committer btw i ll invite some of you as committers because there are some that is not yet merged and needs your help for merging them thanks in advance makoto
| 0
|
33,427
| 7,715,258,660
|
IssuesEvent
|
2018-05-23 06:56:39
|
zeebe-io/zeebe
|
https://api.github.com/repos/zeebe-io/zeebe
|
closed
|
If a node leaves the cluster but is not known by the current node, a NullPointerException is thrown
|
bug code gossip ready
|
```
13:20:40.424 [actor-runner-broker] INFO io.zeebe.gossip - Remove member 'localhost:41016', status = LEAVE, gossip-term: [epoch=1517573756451, heartBeat=2]
13:20:40.426 [actor-runner-broker] DEBUG io.zeebe.broker.clustering - Remove member null from member list.
java.lang.NullPointerException
at io.zeebe.broker.clustering.management.memberList.ClusterMemberListManager$MembershipListener.lambda$onRemove$1(ClusterMemberListManager.java:154)
at org.agrona.concurrent.ManyToOneConcurrentArrayQueue.drain(ManyToOneConcurrentArrayQueue.java:113)
at org.agrona.concurrent.ManyToOneConcurrentArrayQueue.drain(ManyToOneConcurrentArrayQueue.java:88)
at io.zeebe.util.DeferredCommandContext.doWork(DeferredCommandContext.java:66)
at io.zeebe.broker.clustering.management.memberList.ClusterMemberListManager.doWork(ClusterMemberListManager.java:101)
at io.zeebe.broker.clustering.management.ClusterManager.doWork(ClusterManager.java:189)
at io.zeebe.util.actor.ActorRunner.tryRunActor(ActorRunner.java:180)
at io.zeebe.util.actor.ActorRunner.runActor(ActorRunner.java:153)
at io.zeebe.util.actor.ActorRunner.doWork(ActorRunner.java:129)
at io.zeebe.util.actor.ActorRunner.doWorkUntilClose(ActorRunner.java:86)
at io.zeebe.util.LogUtil.doWithMDC(LogUtil.java:34)
at io.zeebe.util.actor.ActorRunner.run(ActorRunner.java:77)
at java.lang.Thread.run(Thread.java:748)
```
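The trace suggests the LEAVE event reaches the member-list manager for a peer the local node never learned about, so the lookup yields null and the removal lambda dereferences it. A hedged sketch of the guard (the types below are illustrative stand-ins, not Zeebe's actual classes):
```java
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;

// Illustrative member list; Zeebe's real ClusterMemberListManager differs.
final class MemberList {
    private final Map<String, Object> members = new ConcurrentHashMap<>();

    void onRemove(String memberId) {
        Object member = members.remove(memberId);
        if (member == null) {
            // LEAVE for a peer we never gossiped with: log and ignore instead
            // of passing null further down the removal pipeline.
            System.out.println("Ignoring LEAVE for unknown member " + memberId);
            return;
        }
        System.out.println("Removed member " + memberId);
    }
}
```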
|
1.0
|
If a node leaves the cluster but is not known by the current node, a NullPointerException is thrown - ```
13:20:40.424 [actor-runner-broker] INFO io.zeebe.gossip - Remove member 'localhost:41016', status = LEAVE, gossip-term: [epoch=1517573756451, heartBeat=2]
13:20:40.426 [actor-runner-broker] DEBUG io.zeebe.broker.clustering - Remove member null from member list.
java.lang.NullPointerException
at io.zeebe.broker.clustering.management.memberList.ClusterMemberListManager$MembershipListener.lambda$onRemove$1(ClusterMemberListManager.java:154)
at org.agrona.concurrent.ManyToOneConcurrentArrayQueue.drain(ManyToOneConcurrentArrayQueue.java:113)
at org.agrona.concurrent.ManyToOneConcurrentArrayQueue.drain(ManyToOneConcurrentArrayQueue.java:88)
at io.zeebe.util.DeferredCommandContext.doWork(DeferredCommandContext.java:66)
at io.zeebe.broker.clustering.management.memberList.ClusterMemberListManager.doWork(ClusterMemberListManager.java:101)
at io.zeebe.broker.clustering.management.ClusterManager.doWork(ClusterManager.java:189)
at io.zeebe.util.actor.ActorRunner.tryRunActor(ActorRunner.java:180)
at io.zeebe.util.actor.ActorRunner.runActor(ActorRunner.java:153)
at io.zeebe.util.actor.ActorRunner.doWork(ActorRunner.java:129)
at io.zeebe.util.actor.ActorRunner.doWorkUntilClose(ActorRunner.java:86)
at io.zeebe.util.LogUtil.doWithMDC(LogUtil.java:34)
at io.zeebe.util.actor.ActorRunner.run(ActorRunner.java:77)
at java.lang.Thread.run(Thread.java:748)
```
|
non_process
|
if a node leaves the cluster but is not know by the current node a nullpointerexception is thrown info io zeebe gossip remove member localhost status leave gossip term debug io zeebe broker clustering remove member null from member list java lang nullpointerexception at io zeebe broker clustering management memberlist clustermemberlistmanager membershiplistener lambda onremove clustermemberlistmanager java at org agrona concurrent manytooneconcurrentarrayqueue drain manytooneconcurrentarrayqueue java at org agrona concurrent manytooneconcurrentarrayqueue drain manytooneconcurrentarrayqueue java at io zeebe util deferredcommandcontext dowork deferredcommandcontext java at io zeebe broker clustering management memberlist clustermemberlistmanager dowork clustermemberlistmanager java at io zeebe broker clustering management clustermanager dowork clustermanager java at io zeebe util actor actorrunner tryrunactor actorrunner java at io zeebe util actor actorrunner runactor actorrunner java at io zeebe util actor actorrunner dowork actorrunner java at io zeebe util actor actorrunner doworkuntilclose actorrunner java at io zeebe util logutil dowithmdc logutil java at io zeebe util actor actorrunner run actorrunner java at java lang thread run thread java
| 0
|
307,875
| 26,569,177,572
|
IssuesEvent
|
2023-01-21 00:29:15
|
unifyai/ivy
|
https://api.github.com/repos/unifyai/ivy
|
closed
|
Fix jax_numpy_creation.test_jax_numpy_ones
|
JAX Frontend Sub Task Failing Test
|
| Backend | Status |
|---|---|
|tensorflow|<a href="https://github.com/unifyai/ivy/actions/runs/3952846902/jobs/6768439780" rel="noopener noreferrer" target="_blank"><img src=https://img.shields.io/badge/-failure-red></a>
|torch|<a href="https://github.com/unifyai/ivy/actions/runs/3952846902/jobs/6768451977" rel="noopener noreferrer" target="_blank"><img src=https://img.shields.io/badge/-failure-red></a>
|numpy|<a href="https://github.com/unifyai/ivy/actions/runs/3952846902/jobs/6768447530" rel="noopener noreferrer" target="_blank"><img src=https://img.shields.io/badge/-failure-red></a>
|jax|<a href="https://github.com/unifyai/ivy/actions/runs/3952846902/jobs/6768457852" rel="noopener noreferrer" target="_blank"><img src=https://img.shields.io/badge/-failure-red></a>
<details>
<summary>Not found</summary>
Not found
</details>
<details>
<summary>Not found</summary>
Not found
</details>
<details>
<summary>Not found</summary>
Not found
</details>
<details>
<summary>Not found</summary>
Not found
</details>
|
1.0
|
Fix jax_numpy_creation.test_jax_numpy_ones - | Backend | Status |
|---|---|
|tensorflow|<a href="https://github.com/unifyai/ivy/actions/runs/3952846902/jobs/6768439780" rel="noopener noreferrer" target="_blank"><img src=https://img.shields.io/badge/-failure-red></a>
|torch|<a href="https://github.com/unifyai/ivy/actions/runs/3952846902/jobs/6768451977" rel="noopener noreferrer" target="_blank"><img src=https://img.shields.io/badge/-failure-red></a>
|numpy|<a href="https://github.com/unifyai/ivy/actions/runs/3952846902/jobs/6768447530" rel="noopener noreferrer" target="_blank"><img src=https://img.shields.io/badge/-failure-red></a>
|jax|<a href="https://github.com/unifyai/ivy/actions/runs/3952846902/jobs/6768457852" rel="noopener noreferrer" target="_blank"><img src=https://img.shields.io/badge/-failure-red></a>
<details>
<summary>Not found</summary>
Not found
</details>
<details>
<summary>Not found</summary>
Not found
</details>
<details>
<summary>Not found</summary>
Not found
</details>
<details>
<summary>Not found</summary>
Not found
</details>
|
non_process
|
fix jax numpy creation test jax numpy ones tensorflow img src torch img src numpy img src jax img src not found not found not found not found not found not found not found not found
| 0
|
254,086
| 27,348,944,204
|
IssuesEvent
|
2023-02-27 08:05:40
|
uriel-naor/ISSUES
|
https://api.github.com/repos/uriel-naor/ISSUES
|
closed
|
express-3.0.1.tgz: 11 vulnerabilities (highest severity is: 7.5) - autoclosed
|
Mend: dependency security vulnerability
|
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>express-3.0.1.tgz</b></p></summary>
<p>Sinatra inspired web development framework</p>
<p>Library home page: <a href="https://registry.npmjs.org/express/-/express-3.0.1.tgz">https://registry.npmjs.org/express/-/express-3.0.1.tgz</a></p>
<p>Path to dependency file: /package.json</p>
<p>Path to vulnerable library: /node_modules/express/package.json</p>
<p>
<p>Found in HEAD commit: <a href="https://github.com/uriel-naor/ISSUES/commit/c6ee538a7e7096e6b666146190edf80cc89a4ca9">c6ee538a7e7096e6b666146190edf80cc89a4ca9</a></p></details>
## Vulnerabilities
| CVE | Severity | <img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS | Dependency | Type | Fixed in (express version) | Fix PR available |
| ------------- | ------------- | ----- | ----- | ----- | ------------- | --- |
| [CVE-2017-16138](https://www.mend.io/vulnerability-database/CVE-2017-16138) | <img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> High | 7.5 | mime-1.2.6.tgz | Transitive | N/A* | ❌ |
| [CVE-2017-16119](https://www.mend.io/vulnerability-database/CVE-2017-16119) | <img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> High | 7.5 | fresh-0.1.0.tgz | Transitive | N/A* | ❌ |
| [CVE-2014-6394](https://www.mend.io/vulnerability-database/CVE-2014-6394) | <img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> High | 7.3 | send-0.1.0.tgz | Transitive | 3.21.0 | ✅ |
| [CVE-2013-7370](https://www.mend.io/vulnerability-database/CVE-2013-7370) | <img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Medium | 6.1 | connect-2.6.2.tgz | Transitive | 3.21.0 | ✅ |
| [CVE-2014-6393](https://www.mend.io/vulnerability-database/CVE-2014-6393) | <img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Medium | 6.1 | express-3.0.1.tgz | Direct | 3.21.0 | ✅ |
| [WS-2013-0004](https://github.com/senchalabs/connect/commit/126187c4e12162e231b87350740045e5bb06e93a) | <img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Medium | 6.1 | connect-2.6.2.tgz | Transitive | 3.21.0 | ✅ |
| [CVE-2013-7371](https://www.mend.io/vulnerability-database/CVE-2013-7371) | <img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Medium | 6.1 | connect-2.6.2.tgz | Transitive | 3.21.0 | ✅ |
| [CVE-2018-3717](https://www.mend.io/vulnerability-database/CVE-2018-3717) | <img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Medium | 5.4 | connect-2.6.2.tgz | Transitive | 3.21.0 | ✅ |
| [CVE-2015-8859](https://www.mend.io/vulnerability-database/CVE-2015-8859) | <img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Medium | 5.3 | send-0.1.0.tgz | Transitive | 3.21.0 | ✅ |
| [CVE-2014-7191](https://www.mend.io/vulnerability-database/CVE-2014-7191) | <img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Medium | 5.3 | qs-0.5.1.tgz | Transitive | 3.21.0 | ✅ |
| [CVE-2016-1000236](https://www.mend.io/vulnerability-database/CVE-2016-1000236) | <img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Medium | 4.4 | cookie-signature-0.0.1.tgz | Transitive | 3.21.0 | ✅ |
<p>*For some transitive vulnerabilities, there is no version of direct dependency with a fix. Check the section "Details" below to see if there is a version of transitive dependency where vulnerability is fixed.</p>
## Details
<details>
<summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> CVE-2017-16138</summary>
### Vulnerable Library - <b>mime-1.2.6.tgz</b></p>
<p>A comprehensive library for mime-type mapping</p>
<p>Library home page: <a href="https://registry.npmjs.org/mime/-/mime-1.2.6.tgz">https://registry.npmjs.org/mime/-/mime-1.2.6.tgz</a></p>
<p>Path to dependency file: /package.json</p>
<p>Path to vulnerable library: /node_modules/mime/package.json</p>
<p>
Dependency Hierarchy:
- express-3.0.1.tgz (Root Library)
- send-0.1.0.tgz
- :x: **mime-1.2.6.tgz** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/uriel-naor/ISSUES/commit/c6ee538a7e7096e6b666146190edf80cc89a4ca9">c6ee538a7e7096e6b666146190edf80cc89a4ca9</a></p>
<p>Found in base branch: <b>main</b></p>
</p>
<p></p>
### Vulnerability Details
<p>
The mime module < 1.4.1, 2.0.1, 2.0.2 is vulnerable to regular expression denial of service when a mime lookup is performed on untrusted user input.
<p>Publish Date: 2018-06-07
<p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2017-16138>CVE-2017-16138</a></p>
</p>
<p></p>
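For a feel of this failure mode (a generic illustration of catastrophic backtracking, not mime's actual pattern): nested quantifiers make matching time roughly double with each extra character of an almost-matching input, so a single lookup can pin the event loop.
```java
import java.util.Arrays;
import java.util.regex.Pattern;

public final class RedosDemo {
    public static void main(String[] args) {
        // Classic exponential-backtracking shape; unrelated to mime's real regex.
        Pattern evil = Pattern.compile("(a+)+$");
        char[] run = new char[26];
        Arrays.fill(run, 'a');
        String almost = new String(run) + "!"; // each extra 'a' roughly doubles the time
        long t0 = System.nanoTime();
        boolean matched = evil.matcher(almost).matches();
        System.out.printf("matched=%b in %d ms%n", matched, (System.nanoTime() - t0) / 1_000_000);
    }
}
```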
### CVSS 3 Score Details (<b>7.5</b>)
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: None
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
<p></p>
### Suggested Fix
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2017-16138">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2017-16138</a></p>
<p>Release Date: 2018-06-07</p>
<p>Fix Resolution: 1.4.1,2.0.3</p>
</p>
<p></p>
</details><details>
<summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> CVE-2017-16119</summary>
### Vulnerable Library - <b>fresh-0.1.0.tgz</b></p>
<p>HTTP response freshness testing</p>
<p>Library home page: <a href="https://registry.npmjs.org/fresh/-/fresh-0.1.0.tgz">https://registry.npmjs.org/fresh/-/fresh-0.1.0.tgz</a></p>
<p>Path to dependency file: /package.json</p>
<p>Path to vulnerable library: /node_modules/fresh/package.json</p>
<p>
Dependency Hierarchy:
- express-3.0.1.tgz (Root Library)
- :x: **fresh-0.1.0.tgz** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/uriel-naor/ISSUES/commit/c6ee538a7e7096e6b666146190edf80cc89a4ca9">c6ee538a7e7096e6b666146190edf80cc89a4ca9</a></p>
<p>Found in base branch: <b>main</b></p>
</p>
<p></p>
### Vulnerability Details
<p>
Fresh is a module used by the Express.js framework for HTTP response freshness testing. It is vulnerable to a regular expression denial of service when it is passed specially crafted input to parse. This causes the event loop to be blocked causing a denial of service condition.
<p>Publish Date: 2018-06-07
<p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2017-16119>CVE-2017-16119</a></p>
</p>
<p></p>
### CVSS 3 Score Details (<b>7.5</b>)
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: None
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
<p></p>
### Suggested Fix
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://www.npmjs.com/advisories/526">https://www.npmjs.com/advisories/526</a></p>
<p>Release Date: 2018-06-07</p>
<p>Fix Resolution: fresh - 0.5.2</p>
</p>
<p></p>
</details><details>
<summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> CVE-2014-6394</summary>
### Vulnerable Library - <b>send-0.1.0.tgz</b></p>
<p>Better streaming static file server with Range and conditional-GET support</p>
<p>Library home page: <a href="https://registry.npmjs.org/send/-/send-0.1.0.tgz">https://registry.npmjs.org/send/-/send-0.1.0.tgz</a></p>
<p>Path to dependency file: /package.json</p>
<p>Path to vulnerable library: /node_modules/send/package.json</p>
<p>
Dependency Hierarchy:
- express-3.0.1.tgz (Root Library)
- :x: **send-0.1.0.tgz** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/uriel-naor/ISSUES/commit/c6ee538a7e7096e6b666146190edf80cc89a4ca9">c6ee538a7e7096e6b666146190edf80cc89a4ca9</a></p>
<p>Found in base branch: <b>main</b></p>
</p>
<p></p>
### Vulnerability Details
<p>
visionmedia send before 0.8.4 for Node.js uses a partial comparison for verifying whether a directory is within the document root, which allows remote attackers to access restricted directories, as demonstrated using "public-restricted" under a "public" directory.
<p>Publish Date: 2014-10-08
<p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2014-6394>CVE-2014-6394</a></p>
</p>
<p></p>
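The described bypass works precisely because a raw string-prefix test treats "/srv/public-restricted" as being inside "/srv/public". A minimal sketch of the component-wise alternative (the paths are illustrative):
```java
import java.nio.file.Path;
import java.nio.file.Paths;

public final class RootCheck {
    // Path.startsWith compares whole path components, unlike String.startsWith,
    // so "/srv/public-restricted/..." is correctly rejected.
    public static boolean isInsideRoot(Path root, Path requested) {
        Path normalizedRoot = root.toAbsolutePath().normalize();
        Path resolved = normalizedRoot.resolve(requested).normalize();
        return resolved.startsWith(normalizedRoot);
    }

    public static void main(String[] args) {
        Path root = Paths.get("/srv/public");
        System.out.println(isInsideRoot(root, Paths.get("css/app.css")));                   // true
        System.out.println(isInsideRoot(root, Paths.get("../public-restricted/secret")));  // false
    }
}
```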
### CVSS 3 Score Details (<b>7.3</b>)
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: Low
- Integrity Impact: Low
- Availability Impact: Low
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
<p></p>
### Suggested Fix
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://nvd.nist.gov/vuln/detail/CVE-2014-6394">https://nvd.nist.gov/vuln/detail/CVE-2014-6394</a></p>
<p>Release Date: 2014-10-08</p>
<p>Fix Resolution (send): 0.8.4</p>
<p>Direct dependency fix Resolution (express): 3.21.0</p>
</p>
<p></p>
:rescue_worker_helmet: Automatic Remediation is available for this issue
</details><details>
<summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> CVE-2013-7370</summary>
### Vulnerable Library - <b>connect-2.6.2.tgz</b></p>
<p>High performance middleware framework</p>
<p>Library home page: <a href="https://registry.npmjs.org/connect/-/connect-2.6.2.tgz">https://registry.npmjs.org/connect/-/connect-2.6.2.tgz</a></p>
<p>Path to dependency file: /package.json</p>
<p>Path to vulnerable library: /node_modules/connect/package.json</p>
<p>
Dependency Hierarchy:
- express-3.0.1.tgz (Root Library)
- :x: **connect-2.6.2.tgz** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/uriel-naor/ISSUES/commit/c6ee538a7e7096e6b666146190edf80cc89a4ca9">c6ee538a7e7096e6b666146190edf80cc89a4ca9</a></p>
<p>Found in base branch: <b>main</b></p>
</p>
<p></p>
### Vulnerability Details
<p>
node-connect before 2.8.1 has XSS in the Sencha Labs Connect middleware
<p>Publish Date: 2019-12-11
<p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2013-7370>CVE-2013-7370</a></p>
</p>
<p></p>
### CVSS 3 Score Details (<b>6.1</b>)
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: Required
- Scope: Changed
- Impact Metrics:
- Confidentiality Impact: Low
- Integrity Impact: Low
- Availability Impact: None
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
<p></p>
### Suggested Fix
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="http://web.nvd.nist.gov/view/vuln/detail?vulnId=CVE-2013-7370">http://web.nvd.nist.gov/view/vuln/detail?vulnId=CVE-2013-7370</a></p>
<p>Release Date: 2019-12-11</p>
<p>Fix Resolution (connect): 2.8.2</p>
<p>Direct dependency fix Resolution (express): 3.21.0</p>
</p>
<p></p>
:rescue_worker_helmet: Automatic Remediation is available for this issue
</details><details>
<summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> CVE-2014-6393</summary>
### Vulnerable Library - <b>express-3.0.1.tgz</b></p>
<p>Sinatra inspired web development framework</p>
<p>Library home page: <a href="https://registry.npmjs.org/express/-/express-3.0.1.tgz">https://registry.npmjs.org/express/-/express-3.0.1.tgz</a></p>
<p>Path to dependency file: /package.json</p>
<p>Path to vulnerable library: /node_modules/express/package.json</p>
<p>
Dependency Hierarchy:
- :x: **express-3.0.1.tgz** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/uriel-naor/ISSUES/commit/c6ee538a7e7096e6b666146190edf80cc89a4ca9">c6ee538a7e7096e6b666146190edf80cc89a4ca9</a></p>
<p>Found in base branch: <b>main</b></p>
</p>
<p></p>
### Vulnerability Details
<p>
The Express web framework before 3.11 and 4.x before 4.5 for Node.js does not provide a charset field in HTTP Content-Type headers in 400 level responses, which might allow remote attackers to conduct cross-site scripting (XSS) attacks via characters in a non-standard encoding.
<p>Publish Date: 2017-08-09
<p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2014-6393>CVE-2014-6393</a></p>
</p>
<p></p>
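The underlying fix is simply to pin an explicit charset on error responses so the browser cannot sniff an attacker-friendly encoding. A hedged Java sketch (assuming the javax Servlet API on the classpath; the handler name is illustrative):
```java
import java.io.IOException;
import javax.servlet.http.HttpServletResponse;

final class ErrorResponses {
    // An explicit charset in Content-Type prevents encoding-sniffing XSS on 4xx bodies.
    static void badRequest(HttpServletResponse resp, String message) throws IOException {
        resp.setStatus(HttpServletResponse.SC_BAD_REQUEST);
        resp.setContentType("text/plain; charset=UTF-8");
        resp.getWriter().write(message);
    }
}
```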
### CVSS 3 Score Details (<b>6.1</b>)
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: Required
- Scope: Changed
- Impact Metrics:
- Confidentiality Impact: Low
- Integrity Impact: Low
- Availability Impact: None
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
<p></p>
### Suggested Fix
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2014-6393">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2014-6393</a></p>
<p>Release Date: 2017-08-09</p>
<p>Fix Resolution (express): express - 3.11.0, 4.5.0</p>
<p>Direct dependency fix Resolution (express): 3.21.0</p>
</p>
<p></p>
:rescue_worker_helmet: Automatic Remediation is available for this issue
</details><details>
<summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> WS-2013-0004</summary>
### Vulnerable Library - <b>connect-2.6.2.tgz</b></p>
<p>High performance middleware framework</p>
<p>Library home page: <a href="https://registry.npmjs.org/connect/-/connect-2.6.2.tgz">https://registry.npmjs.org/connect/-/connect-2.6.2.tgz</a></p>
<p>Path to dependency file: /package.json</p>
<p>Path to vulnerable library: /node_modules/connect/package.json</p>
<p>
Dependency Hierarchy:
- express-3.0.1.tgz (Root Library)
- :x: **connect-2.6.2.tgz** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/uriel-naor/ISSUES/commit/c6ee538a7e7096e6b666146190edf80cc89a4ca9">c6ee538a7e7096e6b666146190edf80cc89a4ca9</a></p>
<p>Found in base branch: <b>main</b></p>
</p>
<p></p>
### Vulnerability Details
<p>
The "methodOverride" let the http post to override the method of the request with the value of the post key or with the header, which allows XSS attack.
<p>Publish Date: 2013-06-27
<p>URL: <a href=https://github.com/senchalabs/connect/commit/126187c4e12162e231b87350740045e5bb06e93a>WS-2013-0004</a></p>
</p>
<p></p>
### CVSS 3 Score Details (<b>6.1</b>)
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: Required
- Scope: Changed
- Impact Metrics:
- Confidentiality Impact: Low
- Integrity Impact: Low
- Availability Impact: None
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
<p></p>
### Suggested Fix
<p>
<p>Type: Upgrade version</p>
<p>Release Date: 2013-06-27</p>
<p>Fix Resolution (connect): 2.8.2</p>
<p>Direct dependency fix Resolution (express): 3.21.0</p>
</p>
<p></p>
:rescue_worker_helmet: Automatic Remediation is available for this issue
</details><details>
<summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> CVE-2013-7371</summary>
### Vulnerable Library - <b>connect-2.6.2.tgz</b></p>
<p>High performance middleware framework</p>
<p>Library home page: <a href="https://registry.npmjs.org/connect/-/connect-2.6.2.tgz">https://registry.npmjs.org/connect/-/connect-2.6.2.tgz</a></p>
<p>Path to dependency file: /package.json</p>
<p>Path to vulnerable library: /node_modules/connect/package.json</p>
<p>
Dependency Hierarchy:
- express-3.0.1.tgz (Root Library)
- :x: **connect-2.6.2.tgz** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/uriel-naor/ISSUES/commit/c6ee538a7e7096e6b666146190edf80cc89a4ca9">c6ee538a7e7096e6b666146190edf80cc89a4ca9</a></p>
<p>Found in base branch: <b>main</b></p>
</p>
<p></p>
### Vulnerability Details
<p>
node-connect before 2.8.2 has cross-site scripting in the Sencha Labs Connect middleware (a vulnerability due to an incomplete fix for CVE-2013-7370)
<p>Publish Date: 2019-12-11
<p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2013-7371>CVE-2013-7371</a></p>
</p>
<p></p>
### CVSS 3 Score Details (<b>6.1</b>)
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: Required
- Scope: Changed
- Impact Metrics:
- Confidentiality Impact: Low
- Integrity Impact: Low
- Availability Impact: None
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
<p></p>
### Suggested Fix
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="http://web.nvd.nist.gov/view/vuln/detail?vulnId=CVE-2013-7371">http://web.nvd.nist.gov/view/vuln/detail?vulnId=CVE-2013-7371</a></p>
<p>Release Date: 2019-12-11</p>
<p>Fix Resolution (connect): 2.8.2</p>
<p>Direct dependency fix Resolution (express): 3.21.0</p>
</p>
<p></p>
:rescue_worker_helmet: Automatic Remediation is available for this issue
</details><details>
<summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> CVE-2018-3717</summary>
### Vulnerable Library - <b>connect-2.6.2.tgz</b></p>
<p>High performance middleware framework</p>
<p>Library home page: <a href="https://registry.npmjs.org/connect/-/connect-2.6.2.tgz">https://registry.npmjs.org/connect/-/connect-2.6.2.tgz</a></p>
<p>Path to dependency file: /package.json</p>
<p>Path to vulnerable library: /node_modules/connect/package.json</p>
<p>
Dependency Hierarchy:
- express-3.0.1.tgz (Root Library)
- :x: **connect-2.6.2.tgz** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/uriel-naor/ISSUES/commit/c6ee538a7e7096e6b666146190edf80cc89a4ca9">c6ee538a7e7096e6b666146190edf80cc89a4ca9</a></p>
<p>Found in base branch: <b>main</b></p>
</p>
<p></p>
### Vulnerability Details
<p>
connect node module before 2.14.0 suffers from a Cross-Site Scripting (XSS) vulnerability due to a lack of validation of file in directory.js middleware.
<p>Publish Date: 2018-06-07
<p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2018-3717>CVE-2018-3717</a></p>
</p>
<p></p>
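The directory-listing XSS described above comes down to embedding unescaped file names in generated HTML. A minimal sketch of the mitigation (in a real service, prefer a vetted library escaper):
```java
public final class HtmlEscape {
    // Escape the five HTML-significant characters before interpolation.
    static String escape(String s) {
        StringBuilder out = new StringBuilder(s.length());
        for (char c : s.toCharArray()) {
            switch (c) {
                case '<':  out.append("&lt;");   break;
                case '>':  out.append("&gt;");   break;
                case '&':  out.append("&amp;");  break;
                case '"':  out.append("&quot;"); break;
                case '\'': out.append("&#39;");  break;
                default:   out.append(c);
            }
        }
        return out.toString();
    }

    public static void main(String[] args) {
        System.out.println(escape("<img src=x onerror=alert(1)>.txt"));
    }
}
```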
### CVSS 3 Score Details (<b>5.4</b>)
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: Low
- User Interaction: Required
- Scope: Changed
- Impact Metrics:
- Confidentiality Impact: Low
- Integrity Impact: Low
- Availability Impact: None
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
<p></p>
### Suggested Fix
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://nvd.nist.gov/vuln/detail/CVE-2018-3717">https://nvd.nist.gov/vuln/detail/CVE-2018-3717</a></p>
<p>Release Date: 2018-06-07</p>
<p>Fix Resolution (connect): 2.14.0</p>
<p>Direct dependency fix Resolution (express): 3.21.0</p>
</p>
<p></p>
:rescue_worker_helmet: Automatic Remediation is available for this issue
</details><details>
<summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> CVE-2015-8859</summary>
### Vulnerable Library - <b>send-0.1.0.tgz</b></p>
<p>Better streaming static file server with Range and conditional-GET support</p>
<p>Library home page: <a href="https://registry.npmjs.org/send/-/send-0.1.0.tgz">https://registry.npmjs.org/send/-/send-0.1.0.tgz</a></p>
<p>Path to dependency file: /package.json</p>
<p>Path to vulnerable library: /node_modules/send/package.json</p>
<p>
Dependency Hierarchy:
- express-3.0.1.tgz (Root Library)
- :x: **send-0.1.0.tgz** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/uriel-naor/ISSUES/commit/c6ee538a7e7096e6b666146190edf80cc89a4ca9">c6ee538a7e7096e6b666146190edf80cc89a4ca9</a></p>
<p>Found in base branch: <b>main</b></p>
</p>
<p></p>
### Vulnerability Details
<p>
The send package before 0.11.1 for Node.js allows attackers to obtain the root path via unspecified vectors.
<p>Publish Date: 2017-01-23
<p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2015-8859>CVE-2015-8859</a></p>
</p>
<p></p>
### CVSS 3 Score Details (<b>5.3</b>)
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: Low
- Integrity Impact: None
- Availability Impact: None
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
<p></p>
### Suggested Fix
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://nvd.nist.gov/vuln/detail/CVE-2015-8859">https://nvd.nist.gov/vuln/detail/CVE-2015-8859</a></p>
<p>Release Date: 2017-01-23</p>
<p>Fix Resolution (send): 0.11.1</p>
<p>Direct dependency fix Resolution (express): 3.21.0</p>
</p>
<p></p>
:rescue_worker_helmet: Automatic Remediation is available for this issue
</details><details>
<summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> CVE-2014-7191</summary>
### Vulnerable Library - <b>qs-0.5.1.tgz</b></p>
<p>querystring parser</p>
<p>Library home page: <a href="https://registry.npmjs.org/qs/-/qs-0.5.1.tgz">https://registry.npmjs.org/qs/-/qs-0.5.1.tgz</a></p>
<p>Path to dependency file: /package.json</p>
<p>Path to vulnerable library: /node_modules/qs/package.json</p>
<p>
Dependency Hierarchy:
- express-3.0.1.tgz (Root Library)
- connect-2.6.2.tgz
- :x: **qs-0.5.1.tgz** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/uriel-naor/ISSUES/commit/c6ee538a7e7096e6b666146190edf80cc89a4ca9">c6ee538a7e7096e6b666146190edf80cc89a4ca9</a></p>
<p>Found in base branch: <b>main</b></p>
</p>
<p></p>
### Vulnerability Details
<p>
The qs module before 1.0.0 in Node.js does not call the compact function for array data, which allows remote attackers to cause a denial of service (memory consumption) by using a large index value to create a sparse array.
<p>Publish Date: 2014-10-19
<p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2014-7191>CVE-2014-7191</a></p>
</p>
<p></p>
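The attack relies on a parser that materializes an array as large as the biggest index named in the query string (e.g. `a[999999999]=x`). A generic Java sketch of the bounded alternative (the cap value is an arbitrary illustration; qs 1.0.0 instead compacts sparse arrays):
```java
import java.util.ArrayList;
import java.util.List;

public final class IndexedParam {
    private static final int MAX_INDEX = 1000; // illustrative per-request bound

    static void put(List<String> values, int index, String value) {
        if (index < 0 || index > MAX_INDEX) {
            throw new IllegalArgumentException("array index out of allowed range: " + index);
        }
        while (values.size() <= index) values.add(null); // bounded growth only
        values.set(index, value);
    }

    public static void main(String[] args) {
        List<String> v = new ArrayList<>();
        put(v, 3, "x");
        System.out.println(v); // [null, null, null, x]
    }
}
```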
### CVSS 3 Score Details (<b>5.3</b>)
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: None
- Availability Impact: Low
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
<p></p>
### Suggested Fix
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://nvd.nist.gov/vuln/detail/CVE-2014-7191">https://nvd.nist.gov/vuln/detail/CVE-2014-7191</a></p>
<p>Release Date: 2014-10-19</p>
<p>Fix Resolution (qs): 1.0.0</p>
<p>Direct dependency fix Resolution (express): 3.21.0</p>
</p>
<p></p>
:rescue_worker_helmet: Automatic Remediation is available for this issue
</details><details>
<summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> CVE-2016-1000236</summary>
### Vulnerable Library - <b>cookie-signature-0.0.1.tgz</b></p>
<p>Sign and unsign cookies</p>
<p>Library home page: <a href="https://registry.npmjs.org/cookie-signature/-/cookie-signature-0.0.1.tgz">https://registry.npmjs.org/cookie-signature/-/cookie-signature-0.0.1.tgz</a></p>
<p>Path to dependency file: /package.json</p>
<p>Path to vulnerable library: /node_modules/cookie-signature/package.json</p>
<p>
Dependency Hierarchy:
- express-3.0.1.tgz (Root Library)
- :x: **cookie-signature-0.0.1.tgz** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/uriel-naor/ISSUES/commit/c6ee538a7e7096e6b666146190edf80cc89a4ca9">c6ee538a7e7096e6b666146190edf80cc89a4ca9</a></p>
<p>Found in base branch: <b>main</b></p>
</p>
<p></p>
### Vulnerability Details
<p>
Node-cookie-signature before 1.0.6 is affected by a timing attack due to the type of comparison used.
<p>Publish Date: 2019-11-19
<p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2016-1000236>CVE-2016-1000236</a></p>
</p>
<p></p>
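The timing leak comes from an early-exit, byte-by-byte comparison that returns sooner the earlier the first mismatch occurs, letting an attacker recover a signature one byte at a time. A minimal Java sketch of the constant-time alternative (`MessageDigest.isEqual` is a real JDK API and time-constant since 6u17; the method name is illustrative):
```java
import java.nio.charset.StandardCharsets;
import java.security.MessageDigest;

public final class MacCompare {
    // Runs in time independent of where the first mismatching byte sits.
    static boolean verify(String expectedSig, String providedSig) {
        byte[] a = expectedSig.getBytes(StandardCharsets.UTF_8);
        byte[] b = providedSig.getBytes(StandardCharsets.UTF_8);
        return MessageDigest.isEqual(a, b);
    }
}
```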
### CVSS 3 Score Details (<b>4.4</b>)
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: High
- Privileges Required: High
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: High
- Integrity Impact: None
- Availability Impact: None
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
<p></p>
### Suggested Fix
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://github.com/advisories/GHSA-92vm-wfm5-mxvv">https://github.com/advisories/GHSA-92vm-wfm5-mxvv</a></p>
<p>Release Date: 2019-11-19</p>
<p>Fix Resolution (cookie-signature): cookie-signature - 1.0.4</p>
<p>Direct dependency fix Resolution (express): 3.21.0</p>
</p>
<p></p>
:rescue_worker_helmet: Automatic Remediation is available for this issue
</details>
***
<p>:rescue_worker_helmet: Automatic Remediation is available for this issue.</p>
|
True
|
express-3.0.1.tgz: 11 vulnerabilities (highest severity is: 7.5) - autoclosed - <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>express-3.0.1.tgz</b></p></summary>
<p>Sinatra inspired web development framework</p>
<p>Library home page: <a href="https://registry.npmjs.org/express/-/express-3.0.1.tgz">https://registry.npmjs.org/express/-/express-3.0.1.tgz</a></p>
<p>Path to dependency file: /package.json</p>
<p>Path to vulnerable library: /node_modules/express/package.json</p>
<p>
<p>Found in HEAD commit: <a href="https://github.com/uriel-naor/ISSUES/commit/c6ee538a7e7096e6b666146190edf80cc89a4ca9">c6ee538a7e7096e6b666146190edf80cc89a4ca9</a></p></details>
## Vulnerabilities
| CVE | Severity | <img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS | Dependency | Type | Fixed in (express version) | Fix PR available |
| ------------- | ------------- | ----- | ----- | ----- | ------------- | --- |
| [CVE-2017-16138](https://www.mend.io/vulnerability-database/CVE-2017-16138) | <img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> High | 7.5 | mime-1.2.6.tgz | Transitive | N/A* | ❌ |
| [CVE-2017-16119](https://www.mend.io/vulnerability-database/CVE-2017-16119) | <img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> High | 7.5 | fresh-0.1.0.tgz | Transitive | N/A* | ❌ |
| [CVE-2014-6394](https://www.mend.io/vulnerability-database/CVE-2014-6394) | <img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> High | 7.3 | send-0.1.0.tgz | Transitive | 3.21.0 | ✅ |
| [CVE-2013-7370](https://www.mend.io/vulnerability-database/CVE-2013-7370) | <img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Medium | 6.1 | connect-2.6.2.tgz | Transitive | 3.21.0 | ✅ |
| [CVE-2014-6393](https://www.mend.io/vulnerability-database/CVE-2014-6393) | <img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Medium | 6.1 | express-3.0.1.tgz | Direct | 3.21.0 | ✅ |
| [WS-2013-0004](https://github.com/senchalabs/connect/commit/126187c4e12162e231b87350740045e5bb06e93a) | <img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Medium | 6.1 | connect-2.6.2.tgz | Transitive | 3.21.0 | ✅ |
| [CVE-2013-7371](https://www.mend.io/vulnerability-database/CVE-2013-7371) | <img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Medium | 6.1 | connect-2.6.2.tgz | Transitive | 3.21.0 | ✅ |
| [CVE-2018-3717](https://www.mend.io/vulnerability-database/CVE-2018-3717) | <img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Medium | 5.4 | connect-2.6.2.tgz | Transitive | 3.21.0 | ✅ |
| [CVE-2015-8859](https://www.mend.io/vulnerability-database/CVE-2015-8859) | <img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Medium | 5.3 | send-0.1.0.tgz | Transitive | 3.21.0 | ✅ |
| [CVE-2014-7191](https://www.mend.io/vulnerability-database/CVE-2014-7191) | <img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Medium | 5.3 | qs-0.5.1.tgz | Transitive | 3.21.0 | ✅ |
| [CVE-2016-1000236](https://www.mend.io/vulnerability-database/CVE-2016-1000236) | <img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Medium | 4.4 | cookie-signature-0.0.1.tgz | Transitive | 3.21.0 | ✅ |
<p>*For some transitive vulnerabilities, there is no version of direct dependency with a fix. Check the section "Details" below to see if there is a version of transitive dependency where vulnerability is fixed.</p>
## Details
<details>
<summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> CVE-2017-16138</summary>
### Vulnerable Library - <b>mime-1.2.6.tgz</b></p>
<p>A comprehensive library for mime-type mapping</p>
<p>Library home page: <a href="https://registry.npmjs.org/mime/-/mime-1.2.6.tgz">https://registry.npmjs.org/mime/-/mime-1.2.6.tgz</a></p>
<p>Path to dependency file: /package.json</p>
<p>Path to vulnerable library: /node_modules/mime/package.json</p>
<p>
Dependency Hierarchy:
- express-3.0.1.tgz (Root Library)
- send-0.1.0.tgz
- :x: **mime-1.2.6.tgz** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/uriel-naor/ISSUES/commit/c6ee538a7e7096e6b666146190edf80cc89a4ca9">c6ee538a7e7096e6b666146190edf80cc89a4ca9</a></p>
<p>Found in base branch: <b>main</b></p>
</p>
<p></p>
### Vulnerability Details
<p>
The mime module < 1.4.1, 2.0.1, 2.0.2 is vulnerable to regular expression denial of service when a mime lookup is performed on untrusted user input.
<p>Publish Date: 2018-06-07
<p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2017-16138>CVE-2017-16138</a></p>
</p>
<p></p>
### CVSS 3 Score Details (<b>7.5</b>)
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: None
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
<p></p>
### Suggested Fix
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2017-16138">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2017-16138</a></p>
<p>Release Date: 2018-06-07</p>
<p>Fix Resolution: 1.4.1,2.0.3</p>
</p>
<p></p>
</details><details>
<summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> CVE-2017-16119</summary>
### Vulnerable Library - <b>fresh-0.1.0.tgz</b></p>
<p>HTTP response freshness testing</p>
<p>Library home page: <a href="https://registry.npmjs.org/fresh/-/fresh-0.1.0.tgz">https://registry.npmjs.org/fresh/-/fresh-0.1.0.tgz</a></p>
<p>Path to dependency file: /package.json</p>
<p>Path to vulnerable library: /node_modules/fresh/package.json</p>
<p>
Dependency Hierarchy:
- express-3.0.1.tgz (Root Library)
- :x: **fresh-0.1.0.tgz** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/uriel-naor/ISSUES/commit/c6ee538a7e7096e6b666146190edf80cc89a4ca9">c6ee538a7e7096e6b666146190edf80cc89a4ca9</a></p>
<p>Found in base branch: <b>main</b></p>
</p>
<p></p>
### Vulnerability Details
<p>
Fresh is a module used by the Express.js framework for HTTP response freshness testing. It is vulnerable to a regular expression denial of service when it is passed specially crafted input to parse. This causes the event loop to be blocked causing a denial of service condition.
<p>Publish Date: 2018-06-07
<p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2017-16119>CVE-2017-16119</a></p>
</p>
<p></p>
### CVSS 3 Score Details (<b>7.5</b>)
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: None
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
<p></p>
### Suggested Fix
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://www.npmjs.com/advisories/526">https://www.npmjs.com/advisories/526</a></p>
<p>Release Date: 2018-06-07</p>
<p>Fix Resolution: fresh - 0.5.2</p>
</p>
<p></p>
</details><details>
<summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> CVE-2014-6394</summary>
### Vulnerable Library - <b>send-0.1.0.tgz</b></p>
<p>Better streaming static file server with Range and conditional-GET support</p>
<p>Library home page: <a href="https://registry.npmjs.org/send/-/send-0.1.0.tgz">https://registry.npmjs.org/send/-/send-0.1.0.tgz</a></p>
<p>Path to dependency file: /package.json</p>
<p>Path to vulnerable library: /node_modules/send/package.json</p>
<p>
Dependency Hierarchy:
- express-3.0.1.tgz (Root Library)
- :x: **send-0.1.0.tgz** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/uriel-naor/ISSUES/commit/c6ee538a7e7096e6b666146190edf80cc89a4ca9">c6ee538a7e7096e6b666146190edf80cc89a4ca9</a></p>
<p>Found in base branch: <b>main</b></p>
</p>
<p></p>
### Vulnerability Details
<p>
visionmedia send before 0.8.4 for Node.js uses a partial comparison for verifying whether a directory is within the document root, which allows remote attackers to access restricted directories, as demonstrated using "public-restricted" under a "public" directory.
<p>Publish Date: 2014-10-08
<p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2014-6394>CVE-2014-6394</a></p>
</p>
<p></p>
### CVSS 3 Score Details (<b>7.3</b>)
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: Low
- Integrity Impact: Low
- Availability Impact: Low
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
<p></p>
### Suggested Fix
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://nvd.nist.gov/vuln/detail/CVE-2014-6394">https://nvd.nist.gov/vuln/detail/CVE-2014-6394</a></p>
<p>Release Date: 2014-10-08</p>
<p>Fix Resolution (send): 0.8.4</p>
<p>Direct dependency fix Resolution (express): 3.21.0</p>
</p>
<p></p>
:rescue_worker_helmet: Automatic Remediation is available for this issue
</details><details>
<summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> CVE-2013-7370</summary>
### Vulnerable Library - <b>connect-2.6.2.tgz</b></p>
<p>High performance middleware framework</p>
<p>Library home page: <a href="https://registry.npmjs.org/connect/-/connect-2.6.2.tgz">https://registry.npmjs.org/connect/-/connect-2.6.2.tgz</a></p>
<p>Path to dependency file: /package.json</p>
<p>Path to vulnerable library: /node_modules/connect/package.json</p>
<p>
Dependency Hierarchy:
- express-3.0.1.tgz (Root Library)
- :x: **connect-2.6.2.tgz** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/uriel-naor/ISSUES/commit/c6ee538a7e7096e6b666146190edf80cc89a4ca9">c6ee538a7e7096e6b666146190edf80cc89a4ca9</a></p>
<p>Found in base branch: <b>main</b></p>
</p>
<p></p>
### Vulnerability Details
<p>
node-connect before 2.8.1 has XSS in the Sencha Labs Connect middleware
<p>Publish Date: 2019-12-11
<p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2013-7370>CVE-2013-7370</a></p>
</p>
<p></p>
### CVSS 3 Score Details (<b>6.1</b>)
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: Required
- Scope: Changed
- Impact Metrics:
- Confidentiality Impact: Low
- Integrity Impact: Low
- Availability Impact: None
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
<p></p>
### Suggested Fix
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="http://web.nvd.nist.gov/view/vuln/detail?vulnId=CVE-2013-7370">http://web.nvd.nist.gov/view/vuln/detail?vulnId=CVE-2013-7370</a></p>
<p>Release Date: 2019-12-11</p>
<p>Fix Resolution (connect): 2.8.2</p>
<p>Direct dependency fix Resolution (express): 3.21.0</p>
</p>
<p></p>
:rescue_worker_helmet: Automatic Remediation is available for this issue
</details><details>
<summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> CVE-2014-6393</summary>
### Vulnerable Library - <b>express-3.0.1.tgz</b></p>
<p>Sinatra inspired web development framework</p>
<p>Library home page: <a href="https://registry.npmjs.org/express/-/express-3.0.1.tgz">https://registry.npmjs.org/express/-/express-3.0.1.tgz</a></p>
<p>Path to dependency file: /package.json</p>
<p>Path to vulnerable library: /node_modules/express/package.json</p>
<p>
Dependency Hierarchy:
- :x: **express-3.0.1.tgz** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/uriel-naor/ISSUES/commit/c6ee538a7e7096e6b666146190edf80cc89a4ca9">c6ee538a7e7096e6b666146190edf80cc89a4ca9</a></p>
<p>Found in base branch: <b>main</b></p>
</p>
<p></p>
### Vulnerability Details
<p>
The Express web framework before 3.11 and 4.x before 4.5 for Node.js does not provide a charset field in HTTP Content-Type headers in 400 level responses, which might allow remote attackers to conduct cross-site scripting (XSS) attacks via characters in a non-standard encoding.
<p>Publish Date: 2017-08-09
<p>URL: <a href="https://www.mend.io/vulnerability-database/CVE-2014-6393">CVE-2014-6393</a></p>
</p>
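A hedged illustration of the mitigation surface (upgrading remains the actual fix): pinning an explicit charset on error responses prevents the browser from sniffing an attacker-chosen encoding for the body.

```typescript
import express from "express";

const app = express();

app.get("/", (_req, res) => {
  // Pin the charset explicitly so a 400-level body cannot be
  // reinterpreted in a non-standard encoding chosen by an attacker.
  res
    .status(400)
    .set("Content-Type", "text/plain; charset=utf-8")
    .send("Bad request");
});

app.listen(3000);
```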
<p></p>
### CVSS 3 Score Details (<b>6.1</b>)
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: Required
- Scope: Changed
- Impact Metrics:
- Confidentiality Impact: Low
- Integrity Impact: Low
- Availability Impact: None
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
<p></p>
### Suggested Fix
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2014-6393">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2014-6393</a></p>
<p>Release Date: 2017-08-09</p>
<p>Fix Resolution (express): express - 3.11.0, 4.5.0</p>
<p>Direct dependency fix Resolution (express): 3.21.0</p>
</p>
<p></p>
:rescue_worker_helmet: Automatic Remediation is available for this issue
</details><details>
<summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> WS-2013-0004</summary>
### Vulnerable Library - <b>connect-2.6.2.tgz</b></p>
<p>High performance middleware framework</p>
<p>Library home page: <a href="https://registry.npmjs.org/connect/-/connect-2.6.2.tgz">https://registry.npmjs.org/connect/-/connect-2.6.2.tgz</a></p>
<p>Path to dependency file: /package.json</p>
<p>Path to vulnerable library: /node_modules/connect/package.json</p>
<p>
Dependency Hierarchy:
- express-3.0.1.tgz (Root Library)
- :x: **connect-2.6.2.tgz** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/uriel-naor/ISSUES/commit/c6ee538a7e7096e6b666146190edf80cc89a4ca9">c6ee538a7e7096e6b666146190edf80cc89a4ca9</a></p>
<p>Found in base branch: <b>main</b></p>
</p>
<p></p>
### Vulnerability Details
<p>
The "methodOverride" let the http post to override the method of the request with the value of the post key or with the header, which allows XSS attack.
<p>Publish Date: 2013-06-27
<p>URL: <a href="https://github.com/senchalabs/connect/commit/126187c4e12162e231b87350740045e5bb06e93a">WS-2013-0004</a></p>
</p>
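For context, a sketch of the override mechanism the advisory refers to, shown with the modern standalone `method-override` middleware in its documented form (the vulnerable behavior was in the middleware bundled with old connect):

```typescript
import express from "express";
import methodOverride from "method-override";

const app = express();

// The client-supplied header (or, in other configurations, a POST body
// key) decides the effective HTTP verb for the request. Per the advisory
// above, because this value is attacker-controlled, reflecting it
// unescaped enables XSS.
app.use(methodOverride("X-HTTP-Method-Override"));

app.delete("/resource", (_req, res) => {
  res.send("deleted");
});

app.listen(3000);
```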
<p></p>
### CVSS 3 Score Details (<b>6.1</b>)
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: Required
- Scope: Changed
- Impact Metrics:
- Confidentiality Impact: Low
- Integrity Impact: Low
- Availability Impact: None
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
<p></p>
### Suggested Fix
<p>
<p>Type: Upgrade version</p>
<p>Release Date: 2013-06-27</p>
<p>Fix Resolution (connect): 2.8.2</p>
<p>Direct dependency fix Resolution (express): 3.21.0</p>
</p>
<p></p>
:rescue_worker_helmet: Automatic Remediation is available for this issue
</details><details>
<summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> CVE-2013-7371</summary>
### Vulnerable Library - <b>connect-2.6.2.tgz</b></p>
<p>High performance middleware framework</p>
<p>Library home page: <a href="https://registry.npmjs.org/connect/-/connect-2.6.2.tgz">https://registry.npmjs.org/connect/-/connect-2.6.2.tgz</a></p>
<p>Path to dependency file: /package.json</p>
<p>Path to vulnerable library: /node_modules/connect/package.json</p>
<p>
Dependency Hierarchy:
- express-3.0.1.tgz (Root Library)
- :x: **connect-2.6.2.tgz** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/uriel-naor/ISSUES/commit/c6ee538a7e7096e6b666146190edf80cc89a4ca9">c6ee538a7e7096e6b666146190edf80cc89a4ca9</a></p>
<p>Found in base branch: <b>main</b></p>
</p>
<p></p>
### Vulnerability Details
<p>
node-connect before 2.8.2 has cross site scripting in Sencha Labs Connect middleware (vulnerability due to incomplete fix for CVE-2013-7370)
<p>Publish Date: 2019-12-11
<p>URL: <a href="https://www.mend.io/vulnerability-database/CVE-2013-7371">CVE-2013-7371</a></p>
</p>
<p></p>
### CVSS 3 Score Details (<b>6.1</b>)
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: Required
- Scope: Changed
- Impact Metrics:
- Confidentiality Impact: Low
- Integrity Impact: Low
- Availability Impact: None
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
<p></p>
### Suggested Fix
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="http://web.nvd.nist.gov/view/vuln/detail?vulnId=CVE-2013-7371">http://web.nvd.nist.gov/view/vuln/detail?vulnId=CVE-2013-7371</a></p>
<p>Release Date: 2019-12-11</p>
<p>Fix Resolution (connect): 2.8.2</p>
<p>Direct dependency fix Resolution (express): 3.21.0</p>
</p>
<p></p>
:rescue_worker_helmet: Automatic Remediation is available for this issue
</details><details>
<summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> CVE-2018-3717</summary>
### Vulnerable Library - <b>connect-2.6.2.tgz</b></p>
<p>High performance middleware framework</p>
<p>Library home page: <a href="https://registry.npmjs.org/connect/-/connect-2.6.2.tgz">https://registry.npmjs.org/connect/-/connect-2.6.2.tgz</a></p>
<p>Path to dependency file: /package.json</p>
<p>Path to vulnerable library: /node_modules/connect/package.json</p>
<p>
Dependency Hierarchy:
- express-3.0.1.tgz (Root Library)
- :x: **connect-2.6.2.tgz** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/uriel-naor/ISSUES/commit/c6ee538a7e7096e6b666146190edf80cc89a4ca9">c6ee538a7e7096e6b666146190edf80cc89a4ca9</a></p>
<p>Found in base branch: <b>main</b></p>
</p>
<p></p>
### Vulnerability Details
<p>
connect node module before 2.14.0 suffers from a Cross-Site Scripting (XSS) vulnerability due to a lack of validation of file in directory.js middleware.
<p>Publish Date: 2018-06-07
<p>URL: <a href="https://www.mend.io/vulnerability-database/CVE-2018-3717">CVE-2018-3717</a></p>
</p>
<p></p>
### CVSS 3 Score Details (<b>5.4</b>)
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: Low
- User Interaction: Required
- Scope: Changed
- Impact Metrics:
- Confidentiality Impact: Low
- Integrity Impact: Low
- Availability Impact: None
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
<p></p>
### Suggested Fix
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://nvd.nist.gov/vuln/detail/CVE-2018-3717">https://nvd.nist.gov/vuln/detail/CVE-2018-3717</a></p>
<p>Release Date: 2018-06-07</p>
<p>Fix Resolution (connect): 2.14.0</p>
<p>Direct dependency fix Resolution (express): 3.21.0</p>
</p>
<p></p>
:rescue_worker_helmet: Automatic Remediation is available for this issue
</details><details>
<summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> CVE-2015-8859</summary>
### Vulnerable Library - <b>send-0.1.0.tgz</b></p>
<p>Better streaming static file server with Range and conditional-GET support</p>
<p>Library home page: <a href="https://registry.npmjs.org/send/-/send-0.1.0.tgz">https://registry.npmjs.org/send/-/send-0.1.0.tgz</a></p>
<p>Path to dependency file: /package.json</p>
<p>Path to vulnerable library: /node_modules/send/package.json</p>
<p>
Dependency Hierarchy:
- express-3.0.1.tgz (Root Library)
- :x: **send-0.1.0.tgz** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/uriel-naor/ISSUES/commit/c6ee538a7e7096e6b666146190edf80cc89a4ca9">c6ee538a7e7096e6b666146190edf80cc89a4ca9</a></p>
<p>Found in base branch: <b>main</b></p>
</p>
<p></p>
### Vulnerability Details
<p>
The send package before 0.11.1 for Node.js allows attackers to obtain the root path via unspecified vectors.
<p>Publish Date: 2017-01-23
<p>URL: <a href="https://www.mend.io/vulnerability-database/CVE-2015-8859">CVE-2015-8859</a></p>
</p>
<p></p>
### CVSS 3 Score Details (<b>5.3</b>)
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: Low
- Integrity Impact: None
- Availability Impact: None
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
<p></p>
### Suggested Fix
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://nvd.nist.gov/vuln/detail/CVE-2015-8859">https://nvd.nist.gov/vuln/detail/CVE-2015-8859</a></p>
<p>Release Date: 2017-01-23</p>
<p>Fix Resolution (send): 0.11.1</p>
<p>Direct dependency fix Resolution (express): 3.21.0</p>
</p>
<p></p>
:rescue_worker_helmet: Automatic Remediation is available for this issue
</details><details>
<summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> CVE-2014-7191</summary>
### Vulnerable Library - <b>qs-0.5.1.tgz</b></p>
<p>querystring parser</p>
<p>Library home page: <a href="https://registry.npmjs.org/qs/-/qs-0.5.1.tgz">https://registry.npmjs.org/qs/-/qs-0.5.1.tgz</a></p>
<p>Path to dependency file: /package.json</p>
<p>Path to vulnerable library: /node_modules/qs/package.json</p>
<p>
Dependency Hierarchy:
- express-3.0.1.tgz (Root Library)
- connect-2.6.2.tgz
- :x: **qs-0.5.1.tgz** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/uriel-naor/ISSUES/commit/c6ee538a7e7096e6b666146190edf80cc89a4ca9">c6ee538a7e7096e6b666146190edf80cc89a4ca9</a></p>
<p>Found in base branch: <b>main</b></p>
</p>
<p></p>
### Vulnerability Details
<p>
The qs module before 1.0.0 in Node.js does not call the compact function for array data, which allows remote attackers to cause a denial of service (memory consumption) by using a large index value to create a sparse array.
<p>Publish Date: 2014-10-19
<p>URL: <a href="https://www.mend.io/vulnerability-database/CVE-2014-7191">CVE-2014-7191</a></p>
</p>
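A small illustration of the parsing behavior at issue (current `qs` shown for contrast; the option name below is from its present-day documentation, not the vulnerable 0.x line):

```typescript
import qs from "qs";

// Vulnerable behavior (< 1.0.0): "a[2000000000]=x" was expanded into an
// enormous array without compaction, exhausting memory. Modern qs caps
// array indices (arrayLimit, default 20); anything larger is parsed as a
// plain object keyed by the index instead.
console.log(qs.parse("a[5]=x"));  // { a: [ 'x' ] }       -- holes are compacted away
console.log(qs.parse("a[99]=x")); // { a: { '99': 'x' } } -- above arrayLimit, falls back to an object
```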
<p></p>
### CVSS 3 Score Details (<b>5.3</b>)
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: None
- Availability Impact: Low
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
<p></p>
### Suggested Fix
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://nvd.nist.gov/vuln/detail/CVE-2014-7191">https://nvd.nist.gov/vuln/detail/CVE-2014-7191</a></p>
<p>Release Date: 2014-10-19</p>
<p>Fix Resolution (qs): 1.0.0</p>
<p>Direct dependency fix Resolution (express): 3.21.0</p>
</p>
<p></p>
:rescue_worker_helmet: Automatic Remediation is available for this issue
</details><details>
<summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> CVE-2016-1000236</summary>
### Vulnerable Library - <b>cookie-signature-0.0.1.tgz</b></p>
<p>Sign and unsign cookies</p>
<p>Library home page: <a href="https://registry.npmjs.org/cookie-signature/-/cookie-signature-0.0.1.tgz">https://registry.npmjs.org/cookie-signature/-/cookie-signature-0.0.1.tgz</a></p>
<p>Path to dependency file: /package.json</p>
<p>Path to vulnerable library: /node_modules/cookie-signature/package.json</p>
<p>
Dependency Hierarchy:
- express-3.0.1.tgz (Root Library)
- :x: **cookie-signature-0.0.1.tgz** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/uriel-naor/ISSUES/commit/c6ee538a7e7096e6b666146190edf80cc89a4ca9">c6ee538a7e7096e6b666146190edf80cc89a4ca9</a></p>
<p>Found in base branch: <b>main</b></p>
</p>
<p></p>
### Vulnerability Details
<p>
Node-cookie-signature before 1.0.6 is affected by a timing attack due to the type of comparison used.
<p>Publish Date: 2019-11-19
<p>URL: <a href="https://www.mend.io/vulnerability-database/CVE-2016-1000236">CVE-2016-1000236</a></p>
</p>
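For illustration, the general shape of the fix (a sketch, not the library's code): replace a short-circuiting comparison with a constant-time one so response timing no longer reveals how many leading MAC bytes matched.

```typescript
import { createHmac, timingSafeEqual } from "crypto";

// Vulnerable pattern: `expected === given` on strings returns at the first
// differing byte, leaking match length through timing.
function verify(value: string, givenMacHex: string, secret: string): boolean {
  const expected = createHmac("sha256", secret).update(value).digest();
  const given = Buffer.from(givenMacHex, "hex");
  // timingSafeEqual throws on length mismatch, so compare lengths first.
  return expected.length === given.length && timingSafeEqual(expected, given);
}
```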
<p></p>
### CVSS 3 Score Details (<b>4.4</b>)
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: High
- Privileges Required: High
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: High
- Integrity Impact: None
- Availability Impact: None
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
<p></p>
### Suggested Fix
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://github.com/advisories/GHSA-92vm-wfm5-mxvv">https://github.com/advisories/GHSA-92vm-wfm5-mxvv</a></p>
<p>Release Date: 2019-11-19</p>
<p>Fix Resolution (cookie-signature): cookie-signature - 1.0.4</p>
<p>Direct dependency fix Resolution (express): 3.21.0</p>
</p>
<p></p>
:rescue_worker_helmet: Automatic Remediation is available for this issue
</details>
***
<p>:rescue_worker_helmet: Automatic Remediation is available for this issue.</p>
|
non_process
|
express tgz vulnerabilities highest severity is autoclosed vulnerable library express tgz sinatra inspired web development framework library home page a href path to dependency file package json path to vulnerable library node modules express package json found in head commit a href vulnerabilities cve severity cvss dependency type fixed in express version fix pr available high mime tgz transitive n a high fresh tgz transitive n a high send tgz transitive medium connect tgz transitive medium express tgz direct medium connect tgz transitive medium connect tgz transitive medium connect tgz transitive medium send tgz transitive medium qs tgz transitive medium cookie signature tgz transitive for some transitive vulnerabilities there is no version of direct dependency with a fix check the section details below to see if there is a version of transitive dependency where vulnerability is fixed details cve vulnerable library mime tgz a comprehensive library for mime type mapping library home page a href path to dependency file package json path to vulnerable library node modules mime package json dependency hierarchy express tgz root library send tgz x mime tgz vulnerable library found in head commit a href found in base branch main vulnerability details the mime module is vulnerable to regular expression denial of service when a mime lookup is performed on untrusted user input publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction none scope unchanged impact metrics confidentiality impact none integrity impact none availability impact high for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution cve vulnerable library fresh tgz http response freshness testing library home page a href path to dependency file package json path to vulnerable library node modules fresh package json dependency hierarchy express tgz root library x fresh tgz vulnerable library found in head commit a href found in base branch main vulnerability details fresh is a module used by the express js framework for http response freshness testing it is vulnerable to a regular expression denial of service when it is passed specially crafted input to parse this causes the event loop to be blocked causing a denial of service condition publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction none scope unchanged impact metrics confidentiality impact none integrity impact none availability impact high for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution fresh cve vulnerable library send tgz better streaming static file server with range and conditional get support library home page a href path to dependency file package json path to vulnerable library node modules send package json dependency hierarchy express tgz root library x send tgz vulnerable library found in head commit a href found in base branch main vulnerability details visionmedia send before for node js uses a partial comparison for verifying whether a directory is within the document root which allows remote attackers to access restricted directories as demonstrated using public restricted under a public directory publish date url a href cvss score details base score metrics exploitability metrics attack vector 
network attack complexity low privileges required none user interaction none scope unchanged impact metrics confidentiality impact low integrity impact low availability impact low for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution send direct dependency fix resolution express rescue worker helmet automatic remediation is available for this issue cve vulnerable library connect tgz high performance middleware framework library home page a href path to dependency file package json path to vulnerable library node modules connect package json dependency hierarchy express tgz root library x connect tgz vulnerable library found in head commit a href found in base branch main vulnerability details node connect before has xss in the sencha labs connect middleware publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction required scope changed impact metrics confidentiality impact low integrity impact low availability impact none for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution connect direct dependency fix resolution express rescue worker helmet automatic remediation is available for this issue cve vulnerable library express tgz sinatra inspired web development framework library home page a href path to dependency file package json path to vulnerable library node modules express package json dependency hierarchy x express tgz vulnerable library found in head commit a href found in base branch main vulnerability details the express web framework before and x before for node js does not provide a charset field in http content type headers in level responses which might allow remote attackers to conduct cross site scripting xss attacks via characters in a non standard encoding publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction required scope changed impact metrics confidentiality impact low integrity impact low availability impact none for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution express express direct dependency fix resolution express rescue worker helmet automatic remediation is available for this issue ws vulnerable library connect tgz high performance middleware framework library home page a href path to dependency file package json path to vulnerable library node modules connect package json dependency hierarchy express tgz root library x connect tgz vulnerable library found in head commit a href found in base branch main vulnerability details the methodoverride let the http post to override the method of the request with the value of the post key or with the header which allows xss attack publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction required scope changed impact metrics confidentiality impact low integrity impact low availability impact none for more information on scores click a href suggested fix type upgrade version release date fix resolution connect direct dependency fix resolution express rescue worker helmet automatic remediation is available for this issue cve vulnerable library connect tgz high performance middleware 
framework library home page a href path to dependency file package json path to vulnerable library node modules connect package json dependency hierarchy express tgz root library x connect tgz vulnerable library found in head commit a href found in base branch main vulnerability details node connects before has cross site scripting in sencha labs connect middleware vulnerability due to incomplete fix for cve publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction required scope changed impact metrics confidentiality impact low integrity impact low availability impact none for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution connect direct dependency fix resolution express rescue worker helmet automatic remediation is available for this issue cve vulnerable library connect tgz high performance middleware framework library home page a href path to dependency file package json path to vulnerable library node modules connect package json dependency hierarchy express tgz root library x connect tgz vulnerable library found in head commit a href found in base branch main vulnerability details connect node module before suffers from a cross site scripting xss vulnerability due to a lack of validation of file in directory js middleware publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required low user interaction required scope changed impact metrics confidentiality impact low integrity impact low availability impact none for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution connect direct dependency fix resolution express rescue worker helmet automatic remediation is available for this issue cve vulnerable library send tgz better streaming static file server with range and conditional get support library home page a href path to dependency file package json path to vulnerable library node modules send package json dependency hierarchy express tgz root library x send tgz vulnerable library found in head commit a href found in base branch main vulnerability details the send package before for node js allows attackers to obtain the root path via unspecified vectors publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction none scope unchanged impact metrics confidentiality impact low integrity impact none availability impact none for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution send direct dependency fix resolution express rescue worker helmet automatic remediation is available for this issue cve vulnerable library qs tgz querystring parser library home page a href path to dependency file package json path to vulnerable library node modules qs package json dependency hierarchy express tgz root library connect tgz x qs tgz vulnerable library found in head commit a href found in base branch main vulnerability details the qs module before in node js does not call the compact function for array data which allows remote attackers to cause a denial of service memory consumption by using a large index value to create a sparse array publish date url a href cvss score details base score 
metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction none scope unchanged impact metrics confidentiality impact none integrity impact none availability impact low for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution qs direct dependency fix resolution express rescue worker helmet automatic remediation is available for this issue cve vulnerable library cookie signature tgz sign and unsign cookies library home page a href path to dependency file package json path to vulnerable library node modules cookie signature package json dependency hierarchy express tgz root library x cookie signature tgz vulnerable library found in head commit a href found in base branch main vulnerability details node cookie signature before is affected by a timing attack due to the type of comparison used publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity high privileges required high user interaction none scope unchanged impact metrics confidentiality impact high integrity impact none availability impact none for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution cookie signature cookie signature direct dependency fix resolution express rescue worker helmet automatic remediation is available for this issue rescue worker helmet automatic remediation is available for this issue
| 0
|
18,878
| 24,814,287,706
|
IssuesEvent
|
2022-10-25 11:59:10
|
zotero/zotero
|
https://api.github.com/repos/zotero/zotero
|
closed
|
Wrong given name disambiguation after editing author
|
Word Processor Integration Bug P2
|
https://forums.zotero.org/discussion/comment/419688/#Comment_419688
I can reproduce this. Restarting Zotero fixes it.
This probably is contributing to a lot of the confusion we see around disambiguation…
|
1.0
|
Wrong given name disambiguation after editing author - https://forums.zotero.org/discussion/comment/419688/#Comment_419688
I can reproduce this. Restarting Zotero fixes it.
This probably is contributing to a lot of the confusion we see around disambiguation…
|
process
|
wrong given name disambiguation after editing author i can reproduce this restarting zotero fixes it this probably is contributing to a lot of the confusion we see around disambiguation…
| 1
|
21,047
| 27,992,171,875
|
IssuesEvent
|
2023-03-27 05:17:33
|
open-telemetry/opentelemetry-collector-contrib
|
https://api.github.com/repos/open-telemetry/opentelemetry-collector-contrib
|
closed
|
[attributeprocessor] Support applying actions to only a sample percentage of traces/metrics/logs
|
Stale processor/attributes closed as inactive
|
**Is your feature request related to a problem? Please describe.**
We would like to be able to apply an insert/update/upsert action to only a sample percentage of traces/metrics/logs in order to curb attribute bloating on all telemetry.
**Describe the solution you'd like**
To the Attribute Processor and Resource Processor, add the ability to configure the percent probability that a piece of telemetry is sampled such that all the actions defined for the processor are applied only to those that were sampled.
An example configuration would look like
```yaml
processors:
attributes:
actions:
- action: insert
key: source.name
value: "my source"
- action: insert
key: source.id
value: 1234
sampling_percentage: 15.5
```
In this case 15.5% of traces/metrics/logs have the attributes source.name and source.id inserted.
We have a custom processor that implements this feature which we are willing to contribute.
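A minimal sketch of the decision the proposed `sampling_percentage` option implies (a language-agnostic illustration; the actual collector processors are written in Go, and every name here is hypothetical):

```typescript
// Decide, per telemetry item, whether the configured actions apply.
// With samplingPercentage = 15.5, roughly 15.5% of items are chosen.
function shouldApplyActions(samplingPercentage: number): boolean {
  return Math.random() * 100 < samplingPercentage;
}

type Item = { attributes: Record<string, string | number> };

// Gate the configured attribute insertions behind the sampling decision.
function process(item: Item, samplingPercentage: number): Item {
  if (shouldApplyActions(samplingPercentage)) {
    item.attributes["source.name"] = "my source";
    item.attributes["source.id"] = 1234;
  }
  return item;
}
```

A production implementation would more likely hash a stable ID (for example the trace ID) rather than call a random generator, so that all spans of one trace receive a consistent decision.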
|
1.0
|
[attributeprocessor] Support applying actions to only a sample percentage of traces/metrics/logs - **Is your feature request related to a problem? Please describe.**
We would like to be able to apply an insert/update/upsert action to only a sample percentage of traces/metrics/logs in order to curb attribute bloating on all telemetry.
**Describe the solution you'd like**
To the Attribute Processor and Resource Processor, add the ability to configure the percent probability that a piece of telemetry is sampled such that all the actions defined for the processor are applied only to those that were sampled.
An example configuration would look like
```yaml
processors:
attributes:
actions:
- action: insert
key: source.name
value: "my source"
- action: insert
key: source.id
value: 1234
sampling_percentage: 15.5
```
In this case 15.5% of traces/metrics/logs have the attributes source.name and source.id inserted.
We have a custom processor that implements this feature which we are willing to contribute.
|
process
|
support applying actions to only a sample percentage of traces metrics logs is your feature request related to a problem please describe we would like to be able to apply an insert update upsert action to only a sample percentage of traces metrics logs in order to curb attribute bloating on all telemetry describe the solution you d like to the attribute processor and resource processor add the ability to configure the percent probability that a piece of telemetry is sampled such that all the actions defined for the processor are applied only to those that were sampled an example configuration would look like yaml processors attributes actions action insert key source name value my source action insert key source id value sampling percentage in this case of traces metrics logs have the attributes source name and source id inserted we have a custom processor that implements this feature which we are willing to contribute
| 1
|
359,884
| 10,682,142,153
|
IssuesEvent
|
2019-10-22 03:59:07
|
wso2/product-apim
|
https://api.github.com/repos/wso2/product-apim
|
closed
|
[Publisher/API-Product] Error when creating an API-product
|
3.0.0 Priority/High Resolution/Fixed Severity/Critical
|
When creating a product from an API resources which has parameters those parameters are not added to the API-Products resources.
|
1.0
|
[Publisher/API-Product] Error when creating an API-product - When creating a product from an API resources which has parameters those parameters are not added to the API-Products resources.
|
non_process
|
error when creating an api product when creating a product from an api resources which has parameters those parameters are not added to the api products resources
| 0
|
220,096
| 16,887,492,630
|
IssuesEvent
|
2021-06-23 03:37:53
|
aniketmaurya/chitra
|
https://api.github.com/repos/aniketmaurya/chitra
|
opened
|
Update README with Chitra bounding Box
|
documentation
|
#### Is your feature request related to a problem? Please describe.
A clear and concise description of what the problem is. Ex. I'm always frustrated when [...]
#### Describe the solution you'd like
A clear and concise description of what you want to happen.
#### Describe alternatives you've considered
A clear and concise description of any alternative solutions or features you've considered.
#### Additional context
Add any other context or screenshots about the feature request here.
|
1.0
|
Update README with Chitra bounding Box - #### Is your feature request related to a problem? Please describe.
A clear and concise description of what the problem is. Ex. I'm always frustrated when [...]
#### Describe the solution you'd like
A clear and concise description of what you want to happen.
#### Describe alternatives you've considered
A clear and concise description of any alternative solutions or features you've considered.
#### Additional context
Add any other context or screenshots about the feature request here.
|
non_process
|
update readme with chitra bounding box is your feature request related to a problem please describe a clear and concise description of what the problem is ex i m always frustrated when describe the solution you d like a clear and concise description of what you want to happen describe alternatives you ve considered a clear and concise description of any alternative solutions or features you ve considered additional context add any other context or screenshots about the feature request here
| 0
|
9,743
| 12,733,655,038
|
IssuesEvent
|
2020-06-25 12:41:09
|
prisma/prisma-client-js
|
https://api.github.com/repos/prisma/prisma-client-js
|
closed
|
Improve sqlite error messages
|
bug/2-confirmed kind/bug process/next-milestone team/typescript
|
Also, the sqlite error messages are not the best: they are not colorized, and the error is either somewhat hidden or doesn't tell you in detail what went wrong.
First one
<img width="808" alt="Screen Shot 2020-04-24 at 15 19 24" src="https://user-images.githubusercontent.com/1328733/80221817-1d8f8880-8646-11ea-8b10-11a6fb087ca7.png">
Second one
<img width="734" alt="Screen Shot 2020-04-24 at 15 59 16" src="https://user-images.githubusercontent.com/1328733/80221846-27b18700-8646-11ea-8d59-e967b213d367.png">
_Originally posted by @Jolg42 in https://github.com/prisma/prisma/issues/2295#issuecomment-619036556_
|
1.0
|
Improve sqlite error messages - Also, the sqlite error messages are not the best: they are not colorized, and the error is either somewhat hidden or doesn't tell you in detail what went wrong.
First one
<img width="808" alt="Screen Shot 2020-04-24 at 15 19 24" src="https://user-images.githubusercontent.com/1328733/80221817-1d8f8880-8646-11ea-8b10-11a6fb087ca7.png">
Second one
<img width="734" alt="Screen Shot 2020-04-24 at 15 59 16" src="https://user-images.githubusercontent.com/1328733/80221846-27b18700-8646-11ea-8d59-e967b213d367.png">
_Originally posted by @Jolg42 in https://github.com/prisma/prisma/issues/2295#issuecomment-619036556_
|
process
|
improve sqlite error messages also the sqlite error messages are not the best they are not colorized and it s either kind of hidden or doesn t tell you what went wrong in details first one img width alt screen shot at src second one img width alt screen shot at src originally posted by in
| 1
|
763,888
| 26,777,269,684
|
IssuesEvent
|
2023-01-31 18:06:28
|
OpenNebula/one
|
https://api.github.com/repos/OpenNebula/one
|
closed
|
Add option to upload files in Sunstone support integration
|
Status: Pending Category: Sunstone Type: Feature Sponsored Priority: Normal Category: FireEdge
|
**Description**
Currently in Sunstone there is no option to upload a file using the support integration; you have to go to the support portal and upload there.
**Use case**
Upload files using the integration.
**Interface Changes**
Sunstone.
<!--////////////////////////////////////////////-->
<!-- THIS SECTION IS FOR THE DEVELOPMENT TEAM -->
<!-- BOTH FOR BUGS AND ENHANCEMENT REQUESTS -->
<!-- PROGRESS WILL BE REFLECTED HERE -->
<!--////////////////////////////////////////////-->
## Progress Status
- [ ] Branch created
- [ ] Code committed to development branch
- [ ] Testing - QA
- [ ] Documentation
- [ ] Release notes - resolved issues, compatibility, known issues
- [ ] Code committed to upstream release/hotfix branches
- [ ] Documentation committed to upstream release/hotfix branches
|
1.0
|
Add option to upload files in Sunstone support integration - **Description**
Currently in Sunstone there is no option to upload a file using the support integration; you have to go to the support portal and upload there.
**Use case**
Upload files using the integration.
**Interface Changes**
Sunstone.
<!--////////////////////////////////////////////-->
<!-- THIS SECTION IS FOR THE DEVELOPMENT TEAM -->
<!-- BOTH FOR BUGS AND ENHANCEMENT REQUESTS -->
<!-- PROGRESS WILL BE REFLECTED HERE -->
<!--////////////////////////////////////////////-->
## Progress Status
- [ ] Branch created
- [ ] Code committed to development branch
- [ ] Testing - QA
- [ ] Documentation
- [ ] Release notes - resolved issues, compatibility, known issues
- [ ] Code committed to upstream release/hotfix branches
- [ ] Documentation committed to upstream release/hotfix branches
|
non_process
|
add option to upload files in sunstone support integration description currently in sunstone there is no option to upload a file using the support integration you have to go to the support portal and upload there use case upload files using the integration interface changes sunstone progress status branch created code committed to development branch testing qa documentation release notes resolved issues compatibility known issues code committed to upstream release hotfix branches documentation committed to upstream release hotfix branches
| 0
|
9,595
| 12,543,164,600
|
IssuesEvent
|
2020-06-05 15:06:43
|
cypress-io/cypress
|
https://api.github.com/repos/cypress-io/cypress
|
closed
|
Flaky internal test: Firefox "expected '<area>' to be 'focused'"
|
browser: firefox process: flaky test stage: ready for work type: chore
|
### Current behavior:
I see this test fail occasionally; it passes if I rerun.
**CircleCI failure**: https://circleci.com/gh/cypress-io/cypress/327711
**Test Code**: https://github.com/cypress-io/cypress/blob/develop/packages/driver/test/cypress/integration/e2e/focus_blur_spec.js#L672:L672
❌
```
AssertionError: Timed out retrying: expected '<area>' to be 'focused'
@cypress:///./cypress/integration/e2e/focus_blur_spec.js:500:26
getRet@cypress:///../driver/src/cy/commands/connectors.js:144:20
tryCatcher@cypress:////root/cypress/node_modules/bluebird/js/release/util.js:16:23
module.exports/Promise.try@cypress:////root/cypress/node_modules/bluebird/js/release/method.js:39:29
thenFn@cypress:///../driver/src/cy/commands/connectors.js:162:20
then@cypress:///../driver/src/cy/commands/connectors.js:601:21
wrapByType/<@cypress:///../driver/src/cypress/cy.js:944:21
runCommand/<@cypress:///../driver/src/cypress/cy.js:388:15
tryCatcher@cypress:////root/cypress/node_modules/bluebird/js/release/util.js:16:23
module.exports/Promise.prototype._settlePromiseFromHandler@cypress:////root/cypress/node_modules/bluebird/js/release/promise.js:512:31
module.exports/Promise.prototype._settlePromise@cypress:////root/cypress/node_modules/bluebird/js/release/promise.js:569:18
module.exports/Promise.prototype._settlePromiseCtx@cypress:////root/cypress/node_modules/bluebird/js/release/promise.js:606:10
_drainQueueStep@cypress:////root/cypress/node_modules/bluebird/js/release/async.js:142:12
_drainQueue@cypress:////root/cypress/node_modules/bluebird/js/release/async.js:131:24
Async.prototype._drainQueues@cypress:////root/cypress/node_modules/bluebird/js/release/async.js:147:16
Async/this.drainQueues@cypress:////root/cypress/node_modules/bluebird/js/release/async.js:17:14
```
### Versions
4.5.0
|
1.0
|
Flaky internal test: Firefox "expected '<area>' to be 'focused'" - ### Current behavior:
I see this test fail occasionally; it passes if I rerun.
**CircleCI failure**: https://circleci.com/gh/cypress-io/cypress/327711
**Test Code**: https://github.com/cypress-io/cypress/blob/develop/packages/driver/test/cypress/integration/e2e/focus_blur_spec.js#L672:L672
❌
```
AssertionError: Timed out retrying: expected '<area>' to be 'focused'
@cypress:///./cypress/integration/e2e/focus_blur_spec.js:500:26
getRet@cypress:///../driver/src/cy/commands/connectors.js:144:20
tryCatcher@cypress:////root/cypress/node_modules/bluebird/js/release/util.js:16:23
module.exports/Promise.try@cypress:////root/cypress/node_modules/bluebird/js/release/method.js:39:29
thenFn@cypress:///../driver/src/cy/commands/connectors.js:162:20
then@cypress:///../driver/src/cy/commands/connectors.js:601:21
wrapByType/<@cypress:///../driver/src/cypress/cy.js:944:21
runCommand/<@cypress:///../driver/src/cypress/cy.js:388:15
tryCatcher@cypress:////root/cypress/node_modules/bluebird/js/release/util.js:16:23
module.exports/Promise.prototype._settlePromiseFromHandler@cypress:////root/cypress/node_modules/bluebird/js/release/promise.js:512:31
module.exports/Promise.prototype._settlePromise@cypress:////root/cypress/node_modules/bluebird/js/release/promise.js:569:18
module.exports/Promise.prototype._settlePromiseCtx@cypress:////root/cypress/node_modules/bluebird/js/release/promise.js:606:10
_drainQueueStep@cypress:////root/cypress/node_modules/bluebird/js/release/async.js:142:12
_drainQueue@cypress:////root/cypress/node_modules/bluebird/js/release/async.js:131:24
Async.prototype._drainQueues@cypress:////root/cypress/node_modules/bluebird/js/release/async.js:147:16
Async/this.drainQueues@cypress:////root/cypress/node_modules/bluebird/js/release/async.js:17:14
```
### Versions
4.5.0
|
process
|
flaky internal test firefox expected to be focused current behavior i see this test fail occasionally passes if i rerun circleci failure test code assertionerror timed out retrying expected to be focused cypress cypress integration focus blur spec js getret cypress driver src cy commands connectors js trycatcher cypress root cypress node modules bluebird js release util js module exports promise try cypress root cypress node modules bluebird js release method js thenfn cypress driver src cy commands connectors js then cypress driver src cy commands connectors js wrapbytype cypress driver src cypress cy js runcommand cypress driver src cypress cy js trycatcher cypress root cypress node modules bluebird js release util js module exports promise prototype settlepromisefromhandler cypress root cypress node modules bluebird js release promise js module exports promise prototype settlepromise cypress root cypress node modules bluebird js release promise js module exports promise prototype settlepromisectx cypress root cypress node modules bluebird js release promise js drainqueuestep cypress root cypress node modules bluebird js release async js drainqueue cypress root cypress node modules bluebird js release async js async prototype drainqueues cypress root cypress node modules bluebird js release async js async this drainqueues cypress root cypress node modules bluebird js release async js versions
| 1
|
477,831
| 13,768,938,848
|
IssuesEvent
|
2020-10-07 17:50:30
|
code4romania/mon-vot-android-kotlin
|
https://api.github.com/repos/code4romania/mon-vot-android-kotlin
|
closed
|
Load about page links from remote configs
|
autumn-2020 enhancement hacktoberfest kotlin medium-priority
|
The following links were added in firebase remote configs so we can easily update them:
- `contact_email`
- `privacy_policy_url`
When opening the about screen, attempt to load the info from remote configs and only use the values from the properties files as backup.
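A sketch of the load-with-fallback pattern described above (the app itself is Kotlin with Firebase Remote Config; this is a generic illustration and every name in it is hypothetical):

```typescript
interface AboutLinks {
  contactEmail: string;
  privacyPolicyUrl: string;
}

// Values bundled with the app (the properties-file backup).
const bundledDefaults: AboutLinks = {
  contactEmail: "contact@example.org",
  privacyPolicyUrl: "https://example.org/privacy",
};

// Try remote config first; fall back to the bundled values on any failure
// or for any key the remote side does not provide.
async function loadAboutLinks(
  fetchRemote: () => Promise<Partial<AboutLinks>>
): Promise<AboutLinks> {
  try {
    const remote = await fetchRemote();
    return { ...bundledDefaults, ...remote };
  } catch {
    return bundledDefaults;
  }
}
```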
|
1.0
|
Load about page links from remote configs - The following links were added in firebase remote configs so we can easily update them:
- `contact_email`
- `privacy_policy_url`
When opening the about screen, attempt to load the info from remote configs and only use the values from the properties files as backup.
|
non_process
|
load about page links from remote configs the following links were added in firebase remote configs so we can easily update them contact email privacy policy url when opening the about screen attempt to load the info from remote configs and only use the values from the properties files as backup
| 0
|
274,884
| 30,188,249,331
|
IssuesEvent
|
2023-07-04 13:34:07
|
gabriel-milan/denoising-autoencoder
|
https://api.github.com/repos/gabriel-milan/denoising-autoencoder
|
opened
|
WS-2022-0072 (High) detected in tensorflow-2.5.0-cp37-cp37m-manylinux2010_x86_64.whl
|
Mend: dependency security vulnerability
|
## WS-2022-0072 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>tensorflow-2.5.0-cp37-cp37m-manylinux2010_x86_64.whl</b></p></summary>
<p>TensorFlow is an open source machine learning framework for everyone.</p>
<p>Library home page: <a href="https://files.pythonhosted.org/packages/aa/fd/993aa1333eb54d9f000863fe8ec61e41d12eb833dea51484c76c038718b5/tensorflow-2.5.0-cp37-cp37m-manylinux2010_x86_64.whl">https://files.pythonhosted.org/packages/aa/fd/993aa1333eb54d9f000863fe8ec61e41d12eb833dea51484c76c038718b5/tensorflow-2.5.0-cp37-cp37m-manylinux2010_x86_64.whl</a></p>
<p>Path to dependency file: /training/requirements.txt</p>
<p>Path to vulnerable library: /training/requirements.txt</p>
<p>
Dependency Hierarchy:
- :x: **tensorflow-2.5.0-cp37-cp37m-manylinux2010_x86_64.whl** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/gabriel-milan/denoising-autoencoder/commit/ee5b7840072a1fc880b0723210f781f0b23412df">ee5b7840072a1fc880b0723210f781f0b23412df</a></p>
<p>Found in base branch: <b>master</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png?' width=19 height=20> Vulnerability Details</summary>
<p>
Improper Validation of Integrity Check Value in TensorFlow
<p>Publish Date: 2022-02-10
<p>URL: <a href="https://github.com/tensorflow/tensorflow/commit/61bf91e768173b001d56923600b40d9a95a04ad5">WS-2022-0072</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>7.0</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Local
- Attack Complexity: High
- Privileges Required: Low
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: High
- Integrity Impact: High
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://github.com/advisories/GHSA-43q8-3fv7-pr5x">https://github.com/advisories/GHSA-43q8-3fv7-pr5x</a></p>
<p>Release Date: 2022-02-10</p>
<p>Fix Resolution: 2.5.3</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with Mend [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
|
True
|
WS-2022-0072 (High) detected in tensorflow-2.5.0-cp37-cp37m-manylinux2010_x86_64.whl - ## WS-2022-0072 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>tensorflow-2.5.0-cp37-cp37m-manylinux2010_x86_64.whl</b></p></summary>
<p>TensorFlow is an open source machine learning framework for everyone.</p>
<p>Library home page: <a href="https://files.pythonhosted.org/packages/aa/fd/993aa1333eb54d9f000863fe8ec61e41d12eb833dea51484c76c038718b5/tensorflow-2.5.0-cp37-cp37m-manylinux2010_x86_64.whl">https://files.pythonhosted.org/packages/aa/fd/993aa1333eb54d9f000863fe8ec61e41d12eb833dea51484c76c038718b5/tensorflow-2.5.0-cp37-cp37m-manylinux2010_x86_64.whl</a></p>
<p>Path to dependency file: /training/requirements.txt</p>
<p>Path to vulnerable library: /training/requirements.txt</p>
<p>
Dependency Hierarchy:
- :x: **tensorflow-2.5.0-cp37-cp37m-manylinux2010_x86_64.whl** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/gabriel-milan/denoising-autoencoder/commit/ee5b7840072a1fc880b0723210f781f0b23412df">ee5b7840072a1fc880b0723210f781f0b23412df</a></p>
<p>Found in base branch: <b>master</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png?' width=19 height=20> Vulnerability Details</summary>
<p>
Improper Validation of Integrity Check Value in TensorFlow
<p>Publish Date: 2022-02-10
<p>URL: <a href="https://github.com/tensorflow/tensorflow/commit/61bf91e768173b001d56923600b40d9a95a04ad5">WS-2022-0072</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>7.0</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Local
- Attack Complexity: High
- Privileges Required: Low
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: High
- Integrity Impact: High
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://github.com/advisories/GHSA-43q8-3fv7-pr5x">https://github.com/advisories/GHSA-43q8-3fv7-pr5x</a></p>
<p>Release Date: 2022-02-10</p>
<p>Fix Resolution: 2.5.3</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with Mend [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
|
non_process
|
ws high detected in tensorflow whl ws high severity vulnerability vulnerable library tensorflow whl tensorflow is an open source machine learning framework for everyone library home page a href path to dependency file training requirements txt path to vulnerable library training requirements txt dependency hierarchy x tensorflow whl vulnerable library found in head commit a href found in base branch master vulnerability details improper validation of integrity check value in tensorflow publish date url a href cvss score details base score metrics exploitability metrics attack vector local attack complexity high privileges required low user interaction none scope unchanged impact metrics confidentiality impact high integrity impact high availability impact high for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution step up your open source security game with mend
| 0
|
4,480
| 7,343,519,183
|
IssuesEvent
|
2018-03-07 11:39:29
|
fablabbcn/fablabs.io
|
https://api.github.com/repos/fablabbcn/fablabs.io
|
closed
|
Fablab is unable to add location information when applying for a new laboratory
|
Approval Process
|
When we apply for a new laboratory, the laboratory map is not shown in the application list.
I wonder if this is also a problem with the map interface?
What can we do to solve this problem?
|
1.0
|
Fablab is unable to add location information when applying for a new laboratory - When we apply for a new laboratory, the laboratory map is not shown in the application list.
I wonder if this is also a problem with the map interface?
What can we do to solve this problem?
|
process
|
fablab is unable to add location information when applying for a new laboratory when we apply for a new laboratory the laboratory map is not shown in the application list i wonder if this is a map interface in addition to the problem what can we do to solve this problem?
| 1
|
12,245
| 14,744,084,793
|
IssuesEvent
|
2021-01-07 14:49:29
|
kdjstudios/SABillingGitlab
|
https://api.github.com/repos/kdjstudios/SABillingGitlab
|
closed
|
Winnipeg - Billed for Holiday Charges on a Terminated Account
|
anc-process anp-urgent ant-bug ant-child/secondary
|
In GitLab by @kdjstudios on Dec 30, 2019, 15:55
**Submitted by:** "Elizabeth Fed" <elizabeth.fed@answernet.com>
**Helpdesk:** http://www.servicedesk.answernet.com/profiles/ticket/2019-12-30-24545/conversation
**Server:** Internal
**Client/Site:** Winnipeg
**Account:** 86465
**Issue:**
Site: 075 Winnipeg
Prairie HVAC & Refrigeration 075-86465 terminated the service on 04/24/2019; however, they received an invoice (075-16249, dated 01/01/2020) on which they were charged a holiday fee of $15.00.
|
1.0
|
Winnipeg - Billed for Holiday Charges on a Terminated Account - In GitLab by @kdjstudios on Dec 30, 2019, 15:55
**Submitted by:** "Elizabeth Fed" <elizabeth.fed@answernet.com>
**Helpdesk:** http://www.servicedesk.answernet.com/profiles/ticket/2019-12-30-24545/conversation
**Server:** Internal
**Client/Site:** Winnipeg
**Account:** 86465
**Issue:**
Site: 075 Winnipeg
Prairie HVAC & Refrigeration 075-86465 terminated the service on 04/24/2019; however, they received an invoice (075-16249, dated 01/01/2020) on which they were charged a holiday fee of $15.00.
|
process
|
winnipeg billed for holiday charges on a terminated account in gitlab by kdjstudios on dec submitted by elizabeth fed helpdesk server internal client site winnipeg account issue site winnipeg prairie hvac refrigeration terminated the service on however received an invoice in which they were charged for holiday fee of on invoice
| 1
|
7,528
| 10,601,175,791
|
IssuesEvent
|
2019-10-10 11:45:43
|
code4romania/expert-consultation-api
|
https://api.github.com/repos/code4romania/expert-consultation-api
|
opened
|
[Documents] Modify comment feature to take into account the new document format
|
document processing documents enhancement help wanted java spring
|
The Comment feature needs to be adapted to the changes to the document structure.
The article, chapter and document controllers need to be changed into a single controller for DocumentSection / DocumentNode comments.
The comment data model needs to be updated.
|
1.0
|
[Documents] Modify comment feature to take into account the new document format - The Comment feature needs to be adapted to the changes to the document structure.
The article, chapter and document controllers need to be changed into a single controller for DocumentSection / DocumentNode comments.
The comment data model needs to be updated.
|
process
|
modify comment feature to take into account the new document format the comment feature needs to be adapted to the changes to the document structure the article chapter and document controllers need to be changed into a single controller for documentsection documentnode comments the comment data model needs to be updated
| 1
|
301,055
| 22,711,002,261
|
IssuesEvent
|
2022-07-05 19:21:29
|
livepeer/livepeer-studio-docs
|
https://api.github.com/repos/livepeer/livepeer-studio-docs
|
closed
|
Create Annotated Sample Applications List (About Livepeer, a.k.a. Studio 101)
|
documentation
|
- Write a table with a linked annotated list of Sample applications available in LPStudio
- Names of samples are linked to their location in git-hub
- with a description of each sample app in the table
See -- https://github.com/livepeer/livepeer-studio-docs/blob/main/docs/studio101/sampleapps101.mdx
- change the title of the document to "Sample Apps"
- Add an introductory paragraph about Sample Apps (purpose of the page, objective(s)/outcomes)
- change the toc to 'about-livepeer/sampleapps.mdx` (check with @Shih-Yu about this)
- push to created branch 11-create-annotated-applications-list...
Add "assign reviewers" capability in github and assign Paige to review.
|
1.0
|
Create Annotated Sample Applications List (About Livepeer, a.k.a. Studio 101) - - Write a table with a linked annotated list of Sample applications available in LPStudio
- Names of samples are linked to their location in git-hub
- with a description of each sample app in the table
See -- https://github.com/livepeer/livepeer-studio-docs/blob/main/docs/studio101/sampleapps101.mdx
- change the title of the document to "Sample Apps"
- Add an introductory paragraph about Sample Apps (purpose of the page, objective(s)/outcomes)
- change the toc to 'about-livepeer/sampleapps.mdx` (check with @Shih-Yu about this)
- push to created branch 11-create-annotated-applications-list...
Add "assign reviewers" capability in github and assign Paige to review.
|
non_process
|
create annotated sample applications list about livepeer a k a studio write a table with a linked annotated list of sample applications available in lpstudio names of samples are linked to their location in git hub with a description of each sample app in the table see change the title of the document to sample apps add an introductory paragraph about sample apps purpose of the page objective s outcomes change the toc to about livepeer sampleapps mdx check with shih yu about this push to created branch create annotated applications list add assign reviewers capability in github and assign paige to review
| 0
|
15,024
| 18,739,068,509
|
IssuesEvent
|
2021-11-04 11:24:09
|
ethereum/EIPs
|
https://api.github.com/repos/ethereum/EIPs
|
closed
|
EIP-1: Include RFC 2119 recommendation into EIP-1 so EIPs do not need to duplicate it
|
type: EIP1 (Process)
|
Pretty much every "seriously written" EIP duplicates that paragraph from RFC 2119.
|
1.0
|
EIP-1: Include RFC 2119 recommendation into EIP-1 so EIPs do not need to duplicate it - Pretty much every "seriously written" EIP duplicates that paragraph from RFC 2119.
|
process
|
eip include rfc recommendation into eip so eips do not need to duplicate it pretty much every seriously written eip duplicates that paragraph from rfc
| 1
|
16,014
| 20,188,225,752
|
IssuesEvent
|
2022-02-11 01:19:36
|
savitamittalmsft/WAS-SEC-TEST
|
https://api.github.com/repos/savitamittalmsft/WAS-SEC-TEST
|
opened
|
Configure and collect network traffic logs
|
WARP-Import WAF FEB 2021 Security Performance and Scalability Capacity Management Processes Networking & Connectivity Connectivity
|
[Configure and collect network traffic logs](https://docs.microsoft.com/azure/architecture/framework/security/monitor-identity-network#enable-network-visibility)
**Why Consider This?**
NSG flow logs should be captured and analyzed to monitor security. NSG flow logging enables Traffic Analytics to gain insights into internal and external traffic flows of an application.
**Context**
**Suggested Actions**
Configure and collect network traffic logs, either with NSG flow logs or with Azure Firewall logs.
**Learn More**
https://docs.microsoft.com/en-us/azure/network-watcher/network-watcher-nsg-flow-logging-overview
https://docs.microsoft.com/en-us/azure/firewall/logs-and-metrics
|
1.0
|
Configure and collect network traffic logs - [Configure and collect network traffic logs](https://docs.microsoft.com/azure/architecture/framework/security/monitor-identity-network#enable-network-visibility)
**Why Consider This?**
NSG flow logs should be captured and analyzed to monitor security. NSG flow logging enables Traffic Analytics to gain insights into internal and external traffic flows of an application.
**Context**
**Suggested Actions**
Configure and collect network traffic logs, either with NSG flow logs or with Azure Firewall logs.
**Learn More**
https://docs.microsoft.com/en-us/azure/network-watcher/network-watcher-nsg-flow-logging-overview
https://docs.microsoft.com/en-us/azure/firewall/logs-and-metrics
|
process
|
configure and collect network traffic logs why consider this nsg flow logs should be captured and analyzed to monitor security nsg flow logging enables traffic analytics to gain insights into internal and external traffic flows of an application context suggested actions configure and collect network traffic logs either with nsg flow logs or with azure firewall logs learn more
| 1
|
186,738
| 14,406,876,752
|
IssuesEvent
|
2020-12-03 20:55:20
|
cockroachdb/cockroach
|
https://api.github.com/repos/cockroachdb/cockroach
|
closed
|
roachtest: cdc/ledger failed
|
C-test-failure O-roachtest O-robot branch-release-20.2 release-blocker
|
[(roachtest).cdc/ledger failed](https://teamcity.cockroachdb.com/viewLog.html?buildId=2463364&tab=buildLog) on [release-20.2@8729c06ed8e3baba67ab5651588b00e015e696ee](https://github.com/cockroachdb/cockroach/commits/8729c06ed8e3baba67ab5651588b00e015e696ee):
```
The test failed on branch=release-20.2, cloud=gce:
test artifacts and logs in: /home/agent/work/.go/src/github.com/cockroachdb/cockroach/artifacts/cdc/ledger/run_1
cdc.go:880,cdc.go:200,cdc.go:570,test_runner.go:755: max latency was more than allowed: 1m15.158335069s vs 1m0s
```
<details><summary>More</summary><p>
Artifacts: [/cdc/ledger](https://teamcity.cockroachdb.com/viewLog.html?buildId=2463364&tab=artifacts#/cdc/ledger)
Related:
- #52123 roachtest: cdc/ledger/rangefeed=true failed [C-test-failure](https://api.github.com/repos/cockroachdb/cockroach/labels/C-test-failure) [O-roachtest](https://api.github.com/repos/cockroachdb/cockroach/labels/O-roachtest) [O-robot](https://api.github.com/repos/cockroachdb/cockroach/labels/O-robot) [branch-release-20.1](https://api.github.com/repos/cockroachdb/cockroach/labels/branch-release-20.1)
[See this test on roachdash](https://roachdash.crdb.dev/?filter=status%3Aopen+t%3A.%2Acdc%2Fledger.%2A&sort=title&restgroup=false&display=lastcommented+project)
<sub>powered by [pkg/cmd/internal/issues](https://github.com/cockroachdb/cockroach/tree/master/pkg/cmd/internal/issues)</sub></p></details>
|
2.0
|
roachtest: cdc/ledger failed - [(roachtest).cdc/ledger failed](https://teamcity.cockroachdb.com/viewLog.html?buildId=2463364&tab=buildLog) on [release-20.2@8729c06ed8e3baba67ab5651588b00e015e696ee](https://github.com/cockroachdb/cockroach/commits/8729c06ed8e3baba67ab5651588b00e015e696ee):
```
The test failed on branch=release-20.2, cloud=gce:
test artifacts and logs in: /home/agent/work/.go/src/github.com/cockroachdb/cockroach/artifacts/cdc/ledger/run_1
cdc.go:880,cdc.go:200,cdc.go:570,test_runner.go:755: max latency was more than allowed: 1m15.158335069s vs 1m0s
```
<details><summary>More</summary><p>
Artifacts: [/cdc/ledger](https://teamcity.cockroachdb.com/viewLog.html?buildId=2463364&tab=artifacts#/cdc/ledger)
Related:
- #52123 roachtest: cdc/ledger/rangefeed=true failed [C-test-failure](https://api.github.com/repos/cockroachdb/cockroach/labels/C-test-failure) [O-roachtest](https://api.github.com/repos/cockroachdb/cockroach/labels/O-roachtest) [O-robot](https://api.github.com/repos/cockroachdb/cockroach/labels/O-robot) [branch-release-20.1](https://api.github.com/repos/cockroachdb/cockroach/labels/branch-release-20.1)
[See this test on roachdash](https://roachdash.crdb.dev/?filter=status%3Aopen+t%3A.%2Acdc%2Fledger.%2A&sort=title&restgroup=false&display=lastcommented+project)
<sub>powered by [pkg/cmd/internal/issues](https://github.com/cockroachdb/cockroach/tree/master/pkg/cmd/internal/issues)</sub></p></details>
|
non_process
|
roachtest cdc ledger failed on the test failed on branch release cloud gce test artifacts and logs in home agent work go src github com cockroachdb cockroach artifacts cdc ledger run cdc go cdc go cdc go test runner go max latency was more than allowed vs more artifacts related roachtest cdc ledger rangefeed true failed powered by
| 0
|
19,194
| 25,321,234,348
|
IssuesEvent
|
2022-11-18 04:13:17
|
open-telemetry/opentelemetry-collector-contrib
|
https://api.github.com/repos/open-telemetry/opentelemetry-collector-contrib
|
closed
|
Inject OTel tracestate when power-of-two probability sampling
|
processor/probabilisticsampler
|
**Proposal**
I propose to modify the [probabilistic sampling processor](https://github.com/open-telemetry/opentelemetry-collector-contrib/tree/main/processor/probabilisticsamplerprocessor) to emit an OTel tracestate corresponding with the sampling probability in use, which is well defined in the current specification when the sampling probability is a power of two.
Specifically, if the sampling probability is a power of two such that `Log2(probability) == -X` where X is an integer, then the corresponding Span's tracestate should be extended with an entry `ot=p:X`. Here, "p" is the base-2 logarithm of "adjusted count".
When the Span already has a p-value, probabilities multiply. If the span does not have a p-value, which formally means "unknown" adjusted count, **we will assume the count is 1 span** corresponding with probability=1 (i.e., `ot=p:0`).
If the span already has an `ot=p:Y` property, the correct output is `ot=p:Z` for `Z=X+Y` (multiplying probabilities == addition inside Log2).
For example, if performing 50% sampling then we are multiplying adjusted counts by 2. Spans with no adjusted count on arrival will depart with p-value 1, and Spans with an adjusted count on arrival will depart with p-value greater by 1.
Other `ot=` properties than `p`, such as `r`, SHOULD pass through unmodified.
**Additional context**
The tracestate fields used here are defined in
https://github.com/open-telemetry/opentelemetry-specification/blob/main/specification/trace/tracestate-probability-sampling.md
This specification defines support only for power-of-two adjusted counts. Support for non-power-of-two adjusted counts would require more specification work, but is certainly possible. The proposal here is to leave the processor logic unmodified, and only to extend `p-value` when the probability is a power of two. Some documentation will be required to warn users that arbitrary composition of these processors with mixed power-of-two and non-power-of-two probabilities will not behave correctly. Users who want this should use powers-of-two everywhere.
The type of sampling being performed here is considered "after-the-fact", since we are interpreting and mutating the Span data of finished spans. Trace context uses an r-value to describe a randomness value for contexts in flight to use making consistent decisions, whereas this processor needs only to interpret and set p-value.
|
1.0
|
Inject OTel tracestate when power-of-two probability sampling - **Proposal**
I propose to modify the [probabilistic sampling processor](https://github.com/open-telemetry/opentelemetry-collector-contrib/tree/main/processor/probabilisticsamplerprocessor) to emit an OTel tracestate corresponding with the sampling probability in use, which is well defined in the current specification when the sampling probability is a power of two.
Specifically, if the sampling probability is a power of two such that `Log2(probability) == -X` where X is an integer, then the corresponding Span's tracestate should be extended with an entry `ot=p:X`. Here, "p" is the base-2 logarithm of "adjusted count".
When the Span already has a p-value, probabilities multiply. If the span does not have a p-value, which formally means "unknown" adjusted count, **we will assume the count is 1 span** corresponding with probability=1 (i.e., `ot=p:0`).
If the span already has an `ot=p:Y` property, the correct output is `ot=p:Z` for `Z=X+Y` (multiplying probabilities == addition inside Log2).
For example, if performing 50% sampling then we are multiplying adjusted counts by 2. Spans with no adjusted count on arrival will depart with p-value 1, and Spans with an adjusted count on arrival will depart with p-value greater by 1.
Other `ot=` properties than `p`, such as `r`, SHOULD pass through unmodified.
**Additional context**
The tracestate fields used here are defined in
https://github.com/open-telemetry/opentelemetry-specification/blob/main/specification/trace/tracestate-probability-sampling.md
This specification defines support only for power-of-two adjusted counts. Support for non-power-of-two adjusted counts would require more specification work, but is certainly possible. The proposal here is to leave the processor logic unmodified, and only to extend `p-value` when the probability is a power of two. Some documentation will be required to warn users that arbitrary composition of these processors with mixed power-of-two and non-power-of-two probabilities will not behave correctly. Users who want this should use powers-of-two everywhere.
The type of sampling being performed here is considered "after-the-fact", since we are interpreting and mutating the Span data of finished spans. Trace context uses an r-value to describe a randomness value for contexts in flight to use making consistent decisions, whereas this processor needs only to interpret and set p-value.
|
process
|
inject otel tracestate when power of two probability sampling proposal i propose to modify the to emit an otel tracestate corresponding with the sampling probability in use which is well defined in the current specification when the sampling probability is a power of two specifically if the sampling probability is a power of two such that probability x where x is an integer then the corresponding span s tracestate should be extended with an entry ot p x here p is the base logarithm of adjusted count when the span already has a p value probabilities multiply if the span does not have a p value which formally means unknown adjusted count we will assume the count is span corresponding with probability i e ot p if the span already has an ot p y property the correct output is ot p z for z x y multiplying probabilities addition inside for example if performing sampling then we are multiplying adjusted counts by spans with no adjusted count on arrival will depart with p value and spans with an adjusted count on arrival will depart with p value greater by other ot properties than p such as r should pass through unmodified additional context the tracestate fields used here are defined in this specification defines support only for power of two adjusted counts support for non power of two adjusted counts would require more specification work but is certainly possible the proposal here is to leave the processor logic here unmodified and only to extend p value when the probability is a power of two some documentation will be required to warn users that arbitrary composition of these processors having mixed power of two and non power of two probabilities will not behave correctly users who want this should user powers of two everywhere the type of sampling being performed here is considered after the fact since we are interpreting and mutating the span data of finished spans trace context uses an r value to describe a randomness value for contexts in flight to use making consistent decisions whereas this processor needs only to interpret and set p value
| 1
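A brief illustration of the p-value arithmetic in the OpenTelemetry proposal above: since multiplying sampling probabilities corresponds to adding base-2 exponents, composing adjusted counts reduces to integer addition. The Python sketch below is illustrative only; the function name and the bare `ot=p:X` string handling are simplifying assumptions, not the collector's implementation.
```python
import math

def compose_p_value(sampling_probability: float, tracestate: str = "") -> str:
    """Compose an outgoing `ot=p:Z` entry for power-of-two sampling.

    A span with no incoming p-value is treated as adjusted count 1,
    i.e. probability 1 (`ot=p:0`), as the proposal above specifies.
    """
    x = -math.log2(sampling_probability)
    if not x.is_integer():
        raise ValueError("p-value is only defined for power-of-two probabilities")

    y = 0  # missing p-value: assume adjusted count 1
    if tracestate.startswith("ot=p:"):
        y = int(tracestate[len("ot=p:"):])

    # Multiplying probabilities == addition of base-2 exponents (Z = X + Y).
    return f"ot=p:{int(x) + y}"

# 50% sampling of a span already carrying ot=p:2 yields ot=p:3,
# i.e. an overall adjusted count of 2**3 == 8.
assert compose_p_value(0.5, "ot=p:2") == "ot=p:3"
assert compose_p_value(0.25) == "ot=p:2"
```
Note how the absent-p-value case defaults to `ot=p:0`, matching the issue's assumption that an unknown adjusted count is treated as 1 span.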
|
300,923
| 26,003,097,407
|
IssuesEvent
|
2022-12-20 16:50:07
|
pmd/pmd
|
https://api.github.com/repos/pmd/pmd
|
closed
|
[test] Tests that change the logging level do not work
|
a:bug in:testing
|
**Affects PMD Version:** 7.0.x - because of SLF4J #896
**Description:**
There are a couple of tests that change the logging level in the pmd 7 branch, to test the `--debug` cli switch for instance:
https://github.com/pmd/pmd/blob/1f43af7d831e5b2d63b13d5b2c8f22916609379a/pmd-cli/src/test/java/net/sourceforge/pmd/cli/PmdCliTest.java#L174
I'm afraid this only works by chance... When the trace level is enabled we are supposed to see log statements like `Adding file ...` from the file collector. If you add an assert like `assertThat(log, containsString("Adding file"));` to the linked test you will see that it fails, but only if you run the entire test class (not just the given test).
The problem is that when we reinitialize the log level in [Slf4jSimpleConfiguration](https://github.com/pmd/pmd/blob/c0536d5cb99445d5e4aa59b50c8630aeaaec1a32/pmd-core/src/main/java/net/sourceforge/pmd/internal/Slf4jSimpleConfiguration.java#L42), that doesn't change the log level of loggers that have already been created. This reset routine calls two methods via reflection:
- `org.slf4j.impl.SimpleLogger#init`: this parses system properties and changes the default logging level of the logger factory, meaning, *new* loggers will use that log level
- `org.slf4j.impl.SimpleLoggerFactory#reset`: this clears the `Map<String, Logger>` that caches loggers that have already been created. So the next time eg `LoggerFactory.getLogger(FileCollector.class)` is called, it will get a new logger with the correct new logging level.
The problem is that we create logger instances only once, and then store them in static final fields:
https://github.com/pmd/pmd/blob/dd96116c3150730101db7ee9065f6a6709595576/pmd-core/src/main/java/net/sourceforge/pmd/lang/document/FileCollector.java#L52
The call to `LoggerFactory` is executed only once in a given VM, and the created logger will use whatever default log level is defined at that point. But if you later change the log level with the reset method, the logger of `FileCollector` will be the same in that static field, and will still use the old log level.
This makes it significantly harder to test the behavior of the CLI with this debug flag... Tests are order dependent.
For instance in this test class:
https://github.com/pmd/pmd/blob/29d31ec8d22b4fedaf72c8e148ee345bd5c90e87/pmd-java/src/test/java/net/sourceforge/pmd/cli/CLITest.java#L68-L70
This test apparently succeeds, but only because, by chance, the first test that gets executed in this class uses the `--debug` flag. All other tests then actually observe TRACE and DEBUG level log statements, even those that don't use the `--debug` flag. For instance, this is the log of the `testWrongRuleset` test in this class, which doesn't use the debug flag:
```
[main] INFO net.sourceforge.pmd.PMD - Log level is at INFO
[main] DEBUG net.sourceforge.pmd.internal.util.FileCollectionUtil - Adding directory src/test/resources/net/sourceforge/pmd/cli.
[main] TRACE net.sourceforge.pmd.lang.document.FileCollector - Adding file /home/clem/Documents/Git/pmd/pmd-java/src/test/resources/net/sourceforge/pmd/cli/EmptyIfStatement.java (lang: java 19)
```
We announce that the log level is INFO, even though the level of other internal loggers actually stayed at TRACE.
#### Possible solutions
1. Maybe we should never store loggers in final fields, but always call `LoggerFactory.getLogger`? This would work but would probably be bad for performance, because `getLogger` performs a lookup in a ConcurrentMap.
2. Maybe we can ask the test runner to fork a JVM for each test?
3. A hacky fix: instead of calling `SimpleLoggerFactory#reset` and clearing the logger cache, we could
- fetch the map field by reflection
- iterate through all the loggers currently alive
- set their log level field by reflection
This third item seems to me like the easiest thing to do... @adangel wdyt?
|
1.0
|
[test] Tests that change the logging level do not work - **Affects PMD Version:** 7.0.x - because of SLF4J #896
**Description:**
There are a couple of tests that change the logging level in the pmd 7 branch, to test the `--debug` cli switch for instance:
https://github.com/pmd/pmd/blob/1f43af7d831e5b2d63b13d5b2c8f22916609379a/pmd-cli/src/test/java/net/sourceforge/pmd/cli/PmdCliTest.java#L174
I'm afraid this only works by chance... When the trace level is enabled we are supposed to see log statements like `Adding file ...` from the file collector. If you add an assert like `assertThat(log, containsString("Adding file"));` to the linked test you will see that it fails, but only if you run the entire test class (not just the given test).
The problem is that when we reinitialize the log level in [Slf4jSimpleConfiguration](https://github.com/pmd/pmd/blob/c0536d5cb99445d5e4aa59b50c8630aeaaec1a32/pmd-core/src/main/java/net/sourceforge/pmd/internal/Slf4jSimpleConfiguration.java#L42), that doesn't change the log level of loggers that have already been created. This reset routine calls two methods via reflection:
- `org.slf4j.impl.SimpleLogger#init`: this parses system properties and changes the default logging level of the logger factory, meaning, *new* loggers will use that log level
- `org.slf4j.impl.SimpleLoggerFactory#reset`: this clears the `Map<String, Logger>` that caches loggers that have already been created. So the next time eg `LoggerFactory.getLogger(FileCollector.class)` is called, it will get a new logger with the correct new logging level.
The problem is that we create logger instances only once, and then store them in static final fields:
https://github.com/pmd/pmd/blob/dd96116c3150730101db7ee9065f6a6709595576/pmd-core/src/main/java/net/sourceforge/pmd/lang/document/FileCollector.java#L52
The call to `LoggerFactory` is executed only once in a given VM, and the created logger will use whatever default log level is defined at that point. But if you later change the log level with the reset method, the logger of `FileCollector` will be the same in that static field, and will still use the old log level.
This makes it significantly harder to test the behavior of the CLI with this debug flag... Tests are order dependent.
For instance in this test class:
https://github.com/pmd/pmd/blob/29d31ec8d22b4fedaf72c8e148ee345bd5c90e87/pmd-java/src/test/java/net/sourceforge/pmd/cli/CLITest.java#L68-L70
This test apparently succeeds, but only because, by chance, the first test that gets executed in this class uses the `--debug` flag. All other tests then actually observe TRACE and DEBUG level log statements, even those that don't use the `--debug` flag. For instance, this is the log of the `testWrongRuleset` test in this class, which doesn't use the debug flag:
```
[main] INFO net.sourceforge.pmd.PMD - Log level is at INFO
[main] DEBUG net.sourceforge.pmd.internal.util.FileCollectionUtil - Adding directory src/test/resources/net/sourceforge/pmd/cli.
[main] TRACE net.sourceforge.pmd.lang.document.FileCollector - Adding file /home/clem/Documents/Git/pmd/pmd-java/src/test/resources/net/sourceforge/pmd/cli/EmptyIfStatement.java (lang: java 19)
```
We announce that the log level is INFO, even though the level of other internal loggers actually stayed at TRACE.
#### Possible solutions
1. Maybe we should never store loggers in final fields, but always call `LoggerFactory.getLogger`? This would work but would probably be bad for performance, because `getLogger` performs a lookup in a ConcurrentMap.
2. Maybe we can ask the test runner to fork a JVM for each test?
3. A hacky fix: instead of calling `SimpleLoggerFactory#reset` and clearing the logger cache, we could
- fetch the map field by reflection
- iterate through all the loggers currently alive
- set their log level field by reflection
This third item seems to me like the easiest thing to do... @adangel wdyt?
|
non_process
|
tests that change the logging level do not work please prefix the report title with the language it applies to within brackets such as or if not specific to a language you can use affects pmd version x because of description there are a couple of tests that change the logging level in the pmd branch to test the debug cli switch for instance i m afraid this only works by chance when the trace level is enabled we are supposed to see log statements like adding file from the file collector if you add an assert like assertthat log containsstring adding file to the linked test you will see that it fails but only if you run the entire test class not just the given test the problem is that when we reinitialize the log level in that doesn t change the log level of loggers that have already been created this reset routine calls two methods via reflection org impl simplelogger init this parses system properties and changes the default logging level of the logger factory meaning new loggers will use that log level org impl simpleloggerfactory reset this clears the map that caches loggers that have already been created so the next time eg loggerfactory getlogger filecollector class is called it will get a new logger with the correct new logging level the problem is that we create logger instances only once and then store them in static final fields the call to loggerfactory is executed only once in a given vm and the created logger will use whatever default log level is defined at that point but if you later change the log level with the reset method the logger of filecollector will be the same in that static field and will still use the old log level this makes it significantly harder to test the behavior of the cli with this debug flag tests are order dependent for instance in this test class this test apparently succeeds but it s only because by chance the first test that gets executed in this class uses the debug flag all other tests then actually observe trace and debug level log statements even those that don t use the debug flag for instance this is the log of the testwrongruleset test in this class which doesn t use the debug flag info net sourceforge pmd pmd log level is at info debug net sourceforge pmd internal util filecollectionutil adding directory src test resources net sourceforge pmd cli trace net sourceforge pmd lang document filecollector adding file home clem documents git pmd pmd java src test resources net sourceforge pmd cli emptyifstatement java lang java we announce that the log level is info even though the level of other internal loggers actually stayed at trace possible solutions maybe we should never store loggers in final fields but always call loggerfactory getlogger this would work but would probably be bad for performance because getlogger performs a lookup in a concurrentmap maybe we can ask the test runner to fork a jvm for each test a hacky fix instead of calling simpleloggerfactory reset and clearing the logger cache we could fetch the map field by reflection iterate through all the loggers currently alive set their log level field by reflection this third item seems to me like the easiest thing to do adangel wdyt
| 0
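The stale-logger problem in the PMD issue above is easy to reproduce outside slf4j. The toy Python sketch below (all names hypothetical) mimics the described pattern: each logger snapshots the factory's default level at construction, so a reference held in a "static" field keeps its old level even after the factory is reset and its cache cleared.
```python
_DEFAULT_LEVEL = "INFO"

class CachedLevelLogger:
    """Toy logger that snapshots the default level once, at construction,
    the way the slf4j-simple loggers in the issue above do."""
    def __init__(self, name: str):
        self.name = name
        self.level = _DEFAULT_LEVEL  # cached here; later resets never reach it

_cache: dict[str, CachedLevelLogger] = {}

def get_logger(name: str) -> CachedLevelLogger:
    if name not in _cache:
        _cache[name] = CachedLevelLogger(name)
    return _cache[name]

def reset_default_level(level: str) -> None:
    # Mimics SimpleLogger#init plus SimpleLoggerFactory#reset: *new* loggers
    # pick up the new level, but references held elsewhere keep the old one.
    global _DEFAULT_LEVEL
    _DEFAULT_LEVEL = level
    _cache.clear()

LOG = get_logger("FileCollector")  # the "static final" field in the report
reset_default_level("TRACE")
assert LOG.level == "INFO"         # stale reference: tests become order-dependent
assert get_logger("FileCollector").level == "TRACE"  # a freshly fetched logger
```
Of the three fixes proposed in the issue, the reflection-based third one sidesteps both the per-call cache lookup of always re-fetching loggers and the cost of forking a JVM per test.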
|